Neil Edelstein of GoldenSource explains why data management is the featured project in most IT budgets this year and is often a key element of any enterprise-wide risk management strategy.
Q: How much of a priority will data management projects be for financial institutions in the year ahead when budgets will be tightened?
We are seeing a real drive towards ‘back to basics.’ Among financial institutions, there is a general focus on untangling the mess of [internal processes]. And data management is really the foundation for that untangling, because all processes spring from data. Effective management of data provides the tools to help a firm move forward with its operations and its business.
IT budgets will be tighter this year, but we’ve recently seen firms adopting the notion of ‘quarantined projects’ – specific projects that remain in the budget because they are so essential to the organization’s operation and future growth. Risk and data management programs are the top ‘untouchable’ projects for ’09. This means that, as a data management solutions provider, we have not been hit by the credit crunch. In fact, we’ve definitely seen an increase in interest in the topic of data management and in the infrastructure and solutions we provide. For us, client projects are going forward.
Q: What is the main driver for taking on a new data management program?
Data management is not a driver for our clients: risk management is. Risk management has really become the catalyst for data management programs, because any risk management system is only going to be as good as the data feeding into it. For example, we had one client who was installing an enterprise-wide risk management system but hadn’t considered data before beginning the project. When this client started to pull apart the elements of the risk management system and strategy to see what they needed, they quickly realized they were bringing in scattered data from all around the company (data from different silos and in different formats and versions) just to build a complete data record for a single security. Naturally, this realization led the firm to immediately embark on a data management program to complement the risk management plans. We see this scenario a lot. Data management and risk management really go hand in hand as projects. And I would say that, aside from cost, risk management is the number one impetus for data management practices, especially now.
Q: What do firms expect to gain by implementing a data management solution such as a data warehouse?
When we talk about data management we are really talking about transparency. You need a complete set of data to have transparency in processing, particularly the processing of complex instruments like derivatives, which have multiple legs within the overall financial product. With a consolidated and transparent view of data, firms can better understand complex securities and improve risk management processes.
Q: The term ‘data’ still means many things to different people. How do people view data management now?
In the past, reference data was seen as an entity unto itself. What we see now is that there is no longer a distinction between security product data, counterparty data and position data – it’s all part of the whole data management process. Today securities are so complex that you can’t have one slice of data without the others, because only when you bring these pieces together can you get the full view of the security and the risks across the board. Such a transparent and comprehensive view of data is, again, essential for a firm to effectively manage all of its risk exposures.
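To make the point concrete, here is a minimal sketch (in Python, with hypothetical field names – not GoldenSource’s data model) of how those three slices might be joined into one consolidated record:

    from dataclasses import dataclass

    @dataclass
    class Security:              # security product / reference data
        isin: str
        issuer: str
        asset_class: str

    @dataclass
    class Counterparty:          # counterparty / legal-entity data
        lei: str
        name: str
        rating: str

    @dataclass
    class Position:              # position / holdings data
        isin: str
        counterparty_lei: str
        quantity: float

    def consolidated_view(position, securities_by_isin, counterparties_by_lei):
        """Join the three slices into one record a risk system can consume."""
        sec = securities_by_isin[position.isin]
        cpty = counterparties_by_lei[position.counterparty_lei]
        return {
            "isin": sec.isin,
            "issuer": sec.issuer,
            "asset_class": sec.asset_class,
            "counterparty": cpty.name,
            "counterparty_rating": cpty.rating,
            "quantity": position.quantity,
        }

The sketch simply shows that none of the three inputs is useful to a risk system on its own; the exposure only becomes visible once they are brought together.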
Q: Integration is a common obstacle in most operational or technology projects. Why is integration of systems and processes such an operational problem for financial institutions and why is data such a mainstay in any integration work?
Integration is always an issue. It was a problem even years ago, when projects were less complex (not layered on mergers and acquisitions) and didn’t need to be delivered at today’s speed. There is no way to broad-brush these types of projects. Integration is a sticky, ‘hands-on’ type of project, and although it is an integral part of any project, it is often not prepared for as much as other parts are. As for data management, integration is especially difficult when dealing with different silos of data. Integrating the data from these siloed data marts is akin to mixing apples with oranges – getting the right cocktail requires dissecting and rebuilding multiple systems that are often years old.
As with the lifecycle of any project, the toughest part is always getting a good understanding of the status quo of the current operation, and with data this is extremely difficult to do. We often see the larger financial institutions have anywhere from 30 to 35 different siloed data marts, and if an organization acquires or merges with another, there are closer to 70 different data silos to integrate. This gives you an idea of the scale of one of these projects. And systems and technology aren’t the only problem – finding staff who actually understand this is not easy either. So it’s not just a technology problem, it is a people problem as well.
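As a rough illustration of the ‘apples with oranges’ problem (a Python sketch with invented feed layouts – real silos will differ), integration typically starts by mapping every silo’s format onto one canonical record before anything can be consolidated:

    # Each silo delivers the same security in a different shape, so the first
    # integration step is an adapter per source that maps it to a canonical form.
    def from_silo_a(rec):
        return {"id": rec["ISIN"], "price": float(rec["PX_LAST"]), "currency": rec["CRNCY"]}

    def from_silo_b(rec):
        return {"id": rec["isin_code"], "price": rec["close_price"], "currency": rec["ccy"].upper()}

    ADAPTERS = {"silo_a": from_silo_a, "silo_b": from_silo_b}

    def normalize(source, record):
        """Translate a silo-specific record into the canonical layout."""
        return ADAPTERS[source](record)

Multiply that adapter work by 30, 35 or 70 sources – each with its own owners, versions and quirks – and the scale of both the technology and the people problem becomes clear.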
Integration is crucial when dealing with all types of securities and areas of downstream processing, but it is especially tricky with the more complex financial products, including derivatives. And with traders looking to use increasingly complex products regularly, there is extreme pressure on the back office to keep up with the processing of these new products. But the back office doesn’t have the manpower or the technological ability to do this. For example, most instruments can easily be maintained in a master security data file, but derivative products cannot be held there. This means a back office has to create external files to hold the data. Often, this is just the beginning of the integration problem for many firms.
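To illustrate why a flat security master struggles here (a hypothetical Python sketch, not any vendor’s schema), even a plain interest rate swap needs a nested, multi-leg structure rather than a single row of attributes:

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Leg:
        pay_or_receive: str      # "pay" or "receive"
        rate_type: str           # "fixed" or "floating"
        notional: float
        currency: str

    @dataclass
    class InterestRateSwap:
        trade_id: str
        counterparty_lei: str
        legs: List[Leg] = field(default_factory=list)

    swap = InterestRateSwap(
        trade_id="IRS-001",
        counterparty_lei="5493001234567890AB12",   # hypothetical LEI
        legs=[
            Leg("pay", "fixed", 10_000_000, "USD"),
            Leg("receive", "floating", 10_000_000, "USD"),
        ],
    )

A flat master file holds one row per instrument with a fixed set of columns, which is why firms end up spinning off external files to hold the legs – and that is where the integration problem begins.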
Q: Where can financial institutions begin improving their data management processes to achieve immediate benefits in derivatives processing and risk management?
In times like these, you have to attack the exact problem because large budgets are just not going to be approved right now. We’ve identified that derivatives processing is a major issue and will become increasingly significant in the future. And I think most financial institutions would agree there really is a dire need for derivatives warehousing and risk management in the current market climate.
So, to eliminate the need for wide-sweeping programs, we’ve come up with what we call our ‘derivatives solutions,’ which is really a slice of the warehousing solution aimed at derivatives. For example, if you don’t want to do a full-scale data project, you can still improve data management for derivatives (this is where most firms have the most problems anyway) and then expand the solution to other asset classes in time.
Q: Do you have any advice for financial institutions planning to implement a new data management solution this year? What can they do to execute such a project efficiently and for the optimal results?
I think the first thing is a solid assessment of how data is processed today. This front work is so important, and it is often the area where firms are lacking. The assessment really needs to happen before you bring in the consultants, because consultants don’t know your operations – the people who work for you do. And by this I mean firms need to get a good understanding of the operational aspects of data management, including the data architecture, the usage of the data, the overall governance, practices and quality control – all the standard pieces involved in managing the data throughout the processing chain. You need to know the foundation before jumping into a project.
I would also say that despite the complexity of data management projects, they aren’t as daunting as they used to be. I think if a financial institution isolates the most important pain points, solves them by implementing a robust data warehouse and creates measurable deliverables surrounding the processes, then this is a very doable project. Some firms can do such a specific project in 90 days. People should really realize data management is not such a tough road these days.
* Neil Edelstein is senior director of product solutions at GoldenSource in New York.