What are the major data management trends in the capital markets around technology, and what's driving those trends?
What's driving those trends is really many different things. But from a technology perspective, it boils down to a few key underlying forces. Firstly, there are several stressful financial market trends: increased volatility, greater difficulty in booking profits, and a reduction in trading volume.
Financial services firms are now very focused on cost, specifically on cost reduction. Firms are intent on protecting their profits, and on paying the bonuses needed to retain the best talent, by reducing cost wherever they can, and nowhere more so than within information technology and infrastructure.
So what we are seeing now is a focus on reducing the data redundancy and waste that comes with siloed process and data environments. Very keen attention is also being paid to process automation in areas where tasks are still commonly performed manually.
More Agile Environment for Reporting
A very good example of this area of automation is equity derivatives, where much of the resetting of rates and much of the settlement and confirmation of trades is still manual. And manual in this business equates to being extremely slow and very expensive.
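To make the automation opportunity concrete, here is a minimal sketch of how a floating-rate reset, the kind of task described above as still being done by hand, might be automated. All names here (`Trade`, `fetch_reference_rate`, the rate value) are illustrative assumptions, not a real market-data API or InetSoft functionality.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Trade:
    """A hypothetical equity-derivative trade with scheduled rate resets."""
    trade_id: str
    reset_dates: list = field(default_factory=list)
    current_rate: float = 0.0

def fetch_reference_rate(as_of: date) -> float:
    """Stand-in for a market-data lookup; hard-coded for this sketch."""
    return 0.0425  # e.g., an overnight benchmark rate

def apply_resets(trades, as_of: date):
    """Apply today's reference rate to every trade due for a reset."""
    rate = fetch_reference_rate(as_of)
    updated = []
    for t in trades:
        if as_of in t.reset_dates:
            t.current_rate = rate
            updated.append(t.trade_id)
    return updated

trades = [Trade("EQD-001", [date(2024, 1, 15)]),
          Trade("EQD-002", [date(2024, 2, 15)])]
print(apply_resets(trades, date(2024, 1, 15)))  # ['EQD-001']
```

The point of the sketch is the shape of the workflow: a scheduled, rules-driven process replaces a person re-keying rates, which is where the speed and cost savings come from.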
The other area, and this is related to reduction of cost, is the reduction of complexity. Firms are seeking a far more agile environment to support their business processes and their product and service development. The driver for this really comes from the rapid changes that we see and the nature of the market as well as the ever-changing regulatory environment.
Every new change in the market, whether a difference in the style of trading, in the set of products and services being offered, or in the regulatory environment, incurs an enormous initial overhead for these firms to accommodate and respond to.
And so what we have is this focus on cost and this focus on reducing complexity, to enhance agility and responsiveness, combining to continually change the nature of the products, services, and trading methods needed to ensure profitability. In turn, this demand for agility necessitates a reduction in the number and types of data environments.
And so another driver among the major technology trends in capital markets is the integration of data. However, something I would like to direct special attention to is that this reduction of complexity and cost is never going to occur at the expense of capability and performance. Those two things really are the competitive advantage of many of the premier trading houses on Wall Street.
Attention Towards Data Warehousing
We have seen a rapid rise in the level of attention towards data warehousing from capital market firms, simply because there is a need for a single source of truth with regard to data. And what I mean by that is a single data repository holding the most granular, the most voluminous, and the most complete and fresh data, while also enabling an optimal analytical environment.
So you have got the information, and you have got the ability to query the information and to derive intelligence and insight from it. So how does that relate to cost and complexity? Well, we reduce complexity and cost simultaneously through the integration of data and processes and the enhancement of analytics.
What this means is you can perform a plethora of analytical functions with event based automation that will give firms a level of performance, control and capability that they have not been able to enjoy until now. So, where those processes have been manual, because there is an event-based capability built into the warehouse, a lot of those calculations, a lot of that reporting, a lot of that investigation can now be automated.
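The event-based automation described above can be sketched as a simple publish/subscribe mechanism: when new data lands, registered handlers (calculations, reports, investigations) fire automatically instead of being run by hand. The event names and handlers below are illustrative assumptions, not a description of InetSoft's actual internals.

```python
from collections import defaultdict

class EventBus:
    """Minimal event dispatcher: handlers subscribe to named events."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event, handler):
        self._handlers[event].append(handler)

    def publish(self, event, payload):
        # Run every handler registered for this event, in order.
        return [handler(payload) for handler in self._handlers[event]]

bus = EventBus()
# Hypothetical downstream tasks triggered when trades are loaded:
bus.subscribe("trades_loaded",
              lambda rows: f"recalculated P&L for {len(rows)} trades")
bus.subscribe("trades_loaded",
              lambda rows: f"refreshed exposure report ({len(rows)} rows)")

results = bus.publish("trades_loaded", [{"id": 1}, {"id": 2}])
print(results)
```

The design choice worth noting is that the data-arrival event, not a human or a nightly batch schedule, is what drives the recalculation and reporting, which is why formerly manual steps can be automated.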
Where firms had to maintain multiple environments simply because the volume of data was too large, or the number of users who had to query that data was too large and their queries too complex, firms can now use BI software such as InetSoft's in combination with a data warehouse to integrate all of those things into one environment, and achieve lower cost, lower maintenance, better quality information, and faster time-to-insight. That's how InetSoft really aligns with what the market is demanding in terms of less cost, less complexity, and greater capability.
With regard to analytics, data warehousing, and business intelligence, there are a lot of alternative vendors out there. What makes InetSoft different from the other technologies available in the market today?
If you compare what our capabilities are with what the actual problems and challenges of a lot of the firms within capital markets are, I would say that really our differentiator is that we align very well. In regards to what I have mentioned already, we are different because we approach the data challenge differently.
We focus on solving the problems rather than providing a patchwork approach that treats the symptoms. We address the core of the challenge, which is: how do I get all of this information into my environment as quickly and as completely as possible, and allow my users to access it and query it for their analytics in the least amount of time and at the least cost?
The whole model of InetSoft's technology is to bring business users and analytics directly to the data. There is no need for aggregation that denies the user the ability to see the granular transactions. It brings you closer to what is happening in your business as it happens, and it enables you to automate many of your routine responses as events occur.
What is the future for data management and analytics in capital markets, and if you were a new CIO, what should be on your horizon?
I see really the future as being a continuation of what we have going on right now. But what is really going to change is the scale, the scale of the data, the scale of the number of users, the complexity of the queries that these users will bring into that environment. I think those are going to escalate in ways that we are only just starting to really see.
So really, I think the new area of interest is going to be how best to make use of the cloud, in terms of internal or private clouds, external or public clouds, and the hybrids between the two, because those technology models are going to bring a great deal of focus onto concurrency and analytics as users access the cloud.
I think we are entering a phase where scalability is going to be measured in a multidimensional, concurrent view rather than a one-dimensional, iterative view. In the one-dimensional model, you had singular expressions: data volume is what we want to focus on, or the number of users, or the complexity of the queries. What's going to change in the future, and the challenge for the CIO, is: how do I meet all of those things at the same time from a single solution?