When a company is large enough to encompass several offices, countries and even continents, the data it amasses oozes potential. On the other hand, those same factors mean that the very same data can at best be wasted and at worst misused. Because of this, many companies have been advocating a “single view of the truth” as a data management principle – in other words, minimising the potential for errors by rigorously standardising the capture and representation of data.
Ian Murrin, CEO at Digiterre, has been deeply involved in the data journeys of his many clients over the last two decades, so he knows a thing or two about what’s trending in that space. Ahead of his participation at ETOT again this year, we got the chance to ask him a couple of questions on the matter:
You’ve worked with a number of energy and commodities trading organisations over the years, what trends are you currently seeing in the sector?
The key thrusts are:
(a) Doing more with less – less manual work and more automation, with a view to cutting costs and streamlining operations. Broadly, this comes under the digital transformation banner. However, digital transformation relies heavily upon the quantity, quality and timeliness of data.
(b) Trying to make sense of and derive more value from their data – be that market or operational data – is the real challenge. Organisations are awash with data but struggle to derive value from much of it in a timely manner, because they have no “single view of the truth”, particularly as it relates to their internal, cross-functional (Finance, Trading, Operations) data. Furthermore, organisations are struggling to ingest and analyse ever-higher volumes of market data, especially time-series data, on which to perform meaningful, quant-driven analysis in short timescales.
“We have clients that are seeking to ingest 4TB+ of data, bringing across 270,000 different time series from 205+ data sources.”
As a software professional, are there any features you get requests for which are simply ahead of their time?
Lots. Going back to the point I made about ingesting high volumes of market data: in a project we have just completed, we have clients seeking to ingest 4TB+ of data, bringing across 270,000 different time series from 205+ data sources, slicing them by time and analysing the correlations in near real-time. That’s a huge volume of different data and data sources. The sheer scale, volume and speed of the capabilities they require are so much greater than ever before, and the timeframes for turning that data into actionable insights continue to fall as market pressures increase.
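As a rough illustration of the kind of analysis described above – not Digiterre’s actual stack – slicing aligned time series by time window and computing pairwise correlations can be sketched in Python with pandas. The series names, date range and random-walk data here are all hypothetical stand-ins:

```python
import numpy as np
import pandas as pd

# Hypothetical example: three of the many ingested time series,
# aligned on a common hourly index.
idx = pd.date_range("2019-01-01", periods=24 * 30, freq="h")
rng = np.random.default_rng(42)
series = pd.DataFrame(
    rng.normal(size=(len(idx), 3)).cumsum(axis=0),  # random-walk stand-ins for prices
    index=idx,
    columns=["power_de", "gas_ttf", "coal_api2"],   # assumed series names
)

# Slice by a time window, then compute the pairwise correlation matrix.
window = series.loc["2019-01-10":"2019-01-20"]
correlations = window.corr()
print(correlations)
```

At production scale this slicing and correlation step would run over a columnar store or streaming engine rather than an in-memory frame, but the shape of the computation is the same.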
The second area relates to internal operational data. For example, commodities organisations with multiple ETRMs – and, in the financial markets, organisations with multiple Order Management Systems – are realising that, in order to satisfy both current and future regulatory demands, they need to create a “single view of the truth” across all of their trade, transaction and middle-office data. That avoids developing ad hoc solutions for each new requirement, and it reduces human involvement to a minimum by using computational power to validate the data at source and then enrich it for analytics and reporting purposes. So that’s the second area of future needs – organisations actively seeking to use technology to solve those data ingestion and validation problems instead of throwing bodies at the problem.
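In its simplest form, validating trade records at source before enrichment could look like the following sketch. The field names and rules are illustrative assumptions, not any specific ETRM’s schema:

```python
# Illustrative source-level validation of a trade record; the required
# fields and rules below are assumptions, not a real ETRM schema.
REQUIRED_FIELDS = {"trade_id", "counterparty", "quantity", "price"}

def validate_trade(trade: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    for field in sorted(REQUIRED_FIELDS - trade.keys()):
        errors.append(f"missing field: {field}")
    if "quantity" in trade and trade["quantity"] <= 0:
        errors.append("quantity must be positive")
    if "price" in trade and trade["price"] < 0:
        errors.append("price must be non-negative")
    return errors

def enrich_trade(trade: dict) -> dict:
    """Enrich a validated record with a derived field for analytics/reporting."""
    enriched = dict(trade)
    enriched["notional"] = trade["quantity"] * trade["price"]
    return enriched
```

A pipeline built this way would reject records with non-empty error lists at the point of ingestion, rather than relying on downstream manual checks.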
What do you think? Want to know more? Click here to register to join us at ETOT 2019!
Alternatively see the full ETOT AGENDA here.