“Everything is designed. Few things are designed well”
— Brian Reed, software developer & musician
By definition, the concept of enterprise software design appears to be at odds with the concept of user-centered design. The former starts problem-solving with the broader organization while the latter starts with the people. This conflict has played out across decades of enterprise software solutions that became cluttered and hard to use in their efforts to manage large amounts of complex data while remaining flexible enough to integrate with future systems.
With conflict comes opportunity. Modern enterprise software companies like Salesforce have established a competitive advantage through design methodologies that build their core platform on open architectural concepts to facilitate integration and connection with external applications. This drives long-term adoption and success in a constantly shifting marketplace.
As part of our work to help our enterprise wealth management clients get the most benefit from their data assets, Ezra Group has partnered with Xtiva Financial Systems to produce a series of webinars. Our goal is to bring leading industry experts in data strategy, data architecture and systems implementation to share their experiences and best practices.
The fourth webinar in the series was called “The Tower of Babel: Tips on Consolidating Wealth Management Data From Multiple Sources” and included panelist Todd Winship, Head of Data and Analytics, Temenos.
He shared his observations on the move from batch processing to streaming data, how they’re leveraging AI to generate testing data devoid of privacy issues and how Temenos designed their data platforms with integration in mind.
In case you missed this webinar, you can click here to unlock your access to the full recording.
The Shift to Streaming Data
We’ve seen a slow shift away from 100% batch processing of overnight files and sending critical data via FTP and flat files, toward API connections and streaming data. From a vendor perspective, everyone’s being forced to shift their technologies to accommodate these new client data requirements, according to Todd Winship, Head of Data and Analytics at Geneva-based Temenos.
Almost every wealth management firm and bank wants to move to more timely integrations via streaming or APIs, Winship noted. This has made life harder in the short term for technology vendors like Temenos, but in the longer term, it should provide significant benefits, he stated.
A recent survey of global financial services executives conducted by software provider Finastra found that 78% are looking to use APIs to drive or enable Open Banking in the next 12 months, with 36% of businesses having started this process or having already opened APIs.
Temenos has a wide geographic footprint with over 3,000 customers across the globe. Winship said he was surprised at the number of legacy integration patterns they see in banks, especially the larger ones.
These shifts are making life harder for vendors in the short term, because moving from legacy integration patterns to new ones requires shifting your technology, Winship explained. What used to be a relational database has moved to a different technology such as an in-memory database or column store. These are big technology shifts for everyone, from vendors to end customers, he said.
And with the technology change we’re also seeing a large shift in the skills required of technical staff. The people who managed relational databases in the past aren’t the same people who will manage your real-time data lakes and real-time integration patterns.
So the future will bring some big shifts, but in the end, fully real-time, in-memory processing capability will benefit everyone from a digital and compliance perspective.
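The batch-to-streaming shift Winship describes can be sketched in miniature: a legacy batch job waits for an overnight flat file and parses it all at once, while a streaming consumer handles each record the moment it arrives. All names and the pipe-delimited record layout here are illustrative assumptions, not part of any Temenos product or API.

```python
from typing import Iterator

def parse_record(line: str) -> dict:
    """Parse one pipe-delimited trade record (hypothetical flat-file layout)."""
    account, symbol, qty = line.strip().split("|")
    return {"account": account, "symbol": symbol, "qty": int(qty)}

def batch_process(flat_file: str) -> list[dict]:
    """Legacy pattern: wait for the overnight file, then parse it all at once."""
    return [parse_record(line) for line in flat_file.splitlines() if line.strip()]

def stream_process(feed: Iterator[str]) -> Iterator[dict]:
    """Streaming pattern: yield each record as it arrives, no end-of-day wait."""
    for line in feed:
        yield parse_record(line)

# The same records, consumed two ways:
overnight_file = "ACC1|IBM|100\nACC2|MSFT|50\n"
batch_result = batch_process(overnight_file)     # all at once, after the fact

live_feed = iter(["ACC1|IBM|100", "ACC2|MSFT|50"])
for record in stream_process(live_feed):         # one at a time, as events occur
    print(record)
```

The contrast is the point: the batch function cannot return anything until the whole file exists, while the generator-based consumer processes each event independently, which is what makes intraday use cases possible.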
Integration by Design
The same Finastra survey reported that 85% of financial services executives agreed that ‘the integration of technology and innovation should be at the forefront of the financial services industry’. Over the next 12 months the implementation of technology and innovation will have an outsized impact on which companies thrive on the open sea of successful adaptation and which ones founder on the rocks of change.
We often focus on how wealth management platforms consolidate data from multiple sources, but they produce large amounts of data as well. Temenos actually produces more data than they consume, Winship pointed out.
Temenos has an internal concept called “integration by design”, which entails being proactive about designing their information-producing solutions so that they’re robust and deliver the best-quality, highest-availability data in the industry, Winship explained.
The integration concept starts in the engineering process at Temenos, Winship said. When they design a solution, whether it is operational or transactional, they make sure to understand where that data is going to end up not only within the product, but outside of it as well.
96% of organizations responded that they have not yet reached an optimized level of technology integration according to a global financial services survey conducted by Ernst & Young.
Every wealth management firm is a consumer of data and should be able to get all of the information from their key systems in any format they require, Winship insisted. That can mean file-based integrations, APIs, streaming data, or in some cases direct database access. But a larger percentage of the data that firms are sending and receiving is moving by way of APIs.
Temenos makes all of their data available via multiple delivery mechanisms with different integration patterns to support their wide variety of customers, Winship noted. They also have their trouble-ticketing system configured to support data issues reported by staff, customers or partners. This way, issues get resolved right at their source, he said.
Crowdsourcing Data Quality
Temenos receives a steady stream of feedback on their data quality from their partners and their 3,000+ customers, Winship confirmed. Since they don’t know how these firms intend to use their data or their internal quality metrics, the results can sometimes be surprising.
Winship has been a strong proponent of building out more options for data consumers as well as providing a data taxonomy. This was partially driven by his time on the client side when he was a consumer of this data and was frustrated with the level of support.
Temenos took that understanding of what it’s like to be a data consumer and built it into their data producing solutions, Winship stated.
This crowdsourcing approach to data quality can save Temenos and their clients a bundle. A recent Gartner market survey found that poor data quality costs companies up to $15 million on average every year. And not only are organizations taking a financial hit, poor data quality practices undermine digital initiatives, weaken their competitive standing and sow customer distrust, the global research firm found.
Gartner recommends four steps to overcome data quality challenges:
- Measure the impact of poor data on your organization.
- Establish the role of chief data officer (CDO) to meet data quality objectives.
- Optimize the cost of data quality tools, which have a median cost of $150,000.
- Estimate a realistic time frame to deploy data quality tools.
Synthetic Data
Every year, the world generates more data than the previous year. In 2020 alone, an estimated 59 zettabytes of data were “created, captured, copied, and consumed,” according to the International Data Corporation — enough to fill about a trillion 64-gigabyte hard drives.
Financial services firms are required to restrict access to Personally Identifiable Information (PII) and often block access to datasets that contain PII, sometimes even within their own teams. And with the pandemic shutting down offices and preventing people from directly connecting to centralized data stores, sharing data safely has become even more difficult.
Without access to this data, it’s impossible for many analytical tools to work. One solution to this problem is called synthetic data. It is artificial information that can be used as a stand-in for the real data.
Data sharing is essential in our industry, Winship noted, which is why Temenos has been investigating how to generate synthetic data. This would be data that could be freely shared with partners since it has been “desensitized” in a way that does not compromise client privacy.
Synthetic data would also facilitate sharing information, such as bundling financial and non-financial products for end customers, Winship stated. This data can be generated by machine learning (ML) systems that have progressed to the point where it is almost impossible to discern whether a particular data set is real or artificial.
Generating synthetic data is also much less expensive when compared to collecting large datasets and can support AI/deep learning model development or software testing without compromising customer privacy. It’s estimated that by 2024, 60% of the data used to develop AI and analytics projects will be synthetically generated.
In case you missed this webinar, you can click here to unlock your access to the full recording.