“Without big data, it’s like being blind and deaf in the middle of a freeway.”
― Geoffrey Moore, management consultant and theorist
At some point, we passed the threshold where overnight batch processing was good enough to run our business. We now demand cloud-based, infinite-bandwidth, 24/7 information flow in all aspects of our lives. Customer experiences driven by real-time technology continue to set new standards for speed and convenience, and batch systems are increasingly seen as a major obstacle for enterprise wealth management firms.
Moving from batch to real-time data infrastructure can help CTOs and CIOs establish new ways of delivering value across their organizations. Streaming data is becoming a core component of enterprise data architecture due to explosive information growth from a combination of traditional sources such as exchanges, trading floors, and news, as well as non-traditional ones such as social media, ecommerce activity and web applications.
In 2020, in the face of a global pandemic, economic crisis, and an uncertain future, financial advisors reinvented how they did business almost overnight to provide stability, comfort, and peace of mind for their customers. What has become more clear as we recover is that the most useful financial products are the ones tailored to specific needs of the customer, and that hyper-personalization will continue to define the customer journey for the rest of 2021 and beyond.
In partnership with Xtiva Financial Systems, we have been producing a series of webinars that bring in leading industry experts in data strategy, data architecture, and systems implementation to share their experiences and best practices around leveraging Data-as-an-Asset for enterprise wealth management firms.
Our most recent webinar, called The Data Engines Powering Wealth Management, included panelist Davyde Wachell, CEO at Responsive.ai, who provided a glimpse into how streaming data will change the way wealth management firms build and manage their data infrastructures.
In case you missed this webinar, click here to unlock your access to the full recording.
Structure Your Data to Feed Downstream Systems
Data Strategy Tip: It’s important to structure your data for consumption by downstream advice systems to ensure it supports client personas and can capture future value.
— Davyde Wachell
Wealth management firms typically work with many different data sources and tables, so CIOs/CTOs need to model their enterprise data in a way that enables downstream systems to quickly answer ad hoc queries by connecting related fields across tables, according to a report by AI-driven analytics vendor Sisense. The relationships between the entities in a data model determine the types of queries a future analysis will be able to answer, as well as how efficiently it does so.
Start by asking:
- What relationship will occur once these fields are connected? You’ll want to avoid many-to-many relationships (see the sketch after this list).
- Will my data model scale?
- How easy will it be to add data sources and make changes to the model further down the road?
- Can we simplify the relationship without affecting performance? Note that this might depend on the data preparation and analytics tools that you’re using.
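To make this concrete, here is a minimal sketch in Python, using the standard library’s sqlite3 module, of how a many-to-many relationship can be resolved into two one-to-many relationships with a bridge table. The advisor/account schema and all names are hypothetical, not drawn from the webinar:

```python
import sqlite3

# In-memory database for illustration only; the schema is hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE advisor (
    advisor_id INTEGER PRIMARY KEY,
    name       TEXT NOT NULL
);
CREATE TABLE account (
    account_id INTEGER PRIMARY KEY,
    household  TEXT NOT NULL
);
-- An advisor serves many accounts and an account may have several advisors:
-- a many-to-many relationship. The bridge table below resolves it into two
-- one-to-many relationships, keeping downstream ad hoc queries unambiguous.
CREATE TABLE advisor_account (
    advisor_id INTEGER REFERENCES advisor(advisor_id),
    account_id INTEGER REFERENCES account(account_id),
    PRIMARY KEY (advisor_id, account_id)
);
""")

cur.execute("INSERT INTO advisor VALUES (1, 'A. Advisor')")
cur.execute("INSERT INTO account VALUES (10, 'Smith Household')")
cur.execute("INSERT INTO advisor_account VALUES (1, 10)")

# A downstream advice system can now answer ad hoc questions by joining
# through the bridge table instead of guessing at an implicit relationship.
cur.execute("""
    SELECT a.name, ac.household
    FROM advisor a
    JOIN advisor_account aa ON aa.advisor_id = a.advisor_id
    JOIN account ac ON ac.account_id = aa.account_id
""")
print(cur.fetchall())  # [('A. Advisor', 'Smith Household')]
```

Adding a new data source later then means adding a table and, at most, one more bridge, rather than reworking existing relationships.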
“You don’t know if your data works until you try to use it for something,” Wachell stated. “Is there such a thing as data separate from its usage?”
Questions When Considering Streaming Data Architecture
Two questions to ask before implementing new streaming data architecture: 1) Will APIs replace your flat files? 2) What is the value of just-in-time data?
— Davyde Wachell
Wachell proposed that CQRS, which stands for Command Query Responsibility Segregation, is the future of data management. As the name suggests, CQRS splits a data application into two parts: Command-Side (writing data) and Query-Side (reading data).
Some of the benefits of implementing CQRS for your data architecture include:
- Simplified design and implementation of the system and overall reduction of complexity.
- Ability to optimize the read side of the system separately from the write side, allowing scaling of each differently as per its load. For example, read datastores often encounter greater load, and hence can be scaled without affecting the write datastores.
- Delivers multiple views of your data for querying purposes depending on the use case.
CQRS pairs naturally with event sourcing patterns, which store the state of objects in the database as a sequence of events: every event that has modified the object’s state since the beginning of its existence. There can be as many events for a given entity as needed; in other words, there is no limit on the number of events per object.
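A minimal sketch of the two patterns together, in Python: commands append immutable events to a log (the write side), and a separate projection folds the stream into a read model shaped for queries (the read side). The event and entity names are hypothetical, chosen only for illustration:

```python
from dataclasses import dataclass

# --- Command side (writes): every state change is an immutable event.
@dataclass(frozen=True)
class AccountOpened:
    account_id: str

@dataclass(frozen=True)
class CashDeposited:
    account_id: str
    amount: float

event_log: list = []  # append-only; no limit on events per entity

def handle_open_account(account_id: str) -> None:
    event_log.append(AccountOpened(account_id))

def handle_deposit(account_id: str, amount: float) -> None:
    event_log.append(CashDeposited(account_id, amount))

# --- Query side (reads): a projection folds the event stream into a
# read model optimized for one use case (here, current cash balances).
def project_balances(events) -> dict:
    balances: dict = {}
    for event in events:
        if isinstance(event, AccountOpened):
            balances[event.account_id] = 0.0
        elif isinstance(event, CashDeposited):
            balances[event.account_id] += event.amount
    return balances

handle_open_account("ACC-1")
handle_deposit("ACC-1", 250.0)
print(project_balances(event_log))  # {'ACC-1': 250.0}
```

In a production system the read model would live in its own datastore and be updated asynchronously, which is what lets the read side scale independently of the write side, as noted above.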
This architecture enables working with events and projecting them into data state, which can be more expensive to support than legacy CRUD architectures, Wachell warned. Companies carrying more tech debt will need to consider improving their data supply chain by upgrading any flat files to APIs, because once data is entering an eventing system, that plumbing will have to change one day anyway.
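As a rough sketch of what that plumbing change looks like in Python (the endpoint URL, field names, and response shape below are hypothetical, not any vendor’s actual API), the legacy and upgraded supply chains differ mainly in when and how the data arrives:

```python
import csv
import requests  # third-party HTTP client: pip install requests

def load_positions_from_flat_file(path: str) -> list[dict]:
    """Legacy plumbing: parse the custodian's nightly batch file drop."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def load_positions_from_api(base_url: str, account_id: str) -> list[dict]:
    """Just-in-time plumbing: pull current positions on demand.
    The endpoint path and JSON shape here are hypothetical."""
    resp = requests.get(f"{base_url}/accounts/{account_id}/positions",
                        timeout=10)
    resp.raise_for_status()
    return resp.json()["positions"]
```

The API version can be called intra-day, or replaced with a push subscription, without downstream consumers having to change.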
It’s the beginning of the end of batch processing for custodial data, according to Kit Lee, Head of Integration at Pershing. Clients can now subscribe to Pershing’s APIs that provide custodial data in real time and can update their systems live, intra-day. Both pull and push updates are supported, so advisors can get a balance update or check that an account is open, active, and ready to trade, she explained.
Changing Human Behavior is the Key to Operational Improvement
“The goal of every data strategy project is to influence the actions of the organization. At the core are human beings. If you can’t engage the humans & change their behavior, you won’t generate the expected outcomes.”
— Jeff Marsden, Head of Product, Xtiva
As Marsden pointed out, spending millions on a data strategy that recommends changes to business processes is a fruitless endeavor if you are unable to influence the underlying employee behavior. Most of the inefficiencies in any company come from manual processes that are handled by staff. These limit the firm’s productivity and are the initial wave of targets for any data strategy.
Wachell supported Marsden’s point on influencing behavior and added that he believed the core virtues were humility and closure of scope. Knowing your limitations as an organization is necessary to keep the scope manageable and ensure it can be delivered to produce value as quickly as possible.
We’re witnessing a data-driven revolution with analytics providers like DecisionMinds offering predictive analytics services to drive decision-making across a wide range of industries, including wealth management:
Analytics is no longer a value addition that wealth managers can offer to their clients; it’s a way of life and of business. For wealth management companies, decoding the propensity of an account to succeed, determining the probability of retaining a client, enhancing the clients’ ability to aggregate information about their assets, comparing the performance of wealth managers, and focusing on the top performing wealth managers are only some of the advantages that data-driven analytics can offer.
The companies in the best position to keep up with the future pace of change will be those that invest now in analytics capacity and gather the data necessary to drive those analytics toward meaningful outcomes.
From the Harvard Business Review:
Organizations often seem obsessed by measuring fractional shifts in operational performance, capturing data on sales, inventory turns, and manufacturing efficiency. However, when it comes to change, few track performance from project to project beyond knowing which ones met their goals. Although projects have unique features, there are many similarities between process improvement, system change, M&A, and reorganization projects.
There are opportunities to capture information about the team involved, the population engaged in the change, how long it took to implement, what tactics were used, and so on. Building a reference data set like this may not yield immediate benefit, but as the overall data set grows, it will make it easier to build accurate predictive models of organizational change.
This is an aspect of management only attained by firms at level 5 of the Capability Maturity Model (CMM), known as “Optimizing”: companies at this level have the data and processes needed to manage their operations and adjust them based on long-term monitoring and analysis.
Why a Data Lake Isn’t The Holy Grail
Data Strategy Tip: Data Lakes are overbearing, expensive and can introduce security risks. You should be skeptical of anyone who says they’re the Holy Grail for your organization.
— Davyde Wachell
Several factors make securing data lakes difficult:
- Data lakes and data warehouses sit at the intersection of multiple functions. Data from multiple sources is transformed, and various functions (whether people or apps) process the output with different methods and purposes. This variety makes any security project complicated, as many components, steps, and processes may fail.
- The data may be highly varied, especially in data lakes: structured (of many different types), semi-structured, or unstructured.
- The high rate of data technology innovation makes securing it an ever-moving target.
- The internal security controls implemented in data warehouses and data lakes are often insufficient and require additional external controls to ensure security.
Security for data lakes needs to be handled the same way you would handle security for enterprise database systems, according to industry experts. That means implementing controls such as data encryption, user authentication, and role-based access control. Many tools are available for doing this for data lakes built on data management solutions from the major vendors, such as IBM and Oracle.
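As an illustration of the role-based piece, access checks for data-lake objects reduce to a role-to-permission mapping enforced before any read or write. The roles, actions, and path prefixes below are hypothetical; real deployments would lean on the vendor’s IAM and policy tooling rather than hand-rolled checks:

```python
# Hypothetical role-based access control for data-lake object paths.
ROLE_PERMISSIONS = {
    "analyst":  {("read", "curated/")},
    "engineer": {("read", "raw/"), ("write", "raw/"), ("read", "curated/")},
}

def is_allowed(role: str, action: str, path: str) -> bool:
    """Grant access only when the role holds a permission whose path
    prefix covers the requested object."""
    return any(
        action == allowed_action and path.startswith(prefix)
        for allowed_action, prefix in ROLE_PERMISSIONS.get(role, set())
    )

assert is_allowed("engineer", "write", "raw/trades/2021-05-01.parquet")
assert not is_allowed("analyst", "read", "raw/trades/2021-05-01.parquet")
```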
“Even data lakes won’t make it easier to aggregate bank data across deposits, loans, credit, and investments. With multiple entities across the organization, getting that data and dealing with risk was like Escape from Alcatraz.”
— Davyde Wachell
Wachell shared an anecdote about challenges his firm overcame when working with a large European bank. The bank needed to aggregate its data across banking, loans, credit, and investments, which ran into difficulties because multiple entities across the organization had conflicting goals and processes for managing their data. The effort required to acquire the data and deal with risk became so high that it was like “Escape from Alcatraz,” he stated.
The project was immensely satisfying for Wachell and his team because they helped multiple departments reach consensus and produced the outcomes the bank wanted. If you want to deliver value, it’s critical to find ways to solve problems quickly so that you can move forward and not get bogged down in ideology.