Data is the main reason why corporate strategy is never permanent. As more data is collected, the output redefines your strategy in a never-ending cycle.
― Pooja Agnihotri, Author of 17 Reasons Why Businesses Fail
Every system at a financial services firm relies on data to function properly. Data is a valuable business asset, yet it is often ignored and not leveraged in ways that could enhance the customer experience.
A comprehensive data strategy is a key component of every successful firm we surveyed. If your firm doesn’t have one, you’re not alone! In a recent survey, 87% of respondents said their company lacked a clear, repeatable process for data projects.
Ezra Group partnered with Xtiva Financial Systems to bring together a group of industry experts to share their advice, best practices and war stories about big data projects. The panelists were:
- Julia Carreon – Former Head of Digital Wealth, Wells Fargo
- Mark Pinto – President, Harbourfront Wealth Management
- Jeff Marsden – Chief Product Officer, Xtiva Financial Systems
- Damon Gladman – Head of Product, Skience
Click here to register and watch the full webinar video
Only 8% of attendees said their firm has a single executive responsible for their data strategy.
#wealthmanagement firms collect KYC data to make regulators happy but don't bother leveraging it to improve #ClientExperience — @juliaccarreon
This is a missed opportunity to take advantage of a hidden asset that exists at every company. #TheDataJourney @xtiva pic.twitter.com/JWXzbwoXyG
— Craig Iskowitz (@craigiskowitz) March 25, 2021
Government regulations are big drivers of technology spending and data acquisition for financial services firms. But KYC/AML data should be viewed as more than a painful regulatory constraint. It can be leveraged to support and improve other areas of the business.
Many companies hamstring their ability to utilize KYC/AML data by storing it in different systems, spread across departments. The enterprise winds up with many “data islands” living in separate infrastructure that creates governance and compliance headaches, making the unified search for data across the enterprise practically impossible. Without being centralized, it is difficult to run analytics and act on the results in a timely manner.
These data islands, also known as dark data, comprise almost 55% of the data collected by companies. Given that some estimates indicate that 7.5 septillion (7,500,000,000,000,000,000,000,000) gigabytes of data are generated every single day, not using most of it is a considerable issue.
Gartner defines dark data as: “The information assets organizations collect, process and store during regular business activities, but generally fail to use for other purposes (for example, analytics, business relationships and direct monetizing).”
The average information worker spends as much as 36% of their time searching for data and more than 40% of those searches are fruitless, according to a report by IDC.
Better organization and centralization of KYC/AML data enables firms to gain valuable insights into their customers and helps lower business risk.
Despite their best efforts, many companies have data stuck in storage networks, squirreled away several layers deep in a complex file structure where it’s “out of sight, out of mind,” or located in a collaboration solution that’s not optimized for rapidly publishing data in an easily-understood format. This can lead to inefficient, labor-intensive processes, such as creating slide decks to share with executives that are outdated as soon as you capture the data.
Only 11% of customer-impacting decisions are backed with data insights, the rest are just educated guesses – HBR
As #wealthmanagement grows through M&A more effort is siphoned off to support the bureaucracy instead of improving #ClientExperience – @juliaccarreon #TheDataJourney pic.twitter.com/3429cf4qBW
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Large enterprises that were built through many mergers and acquisitions develop complex infrastructures and ecosystems, Carreon noted. It’s easy for priorities to get lost, and too much time is spent dealing with internal issues instead of focusing on client needs. These mega-firms lose their sense of urgency to support new client requirements while spending more time supporting their own bureaucracy.
Bureaucracy can have some benefits such as protecting employees from workplace health and safety hazards and poor employment practices. When each employee is covered by the same, clearly defined employment practices and rules, the system feels fairer to all employees. By doing so, bureaucracies encourage a positive company culture, which can in turn increase employee satisfaction, productivity and retention rates. Put simply, bureaucracies help make a business a better place to work.
However, as Carreon pointed out, bureaucracies negatively impact business in many ways including reducing transparency, minimizing independent decision-making and encouraging inefficiency.
Bureaucratic processes can restrict the ability that any individual employee has to act independently or make decisions without approval from above. Even if it seems like the right thing to do, an employee may not be able to decide because they could face bureaucratic consequences, such as a reprimand or even termination, or because the process of getting approval is too time-consuming.
Employees surveyed by Harvard Business Review reported spending an average of 28% of their time on bureaucratic chores like preparing reports, attending meetings, complying with internal requests, securing sign-offs, and interacting with staff functions. That’s more than an entire day out of an employee’s week spent on work tasks that create little to no value for the company (and are a major drain on engagement).
From Is Bureaucracy Killing Engagement at Your Company?, a post on the blog of Emplify, a company that helps clients make more informed, data-driven people decisions:
Of the 14 underlying drivers of employee engagement (i.e., utilization, fairness, competency, autonomy, purpose, shared values, role clarity, friendship, feedback, manager, trust, professional development, authenticity, and PTO), at least four of them run counter to a bureaucratic system.
Autonomy, for example, is a key driver of engagement, yet most employees in bureaucratic organizations don’t feel they have it. 89% of respondents to the HBR survey said they did not have “substantial” or “complete” autonomy to set their own priorities and decide on work methods.
The real power of #data comes from connecting it across the firm to generate actionable insights
Building client personas: clients that look like X have a propensity to buy Y product – @gladman22 #TheDataJourney pic.twitter.com/8x4GEiVIQK
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Buyer personas help firms understand their customers (and prospective customers) better. This makes it easier for them to tailor their content, messaging, product development, and services to meet the specific needs, behaviors, and concerns of the members of their target audience, as noted by marketing software firm Hubspot.
Creating personas involves collecting the set of characteristics that describe a large segment of users and assembling them into a persona. But identifying the specific segments and extracting their traits is the real challenge.
Companies with a well-defined data architecture are able to view customer distributions across various dimensions and identify the clear separations that indicate good candidates for building personas.
Gathering additional demographic information, called descriptive data, will further outline the customer personas. Once these are clarified, the firm is one step closer to using predictive analytics to optimize the timing of its marketing efforts.
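To make this concrete, here is a minimal sketch of how a firm might surface candidate personas from client data using simple clustering. The file name, column names and number of segments are illustrative assumptions, not anything described by the panelists.

```python
# Hypothetical sketch: clustering clients into candidate personas.
# Column names (age, household_income, aum, num_products) are illustrative.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

clients = pd.read_csv("clients.csv")  # assumed export of KYC/CRM data
feature_cols = ["age", "household_income", "aum", "num_products"]

# Standardize so no single dimension dominates the distance metric
scaled = StandardScaler().fit_transform(clients[feature_cols])

# Look for a handful of well-separated segments to seed the persona work
kmeans = KMeans(n_clusters=4, n_init=10, random_state=42).fit(scaled)
clients["segment"] = kmeans.labels_

# Summarize each segment's traits as the starting point for a persona
print(clients.groupby("segment")[feature_cols].mean())
```

The segment summaries are only a starting point; marketing and advisors would still review them to decide which clusters are distinct enough to become personas.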
Statistical modeling of sales data should include analysis of client activity pre-purchase vs post-purchase to improve results of targeted #marketing – @gladman22
Statistical Modeling – be open to iterating your #datamodels, many firms never update them, which reduces their effectiveness over time – @gladman22 #TheDataJourney pic.twitter.com/nCMhaS9ZlB
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Statistical modeling is the process of applying statistical analysis to a dataset. A statistical model is a mathematical representation (or mathematical model) of observed data.
“As machine learning and artificial intelligence become more commonplace, more and more companies and organizations are leveraging statistical modeling in order to make predictions about the future based off data,” stated Alice Mello, assistant professor for the Analytics Program at Northeastern University.
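As a hedged illustration of the kind of predictive statistical model Mello describes, the sketch below fits a simple propensity model (will a client buy product Y?) on historical data. The data file, feature names and label are hypothetical placeholders rather than the panelists’ actual models.

```python
# Minimal sketch of a predictive statistical model: estimate the propensity
# that a client purchases a given product. Column names are placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("client_history.csv")            # assumed training extract
X = df[["age", "aum", "num_products", "tenure_years"]]
y = df["bought_product_y"]                        # 1 if the client bought product Y

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Hold-out AUC gives a first read on how well the model separates likely buyers
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```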
Data models can be evaluated from two different points of view:
- Decisiveness: Reviewing the model’s performance and the accuracy of its predictions in terms of how well they improve business decision-making efficiency. Are the insights derived from the model making it easier to make decisions? Have the insights resulted in staff spending less time in meetings and more time executing the company strategy?
- Optimization & Speed: Reviewing the algorithms used to populate the model and the statistical techniques and methods applied. Are the algorithms optimized for the model’s stated purpose? Are the insights being generated fast enough to produce actionable advantages?
Successful deployment of a data model into production is no time to relax, warned Anasse Bari, Ph.D., a data science expert and professor at New York University. The model must be closely monitored, because models tend to degrade over time, reducing their accuracy and performance.
If conditions change so they no longer fit the model’s original training, then you’ll have to retrain the model to meet new conditions, such as:
- An overall change in the business objective
- The adoption of more powerful technology
- The emergence of new trends in the marketplace
- Evidence that the competition is catching up
Your strategic plan should include staying alert for any such emergent need to refresh your model and take it to the next level, but updating your model should be an ongoing process anyway. You’ll keep tweaking inputs and outputs, incorporating new data streams, retraining the model for new conditions and continuously refining its outputs. Keep these goals in mind (a brief monitoring sketch follows the list below):
- Stay on top of changing conditions by retraining and testing the model regularly; enhance it whenever necessary.
- Monitor your model’s accuracy to catch any degradation in its performance over time.
- Automate the monitoring of your model by developing customized applications that report and track the model’s performance.
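The snippet below is a minimal sketch, under assumed thresholds and file layouts, of the kind of automated monitoring described above: score each new batch of labeled outcomes and flag the model for retraining when accuracy drifts too far below its deployment baseline.

```python
# Illustrative sketch of automated model monitoring. Thresholds, file names
# and the "outcome" column are assumptions, not a prescribed standard.
import pandas as pd
from sklearn.metrics import accuracy_score

BASELINE_ACCURACY = 0.90   # accuracy measured when the model was deployed
ALERT_DROP = 0.05          # tolerated degradation before raising a flag

def check_model_health(model, batch_path: str) -> bool:
    """Score the latest batch of known outcomes; return True if retraining is needed."""
    batch = pd.read_csv(batch_path)                # new observations with known outcomes
    features = batch.drop(columns=["outcome"])
    accuracy = accuracy_score(batch["outcome"], model.predict(features))

    degraded = accuracy < BASELINE_ACCURACY - ALERT_DROP
    print(f"batch accuracy={accuracy:.3f}  retrain_needed={degraded}")
    return degraded
```

In production, a check like this would run on a schedule (nightly, for example) and feed a dashboard or alerting system rather than a print statement.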
#advisors will quickly lose interest in a #NextBestActions list if they do not see results. Continuous monitoring of outcomes & tweaking of your personas is needed to stay relevant – @gladman22 #TheDataJourney pic.twitter.com/OsGuG4HtT2
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Most of the wealth management clients we work with have learned the lesson of “build it and they will come”. They realize that just launching new online tools doesn’t guarantee that they will attract new clients or that existing clients will use them.
The same holds true for advisor-facing tools. Advisor adoption is one of the biggest issues facing our enterprise clients that have spent millions on new proprietary technology or negotiated software licensing deals with vendors. Adoption rates of 25% or lower are typical for applications unless they’re required for trading or to get paid.
As Gladman noted, even systems that are widely used can quickly be abandoned if they fail to live up to expectations or see their usefulness degrade over time. It is a constant challenge to continually provide added value. Advisors are incentivized to generate more revenue and don’t have time to focus on any activities that get in the way of that goal, he stated.
Wealth Management firms that build listening into their culture have more successful data projects because communication is opened up and down the chain of command, Carreon stated.
After an acquisition, consolidate key back-office systems like #advisor commissions & payroll to quickly generate actionable insights, otherwise manual data merging will slow you down – Mark Pinto, @HFWealth #TheDataJourney pic.twitter.com/Wd8uFwXWDT
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Pinto described how his firm made a series of 17 acquisitions as it grew to $23 billion in AUM and 36 combined technology support staff. The executive team decided to harmonize their key systems to reduce manually intensive processes. After twelve months of consolidation, they reduced the staff from 36 to 6 while simultaneously increasing their accuracy rate from 80% to 95%.
Their team developed a methodology for integrating newly acquired firms within 60 days by creating a phased approach and incorporating lessons learned from each one. This significantly reduced the firm’s tech debt and enabled management to receive actionable data to power intuitive decisions, Pinto explained.
Gladman agreed that too many wealth management firms acquire multiple advisor compensation systems through M&A but never bite the bullet and consolidate them due to the high upfront costs. However, firms wind up spending 3-5X more over time supporting the new tech debt.
The bravest executives in wealth management are the CIOs that are willing to “go nuclear” by consolidating overlapping platforms immediately instead of kicking the can five or more years down the road, Gladman noted.
Successful data consolidation projects are ones that bypass the “tug of war” between three primary internal stakeholders: the board of directors, executive management and the implementation team, since they all have different goals, Pinto stated.
It's a mistake for #wealthmanagement executives to assume they know everything about #clientexperience & not include feedback from #advisors
– Mark Pinto, @HFWealth #TheDataJourney pic.twitter.com/twmu6sP9an
— Craig Iskowitz (@craigiskowitz) April 9, 2021
There’s a tendency of wealth management executives to assume they understand all aspects of their business, but that’s a huge mistake, Pinto insisted. Advisors are much closer to their clients through cultivating their relationships over many years, so they should be included when gathering requirements for new projects.
Humble organizations are the ones with a culture of inclusion, Pinto stated. They understand the power of involving advisors to help shape the destiny of the firm.
Pinto pointed out a common insight gap among managers across all fields when it comes to soliciting employee input. A study published in the Academy of Management Journal found that insecure managers are less likely to listen to and implement staff input.
Only three in ten U.S. employees strongly agree that their opinions seem to count at work. Even at many organizations where leaders specify that employee voices are important, employee feedback tends to get short shrift. Alternatively, leaders might not want to listen to employee feedback because of their past experiences with it.
Why don’t managers encourage voice and ideas from below when it is beneficial to them and their organizations?
Much current research on the topic suggests that managers are frequently stuck in their own ways of working and identify so strongly with the status quo that they are fearful of listening to contrary input from below. A recent paper published in Organization Science proposed an alternative perspective: that managers often fail to create speak-up cultures not because they care only about their own egos and ideas, but because their organizations put them in impossible positions.
Managers should be empowered to act on input from below and be incentivized to adopt a long-term outlook for their department, according to the Harvard Business Review.
Larger organizations can be held hostage by legacy systems that blocks them from implementing transformative technology – Mark Pinto, @HFWealth #TheDataJourney pic.twitter.com/xb1q2cnKii
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Success eventually leads to problems with a firm’s technology infrastructure. As a company grows, the systems it started with get older and slowly morph into “tech debt”. Larger organizations can sometimes be held “hostage” by their legacy systems, Pinto reported.
Digital transformation (DX) is one answer to upgrading and replacing the legacy. Unfortunately, around seven out of 10 companies that embark on DX projects never complete them. And their legacy systems keep chugging along.
Several challenges lie in the way of a successful technological transformation. First of all, support doesn’t always come from the top down, and bottom-up DX rarely gets enough traction to make a lasting splash. Plus, lots of companies make poor replacement choices, merely swapping one legacy system for another. Finally, even successful DX projects can lead to transformation fatigue if the team is expected to deliver without needed breaks.
Being successful requires a lot of discussion around transformational technology and transformational thinking, Pinto warned. Important things to consider are the cost and the process of converting the firm’s DNA at multiple levels.
Firms w/o #datastrategy have difficulty combining consumer invested asset data with their current clients to generate geographic opportunity models – @gladman22 #TheDataJourney pic.twitter.com/6tJFKqp8T7
— Craig Iskowitz (@craigiskowitz) April 9, 2021
“The companies that are going to win are the ones who are using data, not guessing,” said Neil Hoyne, chief measurement strategist at Google and a senior fellow at Wharton Customer Analytics.
There is an iterative process behind the data strategy that drives successful marketing analytics:
- Data preparation and exploration
- Analysis
- Assessment
- Achieving goals
- Repeat
Companies are automating key services, predicting patterns and making recommendations that lead to greater client engagement by leveraging AI and machine learning tools. They’re following the example of the largest consumer retail companies such as Amazon, where a third of sales come from its recommendation algorithm, and YouTube, whose algorithm drives 70% of the content watched on the platform.
A strong client segmentation methodology would include:
- Identify Business Issues
- Clarify Scope and Dimensions
- Generate and Refine Hypotheses
- Collect Needed Data
- Build the Segmentation Framework
- Link to Marketing & Business Strategy
An advanced data strategy allows marrying IXI data (demographic data) with client segmentation data in order to generate geographic opportunity models based on zip code allocations, Gladman pointed out. This is helpful for advisory firms looking to maximize the revenue potential of new offices as well as to evaluate the ratio of revenue to prospective assets available, he stated.
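As a rough sketch of such a geographic opportunity model, the code below joins an assumed zip-code-level demographic extract to a firm’s own client book and ranks zips by revenue captured per dollar of prospective assets. All file and column names are hypothetical.

```python
# Hedged sketch of a geographic opportunity model: compare captured revenue
# to third-party estimates of investable assets, zip code by zip code.
import pandas as pd

clients = pd.read_csv("clients.csv")          # firm's book: zip_code, revenue, aum
demographics = pd.read_csv("ixi_by_zip.csv")  # assumed extract: zip_code, investable_assets

book_by_zip = clients.groupby("zip_code").agg(
    revenue=("revenue", "sum"), client_aum=("aum", "sum")
)

# Keep every zip in the demographic extract, even where the firm has no clients yet
opportunity = book_by_zip.join(demographics.set_index("zip_code"), how="right")
opportunity = opportunity.fillna({"revenue": 0, "client_aum": 0})

# Revenue per dollar of prospective assets highlights under-penetrated zips
opportunity["revenue_per_prospective_asset"] = (
    opportunity["revenue"] / opportunity["investable_assets"]
)
print(opportunity.sort_values("revenue_per_prospective_asset").head(10))
```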
Having a properly aligned data architecture allows firms to identify trigger events that cause prospects to convert into potential clients, Gladman noted. The next step is partnering with the different business lines to create joint calling campaigns that should be more effective than the businesses running them separately, he suggested.
#Banks w/o standardized data architecture often have customer data stored in different formats w/ different spellings across different systems (core deposits, loans, wealth) – @gladman22 #TheDataJourney pic.twitter.com/u0YFEjxjUM
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Some of the challenges these large firms run into, Gladman explained, stem from being around for a long time: they have accumulated a lot of things that were swept under the rug or banded together just well enough to keep working.
Carreon’s point about mergers and acquisitions applies here, he noted. Those combinations were made to work, but they are not really scalable and they are very hard to transform, and data is one of the hardest parts.
Consider all the different systems that house data at a bank: a core deposit system, a wealth platform and a commercial lending system, each with the same customer listed three different ways, plus legacy batch jobs with downstream dependencies built on top of them.
The brave people, the ones willing to transform, are the ones willing to “nuke it,” Gladman said, and that is really difficult to sell internally. Sometimes it takes experienced partners to help make the business case, because getting the justification and buy-in on your own is hard. But if you look forward, doing it now rather than waiting five more years saves a lot of pain, heartache and money; keep kicking the can down the road and the problem only gets bigger.
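A tiny, hedged example of the “same customer listed three different ways” problem: normalize names pulled from different systems and score their similarity before merging. The records and threshold here are purely illustrative; a real project would use dedicated entity-resolution tooling and match on more attributes than name alone (address, date of birth, tax ID).

```python
# Rough sketch of reconciling one customer recorded differently across systems.
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, and sort name tokens so word order doesn't matter."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    return " ".join(sorted(cleaned.split()))

def likely_same_customer(name_a: str, name_b: str, threshold: float = 0.85) -> bool:
    """Flag a likely match when the normalized names are sufficiently similar."""
    score = SequenceMatcher(None, normalize(name_a), normalize(name_b)).ratio()
    return score >= threshold

# The same person as recorded in the deposit and wealth systems, plus a non-match
print(likely_same_customer("Smith, John A.", "JOHN SMITH"))     # True: same person, different formats
print(likely_same_customer("Smith, John A.", "Jane T. Smith"))  # False: different customers
```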
We've seen wealth management firms with tens of millions of dollars of #datadebt that cries out for a #datastrategy to enable data assets to be mined for valuable insights – @Jeff_Marsden #TheDataJourney pic.twitter.com/Sl9UGXqJc5
— Craig Iskowitz (@craigiskowitz) April 9, 2021
In a McKinsey survey last year, CIOs reported that 10 to 20% of the technology budget dedicated to new products is diverted to resolving issues related to tech debt. More troubling still, they estimated that tech debt amounts to 20 to 40% of the value of their entire technology estate before depreciation.
For larger organizations, this translates into hundreds of millions of dollars of unpaid debt. And things are not improving: 60% of the CIOs surveyed felt their organization’s tech debt had risen perceptibly over the past three years.
Data debt is a subset of tech debt but should be addressed separately as part of the organization’s overall data strategy. Some Xtiva clients have reported internal analyses showing well into the tens of millions of dollars of tech debt, so Xtiva assisted them with building a business case for a new data strategy, Marsden reported. This enabled those clients to remove the roadblocks that had held their infrastructure prisoner for years.
Before starting a multi-million$ data consolidation effort, make sure that all stakeholders agree to "call an audible" or pivot the project goals if things start to go sideways, don't get locked into an unattainable end result – @juliaccarreon #TheDataJourney pic.twitter.com/1qDoctbwzf
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Resiliency and flexibility have long been business buzzwords describing an organization’s ability to pivot. The biggest project pivot for every CIO was stopping everything to move their organizations to full remote work virtually overnight after the COVID-19 pandemic began.
A global crisis is orders of magnitude different from individual projects that veer off course. But the ability to quickly recognize when a course correction is needed, and how to execute it, is something of an art.
Delaying the tough decision doesn’t make it any easier, Carreon warned. It just creates a bigger mess for someone else to clean up down the road.
Having lived through a number of very large and difficult transformation projects, she advised to be more deliberate when making choices. Gather all of the available information and be prepared to “call an audible” if the project is going sideways.
Not only do you have to be willing to pivot, but wealth management firms should solicit feedback from their advisors to ensure that the new direction still matches their requirements, said Gladman.
The Founder Institute provided a list of five tips that can help make a pivot successful:
- Do it as soon as you can
- Pick new goals that align with your vision
- Don’t automatically scrap all the work that you’ve already done
- Listen to your customers and employees
- Make sure your pivot creates opportunities for growth
Beware those who want to skip data cleansing before an enterprise platform deployment since it will probably go "smoothly" w/o it — they're driving the project into a ditch – @gladman22 #TheDataJourney pic.twitter.com/FrL7xi0lEQ
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Data cleaning is the process of fixing or removing incorrect, corrupted, incorrectly formatted, duplicate, or incomplete data within a dataset.
When combining multiple data sources, there are many opportunities for data to be duplicated or mislabeled. If data is incorrect, outcomes and algorithms are unreliable, even though they may look correct.
Data cleansing is also important because it improves your data quality and, in doing so, increases overall productivity. When you clean your data, all outdated or incorrect information is gone, leaving you with the highest-quality information. This ensures your team does not have to wade through countless outdated documents and allows employees to make the most of their work hours.
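For illustration only, the pandas sketch below shows the kinds of cleansing steps described above: standardizing formats, removing duplicates and incomplete rows, and flagging suspect values. The extract and column names are assumptions, not a prescription.

```python
# Simplified sketch of the data cleansing step described above.
import pandas as pd

df = pd.read_csv("raw_client_extract.csv")  # hypothetical client extract

# Standardize formats so the same value is not stored several different ways
df["email"] = df["email"].str.strip().str.lower()
df["state"] = df["state"].str.strip().str.upper()
df["open_date"] = pd.to_datetime(df["open_date"], errors="coerce")

# Drop exact duplicates and records missing the fields downstream systems require
df = df.drop_duplicates()
df = df.dropna(subset=["client_id", "email", "open_date"])

# Flag obviously invalid values for manual review instead of silently loading them
invalid = df[df["open_date"] > pd.Timestamp.today()]
print(f"{len(invalid)} records flagged for review out of {len(df)} cleaned rows")
```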
Gladman described an enterprise project to implement Salesforce that had to be stopped for almost three months due to an 80% rejection rate because the team underestimated the amount of data cleansing required. Over-optimistic assumptions were made based on conjecture instead of reviewing data samples to build a more accurate estimate.
Millions of dollars of losses from project delays could have been avoided if the management team had simply been honest with itself and more proactive in its approach, Gladman noted.
83% of IT projects can be classified as complete or partial failures — one reason is not having the right people on the bus (from Good to Great) before starting a data consolidation project – @juliaccarreon #TheDataJourney pic.twitter.com/qbn7rXK8ti
— Craig Iskowitz (@craigiskowitz) April 9, 2021
According to a survey by KPMG, a whopping 70% of organizations have suffered at least one project failure in the prior 12 months. That means wasted resources and efforts. Putting a value on it, for every $1 billion invested in technology projects, $122M was wasted.
Another survey revealed that, on average, large IT projects run 45% over budget and deliver 56% less value than predicted.
Research on failed projects by Gartner found that the main reason was organizations’ inability to reduce complexity to match the level that the assigned team could handle. This is a corollary of what Carreon cited from Jim Collins’s Good to Great: either get the right people on the bus or change the bus to one that the people you have can drive successfully.
The right people should also include project stakeholders from all impacted departments, IT representatives as well as external partners, Carreon suggested, since they are more likely to push back when you’re headed in the wrong direction.
Building an organizational #datastrategy enables a fuller understanding the firm's data & enables improved cross selling, increased wallet share & profitability – Mark Pinto, @HFWealth #TheDataJourney pic.twitter.com/NvVjcuYAl4
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Another best practice for CTOs or CIOs facing the challenge of consolidating technology across multiple businesses is to get alignment between the board, executive management and the people who have to implement the strategy, Pinto recommended. Most organizations have a constant tug of war between these three constituents over cost, time and quality, and without alignment from a cultural point of view, these conflicts can escalate and degrade project efficiency.
Pinto’s firm, Harbourfront Wealth Management, has been an Xtiva client for a number of years and has had a very positive experience, not just from a commission perspective, but also in understanding the meaning of their data, he noted. They have worked together to understand how to use data to segment end-client relationships in order to increase cross-selling opportunities and capture a bigger share of wallet.
Another result of Harbourfront’s Xtiva-powered data strategy is reduced client churn, Pinto explained. More business comes from referrals, which start the relationship with a tighter bond than if the client came in from marketing or another source. The average number of products sold to each client also increases, which reduces flight risk; the combination has left Harbourfront with a client attrition rate close to zero, he reported.
When meeting with regulators, try to educate them so they understand how your firm is using data assets to improve #clientexperience rather than simply reacting to what you're not doing – @juliaccarreon #TheDataJourney pic.twitter.com/r05CI9CAnq
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Wealth management is a highly regulated industry, although firms have different levels of contact with their regulators. Norm Champ, a lecturer at Harvard Law School, proposes thinking about your relationships with regulators in four categories:
- relationships in ordinary periods, when no proposed regulation is being considered and no examination is underway;
- relationships when a rule is proposed, or likely to be proposed, by your regulator;
- relationships when you are being examined by your regulator; and
- relationships when your regulator is investigating your firm or individuals associated with your firm.
It is critical for the success of your business and the success of your regulator that you interact with your regulator in a constructive way in each of these circumstances.
Each business, no matter what the industry, must decide what strategy it is going to pursue with regulators, Champ said. As a former CCO of an investment management business and a former regulator, he recommends a strategy of constructive engagement with regulators.
There are those who might disagree with that strategy and advocate a posture of avoidance of regulators and even those who advocate a strategy of opposition. Champ and Carreon both argue that the strategies of avoidance and opposition are misguided and that constructive engagement is the only viable choice for a business seeking an effective relationship with its regulator.
Treat your vendors more like strategic partners & solicit their input to help break down internal data silos – Mark Pinto, @HFWealth #TheDataJourney pic.twitter.com/quX3pdecHh
— Craig Iskowitz (@craigiskowitz) April 9, 2021
Vendors and their customers haven’t always had the best relationship beyond the transaction. Historically, customers were focused on getting the lowest price and reliable delivery and vendors on getting the highest price and fast payment. While this made sense on paper, it inevitably created a zero-sum game, as vendors were seen as expendable since customers were totally focused on the bottom line number. Vendors knew they could be replaced at any time and therefore had little interest in being flexible or working in the purchaser’s best interest.
Many companies have since flipped this Draconian approach on its head by treating suppliers more like partners. In order to deliver that level of transparency and maintain a collaborative relationship, there must be an open dialogue about finances.
Business leaders must give up on the old-fashioned “kiss your customers and kick your vendors” philosophy, Entrepreneur magazine stated. If you commoditize your vendors, you should expect commoditized products or services in return. In the end, that race-to-the-bottom thinking will limit a vendor’s ability or willingness to truly add value to your business.
Harbourfront made a conscious decision to treat their vendors, like Xtiva, as strategic partners, Pinto reported. This required a shift in mindset and a different lens through which to view the value of the relationship.
Pinto also noted that vendors can provide valuable input as an external source that sees things differently. There’s no shame in admitting that someone is smarter than you, he advised; they will complement your business.