Why a strategic approach to data integration is key for mergers & acquisitions in insurance

Establishing a strategic approach to data integration is key to tackling the steady rise of M&A in insurance, as well as to accelerating an insurer’s digital transformation journey, Quantexa says.

In a new blog post, risk management firm Quantexa’s head of insurance Alex Johnson highlighted the key challenges for a successful M&A deal in the insurance industry. M&A is widely seen as one of the best ways to pursue innovation, find synergies and advance digital transformation initiatives. Recently, a string of new M&A deals of varying sizes has been hitting the headlines. The takeover and break-up of RSA by Intact and Tryg A/S, Allstate’s acquisition of National General (its largest ever), Hollard’s purchase of CommInsure from Commonwealth Bank and the expected merger between Aon and Willis Towers Watson are among the most prominent deals.

Indeed, there are tremendous opportunities to use M&A as a catalyst for digital transformation, gain a better understanding of customers and improve competitive advantage. As Johnson wrote, “Ultimately, an insurer gains access to richer and broader data which should inform customer experience and relationship management, and application, underwriting and claims processes in both personal and commercial lines. Not leveraging this data insight would be a missed opportunity.”

However, mergers and acquisitions require careful strategic planning across a range of functional areas – from legal, regulatory affairs and finance to IT and human resources. Even so, many M&A deals fail or fall short of delivering maximum value because of problems integrating disparate IT systems and data. “This can make it difficult for operational teams to serve customers effectively, often leading to friction in the customer experience as well as within portfolio risk management,” Johnson wrote.

Johnson identifies the use of conventional data management methods as the key reason for shortcomings and roadblocks in M&A deals. He detailed that, even today, IT teams turn to large-scale data migration projects or master data management (MDM) solutions to handle their data consolidation challenges after an M&A. The main concern is that these traditional solutions are not built to scale for the high volumes of distributed, disparate data generated by the various applications and external sources involved in an M&A.

Highlighting the challenges with traditional MDM, Johnson said that it does not cope well with siloed data sources, leading to data duplication and inaccurate record linking. Given that most insurance companies face significant data silo challenges even before any M&A, traditional MDM often misses connections and context. “[This] results in decision-making inaccuracy, and leaves business value on the table. In short, an ineffective MDM solution can negatively impact everything from customer experience to operational performance,” he wrote.
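To make the record-linking problem concrete, here is a minimal, illustrative Python sketch (not Quantexa’s implementation or any specific MDM product): two siloed policy systems hold the same customer under slightly different details, and an exact-key match fails to link them.

```python
# Minimal sketch (illustrative only, not any vendor's implementation):
# two siloed policy-admin systems hold the same customer under slightly
# different details, and exact-key matching fails to link the records.

silo_a = [
    {"id": "A-001", "name": "Jonathan Smith", "dob": "1980-04-02",
     "address": "12 High St, Leeds"},
]
silo_b = [
    {"id": "B-778", "name": "Jon Smith", "dob": "1980-04-02",
     "address": "12 High Street, Leeds"},
]

def exact_match(rec_a, rec_b):
    # Traditional rule: link only when name and address match exactly.
    return rec_a["name"] == rec_b["name"] and rec_a["address"] == rec_b["address"]

links = [(a["id"], b["id"]) for a in silo_a for b in silo_b if exact_match(a, b)]
print(links)  # [] -- the same person is treated as two different customers
```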

An alternative to traditional MDM is Contextual MDM. It connects siloed data and adds greater context using advanced Entity Resolution, creating a joined-up view of customers, third parties and supply chains across each business unit, drawn from data points across internal and external sources.
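As a rough illustration of how entity resolution combines evidence across several attributes rather than relying on one exact key, the sketch below uses only the Python standard library. The records, similarity weights and threshold are invented for the example; real contextual MDM platforms use far richer scoring and graph context.

```python
# Toy entity-resolution sketch: fuzzy-match on multiple attributes, then
# group pairwise matches into resolved entities with union-find.
from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": "A-001", "name": "Jonathan Smith", "dob": "1980-04-02", "address": "12 High St, Leeds"},
    {"id": "B-778", "name": "Jon Smith",      "dob": "1980-04-02", "address": "12 High Street, Leeds"},
    {"id": "C-310", "name": "J. Smith Ltd",   "dob": "",           "address": "Unit 4, Park Road, York"},
]

def similar(x, y):
    return SequenceMatcher(None, x.lower(), y.lower()).ratio()

def same_entity(a, b):
    # Combine evidence across attributes instead of requiring one exact key.
    score = 0.5 * similar(a["name"], b["name"]) + 0.5 * similar(a["address"], b["address"])
    if a["dob"] and a["dob"] == b["dob"]:
        score += 0.2
    return score > 0.8

parent = {r["id"]: r["id"] for r in records}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

for a, b in combinations(records, 2):
    if same_entity(a, b):
        parent[find(a["id"])] = find(b["id"])

clusters = {}
for r in records:
    clusters.setdefault(find(r["id"]), []).append(r["id"])
print(list(clusters.values()))  # e.g. [['A-001', 'B-778'], ['C-310']]
```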

Johnson added that Quantexa’s Contextual MDM also enables InsurTech firms to maintain “the golden record – records with the highest level of accuracy and trustworthiness that can inform your operational process and customer experience in everything from customer portfolio management, marketing, renewal and retention modelling to application pre-population – providing a richer view of all relevant attributes across those linked records.”
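The golden-record idea can be illustrated with a short, hedged sketch: given a cluster of records already resolved to the same customer, merge them by preferring non-empty values from the most recently updated source. The field names, sources and survivorship rule here are assumptions for illustration, not Quantexa’s actual logic.

```python
# Minimal "golden record" sketch (illustrative only): merge a cluster of
# resolved records, preferring the newest non-empty value for each attribute.

cluster = [
    {"source": "legacy_pas", "updated": "2020-11-02", "name": "Jon Smith",
     "email": "", "phone": "0113 496 0000"},
    {"source": "acquired_book", "updated": "2021-03-15", "name": "Jonathan Smith",
     "email": "jon.smith@example.com", "phone": ""},
]

def golden_record(cluster):
    merged = {}
    # Newer records win ties; empty values never overwrite populated ones.
    for rec in sorted(cluster, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field in ("source", "updated"):
                continue
            if value:
                merged[field] = value
    return merged

print(golden_record(cluster))
# {'name': 'Jonathan Smith', 'email': 'jon.smith@example.com', 'phone': '0113 496 0000'}
```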

It updates data dynamically as each data point arrives, rather than only in periodic batch mode, enabling companies to maintain up-to-date customer views and trigger processes in response to any data change.
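A toy sketch of that event-driven pattern, with invented identifiers and trigger rules: each incoming data point updates the customer view immediately and can kick off a downstream process, instead of waiting for a nightly batch.

```python
# Illustrative per-event update: refresh the customer view on each data point
# and fire a downstream trigger when a meaningful change is detected.

customer_view = {"CUST-42": {"address": "12 High St, Leeds", "open_claims": 0}}

def on_data_point(customer_id, field, value):
    view = customer_view.setdefault(customer_id, {})
    old = view.get(field)
    view[field] = value
    # React as soon as the change lands, not at the next batch run.
    if field == "address" and old and old != value:
        print(f"Trigger: re-run renewal pricing for {customer_id}")
    if field == "open_claims" and old is not None and value > old:
        print(f"Trigger: notify claims handler for {customer_id}")

on_data_point("CUST-42", "address", "7 Park Road, York")
on_data_point("CUST-42", "open_claims", 1)
```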

In addition, it enables data teams to build a solid foundation for effective data and analytics by simplifying data and system migration through data fusion, predictive entity and network-based AI, visualization and unified exploration. “This dramatically reduces the time and effort your data scientists spend on preparing and maintaining data and dealing with data quality issues, enabling them to focus more on high value work – for predictive analysis, decision intelligence and process automation,” Johnson wrote.

Read the full blog post here.

Copyright © 2021 FinTech Global
