The use of data is widespread, but insurers are at very different points on their integration journeys. Post, in association with MarkLogic, brought a panel of experts together to discuss best practice in this area, including effective compliance and regulatory reporting.
Insurers are creating and devouring data on an industrial scale as they strive to grasp the opportunities they believe can be unlocked by managing data better and lifting it out of legacy silos. Ultimately, they are aiming to create a single view of their customers and streamline and simplify processes that seem overly complex and off-putting to customers as the third decade of the 21st century approaches.
Overlaying this are the growing demands of legislators and regulators for transparency, exemplary data governance and sensitivity as to how personal data is gathered, manipulated and shared. Regulators also expect high quality data to be used for compliance and reporting. They want to be able to see inside the firms they regulate.
The rigorous reality of these demands was seen in last year’s implementation of the General Data Protection Regulation and the adoption of the new International Financial Reporting Standard, IFRS 17, due for its full roll-out in 2022. These have different starting points, different objectives and, on paper at least, impact different parts of an insurer’s business, but together with other regulatory initiatives they have helped force data management to the top of corporate agendas.
The impact of this was explored at the roundtable on future-proofing to meet increasing regulatory demands for data, held in conjunction with MarkLogic, where a glance down the guest list and at the job titles spoke volumes about how things are changing. Chief data officer and head of analytics and data science are job titles that did not exist a few years ago. And it would have been hard to imagine group CEOs contributing to a roundtable on data management until quite recently.
There was a speedy consensus around the table on the need to keep a clear focus on security and compliance in all data projects. Insurance businesses may now have a top-level view of how they should be using data and be on a journey towards data integration but they must not lose sight of this, warned Mark Smith, group IT and change director for the Ardonagh Group.
“Optimistically, very optimistically, there is a widespread understanding of why we need a single financial system and a single risk management system. The end solution is very obvious. The challenge is how do you get there when you have a diverse set of businesses and business applications. Alongside this is the increasing need to have robust security. That works hand-in-hand with having a clear vision of where data is going and how it is being used. It needs to be safe and secure where it is held and where it is going to.”
A shared challenge as firms embark on their data integration journeys is getting the right level of support from their boards and buy-in from across the business. One reason is that effective compliance and regulatory reporting require a high degree of consistency in how data is collected, including the development of a common language. This was an issue for everyone around the table, with numerous examples shared of just how big a challenge it is. One firm found 14 different fields called ‘pricing’ when it embarked on its data integration project, while another had eight different trade lists across its commercial insurance teams.
“This is not a technology problem. It is much more a cultural shift in how we approach the problem”, said Nicholas Gordon, chief data officer at the RSA Group.
“We need clarity around that common language. Proper data governance needs to be embedded within the business so that the business takes on accountability and ownership of data, rather than it being seen as this big IT thing locked away in some repository.”
“This will enable us to future proof what we are doing for some of the regulatory issues that might come down the line,” he added.
This brought the discussion back to the need for senior management support. “This needs senior buy-in because it crosses so many business units. The natural inclination is to do things that benefit your own business unit but not to worry so much about other parts of the business”, said Alan Tua, who recently joined Direct Line as head of analytics and data science from EY.
Hard to achieve
There was an acknowledgement that this might not always be that easy to achieve. “The bigger you are the harder it is. When you have got the scale of some of the large businesses it is very hard to get that common appreciation. For instance, how do you get all of the branches to input the data in exactly the same way?” said Shaun Hooper, group CEO at the Complete Cover Group.
Tim Yorke, head of transformation at Axa, acknowledged the scale of the challenge large firms with multiple distribution channels are facing. He explained: “The whole customer journey will become a digital journey”, something that some intermediated distribution channels were not necessarily ready for: “What we get is something written in crayon on the back of a cereal packet. Now you need data definitions that work up and down the distribution channels. That is going to require big changes and buy-in from everyone.”
This is where the push from new regulation has been beneficial, said Maxine Allen, group head of compliance at the PIB Group: “The regulatory elements have helped because you now have to provide so many different reports to different people that the man hours put in previously now mean people are beginning to see the benefits of automating everything and getting it out of one system. You no longer need to transform everything before you send it out”.
Others quickly took up this theme. “The Senior Managers & Certification Regime has been a kick up the butt for a lot of executives. They have to comply with so much regulation that it is not an option to sideline it anymore,” said Jonel van Rooyen, director of operations for Vitality.
“We have found GDPR very helpful as we have been trawling across all of the estate trying to find out where all of these datasets are and what we have to do with them… we were able to use it to embark on a step-by-step process for filtering data,” said Yorke.
Using regulation as an incentive to engage senior management, and as a push for the rest of the business to take data integration and management seriously, is one route to getting real momentum behind large-scale projects. Many of these require grappling with legacy systems and extracting data from platforms that, established businesses often find, nobody left in the organisation fully understands. This leads to a fear that another data project might simply create another legacy silo, or might demand a scale of investment that boards are reluctant to support.
“The pressures placed on legacy system capabilities by regulatory requirements can see multi-million pound projects take years to complete, or worse still, fail mid-project. Insurance companies need to embrace a culture shift and take a more pragmatic, iterative approach when it comes to regulatory reporting. This means taking a longer-term look at how the technology solutions implemented can support the wider business, and focusing on the most precious commodity at the heart of their operations – data,” said Alastair Burrows, account director at MarkLogic.
“Across my executive board there is a long history of IT rocking up and asking for millions of pounds to do data stuff. If you pop up and ask to do another one of those the answer is going to be either no or ‘I am not going to give you all the money you are asking for’,” said Gordon.
He said many companies had taken the “big bang” approach and it had backfired: “They talked ambitiously about creating a data lake but found that a lake quickly becomes a swamp if not managed effectively.”
The answer is to take a more iterative approach, rolling out projects a step at a time with achievable objectives and avoiding the complex multi-year programmes of old.
“Think big, act small,” urged Gordon: aim to get value out of every step.
Another reason for avoiding complex multi-year projects was the simple fact of the pace of technological change, said Smith. It often meant that by the time a project was completed, technology would have overtaken it.
Tua agreed that it was hard, if not impossible, to make a big bang change in a large business. He said that in order to get support from senior management and that crucial buy-in from front-line business units there should also be a focus on the business benefits when making the case for a data integration project.
“The opportunity is leveraging these large programmes, which achieve compliance, for their strategic value as well. If you have good quality reporting you have to look at how you can link that with organisational and product design change to make your business more innovative.”
Smith agreed: “When I asked if they [the board] would like to use this data to get a single view of the customer everyone’s eyes lit up. It became the icing on the cake.”
Gordon said that when he was with Gallaghers he was part of a large data project to comply with US Generally Accepted Accounting Principles rules and political sanctions regulations: “We said do you want to do more with this data than just sanctions screening and financial reporting because we can look at the new business opportunities it might unlock? Then the marketing people joined in and they said it was a great conversation.”
Using data this way can bring great business benefits, especially in forging new, stronger relationships with customers, but it has to be done with effective governance at its heart, said van Rooyen.
“A company that has done this well is Aviva. As a customer you can go onto their platform and you can see all of your products, you can switch and do everything. It might not be perfect yet but it has been very successful.
“It started from a governance perspective. It set up a whole new entity that owned all of the data and had the CEOs of all the business units sitting on the board.”
Gordon said he could see the attraction of creating a single legal entity but cautioned that “you are going to have to integrate back into the mother ship at some stage”.
When it comes to the future, it is the regulatory emphasis on conduct risk around the world that will require the biggest adjustments, said Yorke.
“From a regulatory point of view, one of the things that is really driving us is the increasing requirement to demonstrate the impact on the customer of the product we have sold them. You have to be able to evidence that. This means you have to start thinking differently about your dataset. It is no longer just a pricing dataset, or a coverage dataset. It has got to represent the customer perspective. How do you assess that?
“This is very helpful because it means we have got to change the way we collect and store that data.”
It also affects how data is gathered and shared across complex supply chains, said Allen: “You have to be able to demonstrate that this applies to everyone in the supply chain above and below you.”
It would, however, unlock opportunities for new approaches to monitoring conduct risk and internal compliance efficiency, said Tua.
“You can measure your conduct efficiency. Technology scales up possibilities. Historically, in a risk function you sample something; for example 1% or a half of 1% of calls to check for conduct. Now you can get technology to do 100% of the calls and capture the 1% that are likely to cause you an issue”.
This was something the Financial Conduct Authority is keen to encourage now that it has shed its previous wariness about large-scale data applications being used in this way.
One point everyone agreed on was that the pace of change – both technological and regulatory – means flexibility is key.
“You have to take a very agile and flexible approach to the type of technology you are purchasing. It has got to be open because you can’t predict the future, so we need to build that flexibility into every project,” said Gordon.