Roundtable: Scaling up new technology while avoiding the pitfalls


A recent Insurance Post roundtable held in association with Insurants explored the complexity issue. Valerie Hart reports.

  • Sam Bagnall: Commercial transformation director at Axa
  • Jonathan Davis: Data science lead at Zurich
  • Richard Easterbrook: Director of commercial insurance at Howden
  • Rob Flynn: Chief transformation director at RSA
  • Sarah Hyman: Retail operations director at Brown & Brown Europe
  • Matthew Lambert: UK&I head of pricing and underwriting – commercial and specialty lines at WTW
  • Simon Pink: UK head of emerging technology at QBE Ventures
  • Ben Platts: Chief operating officer at Insurants
  • Meric Stanley: Lead developer at Konsileo
  • Laurence Trigwell: Chief revenue officer at Insurants
  • John Wakeman: CEO at Accelerator, Ardonagh Advisory

The biggest challenges

Increased automation and artificial intelligence-powered solutions should herald considerable benefits in efficiency, competitiveness, and client outcomes. However, progress is slow due to complexity issues and legacy infrastructure and processes.

Processes are harder to fix in corporate insurance’s complex and bespoke world than in the more standardised personal lines/SME sectors. The roundtable participants agreed it becomes harder to get insurers, brokers and technology partners to converge on a shared standard when describing more complex risk features.

Firstly, what are the biggest data extraction, enrichment, and movement challenges between various parties?

According to Matthew Lambert, UK&I head of pricing and underwriting for commercial and specialty lines at WTW, it is the mismatch between the level of data a broker might view as relevant or necessary to place a risk and the granularity and type of data the underwriter or carrier wants.

This is compounded by a lack of consistency among insurers and products regarding what data capture is required and ways of sharing data.

Many brokers continue to rely on manual processes. “We are still forming our strategy around this because our processes are still quite manual and done in different ways, and we end up having to share multiple spreadsheets of data. So potentially, we are missing out on important outcomes, which, quite frankly, takes away from trying to get quality placements,” acknowledges Sarah Hyman, retail operations director at Brown & Brown Europe.

One size does not fit all

Richard Easterbrook, director of commercial insurance at Howden, warns the industry not to forget that clients are different and that one size doesn’t fit all. He also cautions the industry against allowing itself to be deskilled through standardisation, and he rejects the idea that clients who might look the same on paper, because of external data, can be presented with a solution without any interaction.

“I think data is really important, but it’s not a panacea. At times, we can view it as the only important thing in driving the right client outcome,” says Easterbrook.

He points out lessons to be learned from personal lines and SMEs. “The increased use of data and technology has led to poorer client outcomes and the extraction of capital from the market, because when you standardise, the danger is that the only measure of value is price. It then becomes a race to the bottom, and you end up with covers being extracted because there is only one way to go: to reduce cover.”

Sam Bagnall, commercial transformation director at Axa, explains that her firm hasn’t imposed a standard and insisted everyone gives them the same data. Instead, the insurer has employed new technology that enables them to extract what they need from what they’re given. “We’re getting cleverer on our side and interpreting the data rather than insisting on a template being filled in.”

The attendees agreed the industry must not lose focus on the client.

“There’s almost a view that if it involves technology and data, then it’s good, and, at worst, the outcome is immaterial because the toys are really whizzy and we have to use them. However, unless we benefit clients and it’s not just about driving efficiency for us, we will potentially struggle as an industry,” says Easterbrook.

Lambert highlights how challenging this is in commercial lines, because insurers are trying to standardise the data without standardising the product. Within personal lines and, to a certain degree, SMEs, data has been standardised by creating standardised products.

“The product needs to remain flexible and editable by the underwriter as part of the negotiation, but we want to standardise a good chunk of the data for process efficiency and for the value that can be extracted from that data once it becomes structured. It is about lowering the cost of getting that data into the shape where we can analyse and understand it,” says Lambert.

Taking away the boring stuff

Everyone agreed that data-driven solutions aren’t about replacing humans. They’re about eliminating mundane tasks, such as keying in data or doing administration, that divert people from applying their skills.

Rob Flynn, chief transformation director at RSA, doesn’t see why brokers can’t continue submitting their cases the way they’ve always done. “We can then enrich it with things the broker can’t get, form a triage, and direct their skills to the right tasks.”

Simon Pink, UK head of emerging technology at QBE, says: “I’m very conscious that this is an industry built on data. However, we need to be cognisant of spending more time with our clients and understanding their needs. We are now in a position where we can personalise that interaction based on our data.”

He explains how QBE has had some success deploying a large language model in North America to triage its cyber submissions. If bits of data are missing, the model can craft an email for the underwriter to review, helping the underwriters organise their work.

“This has been well received. They are still in control and look at everything, but it’s given them the ability to prioritise their work within minutes,” Pink says.

Data quality

The way data is presented is often hard to interpret and understand. John Wakeman, CEO at Accelerator, Ardonagh Advisory, contends that data quality needs to be considered in conjunction with the process or the people engaging in that process and how they can benefit; otherwise, there won’t be a lens on what quality problem is being solved.

“Commercial lines are inherently opaque and need to be at certain levels, but as you come down that continuum, you end up at some point where they can be standardised. Where is that appropriate point of intervention where we are prepared to accept a level of opacity and lack of content because there is enough value in it?” says Wakeman.

According to Easterbrook, there has to be an end goal in improving data quality and accuracy. Even if technology solves a problem, if there’s no fundamental outcome for clients or the industry, there’s no point.

“Is it more accurate pricing? What does that mean? Accurate pricing could mean that someone we know now from the data will never claim, and we can charge them a pound. People with a much higher propensity to claim will get charged a fortune, and you will lose a degree of the pooling of risk,” he says.

There were calls for redesigning the whole process, particularly in relation to emails. Wakeman gives an example of the 20-step process his start-up used to try to get PI cover: “At least half of that 20-step process was due to the owner not understanding the questions being asked, nor did the broker, who therefore couldn’t translate it back to the owner. If we had all jumped on a call together, we would have nailed it in five minutes—it ended up taking five weeks back and forth via email.”

“At Ardonagh, our teams capture risk data and want to send it as a presentation to the market. They press a button, which creates a presentation automatically, but then it goes to email. I look at our account executives, and the defining characteristic of a good account executive in our business is someone who can manage their folders in Outlook unbelievably well. At any one time, they are working on 20 or so risks, and maybe half a dozen requests to quote, and it’s just not an efficient use of time,” says Wakeman.

Meric Stanley, Konsileo’s lead developer, agrees the industry needs to find better ways of sharing data between brokers and insurers. “This is a problem we experience specifically because we are small. We have our in-house software, and the first thing we did was get rid of our Outlook folders. But what we really want to get rid of is the slew of emails.”

Wakeman says: “This is a platform-centric process if I ever saw one. With all the right communication mechanisms, you have shared the first set of information, and then move into referral, but it’s not quite enough for what the underwriter needs. How then can the underwriter quickly get a response so it’s live and not coming a week later?”

A coordinated industry response

Even if insurers automate broker processes with no industry platform, they will still have to deal with emails and attachments.

Lambert explains how many insurers are investing in data ingestion technology to extract from emails and their attachments the roughly 20% of data points that carry most of the value. Most of that 20% already exists in a broking platform or solution. However, by the time the insurer receives it, the data the client sent has often already been rekeyed, or the broker has invested in its own ingestion technology. This capability is therefore moving increasingly toward the broker and client. Even larger corporations with an internal risk management function might start to adopt this technology and, with it, their own risk management platform.

The industry’s attempt to bring together a standardised question set is imarket, which connects brokers to insurers for digitally traded business. However, coming off imarket and going direct to insurers via extranets, for example, leads to different outcomes because there is more flexibility in the questions.

Meanwhile, brokers still need to be able to differentiate themselves to remain competitive.

Easterbrook believes that AI can really help brokers identify valuable talking points with clients. “Brokers can’t be experts in everything. AI can pick up and draw relevant information in and help us shape the conversation with clients,” he says.

According to Jonathan Davis, data science lead at Zurich, as data grows exponentially, AI could help a human navigate the vast data landscape and try to understand what to ask for.

Lambert agrees. “Every insurer has different bits of information that are important to them. That’s also really important for a functioning market.”

Legacy challenge and moving forward

Many companies are faced with how to integrate these solutions into their existing processes with the least disruption.

“Even if you get as much information as your actuaries and data scientists need, can you actually use it? Is there still the constraint of your system only accepting certain things, or will it cost a fortune to make changes to the system?” Ben Platts, chief operating officer at Insurants, says.

For Lambert, the challenge and costs come from pooling data and linking different systems. He sees this as a significant step for the big insurers, given their historic lack of investment in technology.

“In insurance, to really innovate, you have to be small, but to underwrite lots of insurance, you have to be big, and that’s a market problem,” Lambert says.

Easterbrook argues the market needs to see a real challenge or something that will make a big difference to prompt change. “Innovation and AI use tends to be around the edge – where can we apply it more easily to make a difference rather than fundamentally? And the investment required to change insurers’ back offices and systems fundamentally would be so huge that the outcome doesn’t warrant the investment.”

However, Davis claims climate change now presents a challenge, with the industry having to fundamentally reevaluate property risk. “At Zurich, we have an algorithm that allows us to combine address data, which gives us a holistic view of property, and anticipate risks that are not experience-based anymore.”

Lambert says: “These are big, complex organisations rolling something out across every line of business, product, and distribution channel – and it takes a huge amount of effort to see that through. I think there is tangible evidence that progress is being made, but it takes time.”
