Analysis: Bad data - A risky business for insurers?

Data overload

Data is everywhere: good data, partial data, valuable data, bad data. Insurers need data to measure and price risks, but do they take enough care in assessing the quality and provenance of the data they are using, and are their processes sufficiently robust?

The tools that are now available to consume, manage and interpret massive datasets are so powerful that there is growing concern insurers could be feeding in data that contains underlying weaknesses.
