AI in claims prompts liability concerns for insurers

Insurers are under growing pressure to integrate artificial intelligence into claims handling, but a new report has warned that efficiency gains may come at the cost of new liability exposures.

DAC Beachcroft’s AI in Claims study found that insurers see significant potential in generative AI to accelerate processes such as fraud detection, triage and analysis of unstructured data. But the law firm cautioned that adoption also raises fundamental governance questions around bias, explainability and accountability.

Pete Allchorne, partner at DAC Beachcroft, said: “AI is a hot topic just now, but there is a lot of hyperbole and PR noise around it, which confuses the picture for those needing to make proper and diligent assessments to guide their organisations into the future. This report is designed to bring some much-needed clarity.”

While most executives interviewed for the report rejected the prospect of fully automated claims decisions, they acknowledged that the growing use of algorithms places added pressure on boards to demonstrate oversight. Regulators and courts are unlikely to accept "the machine did it" as a defence, meaning liability will rest with the insurer – and potentially individual directors.

The full report will be covered in detail in the next issue of CIR Magazine.
