A technical detail?

CIR looks into preparations for Solvency II and investigates how technology is paving the way for compliance

With the Solvency II regulations due to come into effect in less than three years, the pressure is on insurers and reinsurers to improve their risk frameworks. A recent survey commissioned by the European Commission found that 41 per cent of insurers are already in the process of doing just that; those that have yet to start will have to make it a priority for 2010.

"It is fundamental that insurers do not underestimate the level of cost and management time that will be needed to implement Solvency II and demonstrate compliance by October 2012," says Kirstie Gordon, an insurance specialist at financial services group BDO. "A timely, detailed GAP analysis is the key to a smooth implementation process and to managing associated cost effectively."

According to Simon Margetts, senior manager in Ernst & Young's European Actuarial Services practice, organisations should by now have started work on satisfying the pre-application criteria, be well on the way to completing the implementation plan for their internal model, and have begun preparing the pre-application work and documentation surrounding their data streams and systems.

A vital part of all of this is selecting risk assessment software that allows insurers to consolidate data from multiple sources into a single, centralised solution and to generate concise reports and audit trails through a single interface. Meeting such a requirement will not only help organisations comply with the Solvency II demands but will also allow senior managers to make better-informed decisions about risk and manage their cash reserves more effectively.

Bart Patrick, head of insurance at business intelligence software provider SAS UK, says that over the past two years organisations have gradually been replacing tactical solutions, based purely on the commercial aspect of risk assessment, with more strategic systems. "Insurers are now looking at adopting group-wide solutions to enable them to identify, assess, measure and control risk exposures, creating an accurate understanding of risks in line with appetite and helping them to create competitive advantage," he says.
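
To illustrate the principle, the Python sketch below shows one simple way records from multiple sources might be merged into a single view while an audit trail is kept. It is a minimal sketch only; the feed and field names are invented for the example and do not reflect any particular vendor's product.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical source feeds: in practice these would be policy
# administration, claims and finance systems, not in-memory lists.
SOURCE_FEEDS = {
    "policy_admin": [{"policy_id": "P001", "gross_premium": 1200.0}],
    "claims":       [{"policy_id": "P001", "incurred": 450.0}],
}

@dataclass
class AuditEntry:
    source: str        # which feed the record came from
    record_key: str    # the policy it relates to
    loaded_at: str     # UTC timestamp of the load

def consolidate(feeds):
    """Merge per-policy records from every source into a single view,
    writing an audit entry for each record consumed."""
    merged, trail = {}, []
    for source, records in feeds.items():
        for record in records:
            key = record["policy_id"]
            merged.setdefault(key, {}).update(record)
            trail.append(AuditEntry(source, key,
                                    datetime.now(timezone.utc).isoformat()))
    return merged, trail

central_view, audit_trail = consolidate(SOURCE_FEEDS)
print(central_view["P001"])          # one consolidated record per policy
print(len(audit_trail), "audit entries")
```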

The last 12 months have also seen simpler and cheaper software packages come on to the market, says Margetts, making them more feasible for smaller companies. These are often explicitly linked to the QIS4 template, he says, giving firms a simple starting point for the implementation of an internal model. "Most providers have a range of options in terms of the software and support that they offer. This goes from providing an off-the-shelf product along with some training to a full hand-in-hand model implementation, where the model is handed over along with the associated software at the end of the project," he explains.

The key issues for most insurers when it comes to upgrading or installing new software are transparency, power, flexibility and scalability, says Karl Murphy, a partner at EMB, which offers versions of its Igloo product for small models, for larger models run on desktop PCs, and for very large models run across multiple processors and machines over the internet.

"They require transparency in the sense that they need to see what the model is doing and interrogate the results; power in terms of the range of stochastic techniques and business applications within the software and the time it takes to run sophisticated models; and flexibility from the point of view of linking models and importing and exporting from other packages, such as Excel or an economic scenario generator," he says. "The scalability issue is also important as firms might want to start modelling in a small way and grow their capability."

But while insurers and reinsurers are being forced to confront the question of which model and software to deploy, software vendors too are facing a critical period, claims Thomas Brouwer, head of product management at risk and compliance software provider FRSGlobal. "Vendors have to satisfy demand from the insurance industry not just for improved modelling tools, but also for faster computing speeds involving ever-more complex calculations," he says.

"There's also a demand for a whole range of other features in systems, including coverage of the broader regulatory issues relating to models and the more qualitative requirements of pillars II and III of Solvency II." Solutions range from in-house built solutions to bespoke and tailored packages, he adds, but to reduce maintenance costs most insurers tend to opt for standard software with a high degree of integration that also offer elements of customisation. Margarita Von Tautphoeus, head of the Solvency II consultancy team at reinsurer Munich Re, agrees that service providers have work to do. "They tend to offer more solutions based on the current QIS4-standard formula spreadsheet but there must be a trend towards server-based, auditable software solutions," she says.

She believes there will be a mix of different software tools in future, run by either an in-house risk management unit or an external provider. These could include a local data warehouse, a tailored actuarial calibration tool from a service provider and licences for external scenario generators. But she warns against so-called "black box" solutions, as regulators will need to audit not just the findings but also the data, methods and tools used to reach decisions.
Locally stored files spread across different countries or legal entities, without a well-documented concept behind them, will not be accepted by European regulators in a few years' time, she adds. "Open source solutions with customised modelling applications and data integration are one way of overcoming this," she says. To this end, Munich Re is the main sponsor of an open-source software platform known as Pillar One, which focuses on the enterprise risk management needs of insurers, she says.

The race to prepare for Solvency II has, however, revealed other issues that need to be factored into any decisions around software investment. Srini Venkat, vice-president for product strategy within Oracle's Global Insurance Business Unit, points out that many insurers still rely on multiple legacy core applications, which has led to unreliable data and a reliance on manual processes.

"This aspect of the implementation needs to be carefully considered to produce accurate and up-to-date data," he says. "Carefully designed data models and corresponding analytic applications will ensure the data transparency necessary to quickly meet the Solvency II demands." Indeed, due to volume and complexity, data organisation and planning typically represent 60 to 80 per cent of a risk project's cost.

"In terms of implementation of software, insurers need to consider the underlying sources of data, their provenance, management and quality, and the associated controls," adds Chris Ling, senior adviser, Ernst & Young IT Advisory Services. "Most software tools will reside in and around the reporting architecture area and implementers need to consider the population architecture and varying sources of data, and the controls and governance around the data storage architecture. It is not always acceptable to point software at the most convenient data sources because of implications on control frameworks and the wider architecture."

There are other barriers, too, relating to hardware performance that must be taken into account, says Johnston. "The extensive computing power required to perform projections can create bottlenecks in the reporting process," he warns. "Additionally, manual interfaces such as Excel and sequential processes are required to intervene and prepare these calculations. The more layers in the reporting process, the more computing power is required."
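
One standard way to relieve such bottlenecks is to distribute independent scenario runs across processor cores rather than running them one after another. The Python sketch below illustrates the point only; the "projection" is a CPU-bound stand-in invented for the example, not a real liability model.

```python
import math
import time
from multiprocessing import Pool

def run_scenario(seed):
    """Stand-in for one expensive projection run; a real model would
    revalue assets and liabilities under the scenario instead."""
    acc = 0.0
    for i in range(200_000):
        acc += math.sin(seed + i)
    return acc

if __name__ == "__main__":
    seeds = range(32)

    start = time.perf_counter()
    sequential = [run_scenario(s) for s in seeds]   # one after another
    t_seq = time.perf_counter() - start

    start = time.perf_counter()
    with Pool() as pool:                            # one worker per core
        parallel = pool.map(run_scenario, seeds)
    t_par = time.perf_counter() - start

    print(f"sequential {t_seq:.2f}s, parallel {t_par:.2f}s")
```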

Internal communication issues will also need to be addressed if organisations are to benefit from the efforts being put in now to ensure compliance with Solvency II and to improve their risk mitigation. EMB's Murphy argues that models need to become embedded within the organisation and used right across the business. "That means everyone within an organisation understanding how they input to the model and benefit from its outputs," he says. "Modellers themselves have to be able to communicate their analyses in language that the senior management team will understand."

The Solvency II requirements are not about to go away, and time is running out for those insurers and reinsurers that have yet to tackle the issue seriously. But those that create a transparent organisation with easy access to data early on will not only meet the regulatory requirements but also ensure they are working with a viable model going forward. In light of the recent failures across the financial sector, that is certainly no bad thing.
