International re-insurance company builds data lake to accelerate processes

Single global data source in the cloud enhances information flow

An international re-insurance company, which provides financial protection to insurance companies, wanted to improve its actuarial processes and provide faster access to data for smarter decision making. It was also looking for a powerful way to introduce new services for its customers in the future, helping them differentiate themselves in an increasingly dynamic digital marketplace.

Fast, efficient access to data sources

The company uses vast amounts of data, including weather reports, satellite images, sensor data from vehicles and more, which it buys from a variety of sources. This data is key to protecting insurance companies against large and complex risks, such as natural disasters and cyber attacks, and specialist underwriters use it to make informed judgements.

The company was looking to change its business model to become a “data-as-a-service” company. As such, it needed an innovative solution capable of storing and handling huge quantities of data for a diverse group of analysts and actuaries to access globally. This would, for example, allow on-site appraisals to be deployed faster, reducing both claim processing times and costs.

It was essential that the solution be able to work with unstructured and semi-structured data, including emails, images and video files. The data could include satellite images taken after tornado damage, climate change data or emails following up on a security breach claim. It also had to be trustworthy and easy to extract.

At the same time, the company wanted to use the cataloged data to explore and develop new business services and further improve customer service by offering faster, easier access to data.

The company opted to build a data lake to store data in its source format at scale. This would give it the flexibility to store data in different formats and to define the data structure only when it is actually needed, an approach known as schema-on-read. To gain real agility, reliability and availability, and to deliver services in a modular way, the company chose to build its data lake in the cloud.
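The source material doesn’t name the processing engine, but as a minimal sketch of this schema-on-read approach, the Python snippet below (assuming Apache Spark, a common choice on Hadoop-based lakes) leaves raw JSON files in the lake untouched and applies a structure only at query time. The path and field names are purely illustrative:

```python
# Minimal schema-on-read sketch with PySpark: raw files stay in their
# source format in the lake; a schema is applied only when queried.
from pyspark.sql import SparkSession
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("schema-on-read-demo").getOrCreate()

# Define the structure only when the data is actually needed ...
sensor_schema = StructType([
    StructField("vehicle_id", StringType()),
    StructField("recorded_at", TimestampType()),
    StructField("speed_kmh", DoubleType()),
])

# ... and apply it while reading the raw JSON dropped into the lake as-is.
readings = (
    spark.read
    .schema(sensor_schema)
    .json("/datalake/raw/vehicle-sensors/")  # hypothetical landing zone
)

readings.filter(readings.speed_kmh > 120).show()
```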

To architect the data lake, a team of experts was brought in, including specialists from The unbelievable Machine Company (*um), to carefully map out the project and its migration to the cloud so that day-to-day business would be disrupted as little as possible.

Starting on premises

The company started out with an on-premises proof-of-concept data lake built on Hadoop technology, using SAS analytics and SAP HANA for structured data. This was then to be moved to the cloud. Initially, the company didn’t specify which cloud, but it eventually selected Microsoft Azure.
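The case study doesn’t describe the migration tooling, but one way to picture moving proof-of-concept data into Azure is the sketch below, which uploads exported files to Azure Data Lake Storage Gen2 using Microsoft’s Python SDK (azure-storage-file-datalake). The account, container, paths and authentication method are all placeholders:

```python
import os
from azure.storage.filedatalake import DataLakeServiceClient

# Connect to the target lake in Azure; account name and key are placeholders.
service = DataLakeServiceClient(
    account_url="https://<storage-account>.dfs.core.windows.net",
    credential=os.environ["AZURE_STORAGE_KEY"],  # assumed auth method
)
lake = service.get_file_system_client(file_system="raw")

# Hypothetical directory of files exported from the on-premises Hadoop cluster.
local_export = "/exports/hadoop/claims/"
for name in os.listdir(local_export):
    with open(os.path.join(local_export, name), "rb") as f:
        # Mirror the on-premises layout in the cloud lake, file by file.
        lake.get_file_client(f"claims/{name}").upload_data(f, overwrite=True)
```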

To make all this possible, the company needed a partner with a multicloud approach and solid Hadoop experience, one that could architect a data lake in the cloud and migrate data to it. In essence, it wanted a company that was a Swiss Army knife: one that could do everything. The company conducted its research, and *um fit the profile.

Moving to the cloud

Moving the data lake to the cloud was the next stage in the project. This was complex, as the data lake needed to be set up in the cloud without interrupting business operations. The key was to keep services running on-premises until they were stable in the cloud, and only then switch them over.
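Conceptually, this cutover works like a routing switch: consumers read from the on-premises copy until the cloud copy has been validated, at which point the authoritative source flips. The sketch below is purely illustrative; the paths and the readiness flag are assumptions:

```python
# Parallel-run cutover sketch: both copies are kept in sync, and reads
# stay on-premises until the cloud copy is declared stable.
ONPREM_ROOT = "hdfs://onprem-cluster/datalake/"                    # hypothetical source
CLOUD_ROOT = "abfss://raw@account.dfs.core.windows.net/datalake/"  # hypothetical target

cloud_validated = False  # flipped only after checks (e.g. row counts) pass

def active_root() -> str:
    """Route consumers to whichever copy is currently authoritative."""
    return CLOUD_ROOT if cloud_validated else ONPREM_ROOT
```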

In addition, the data needed to be cataloged so that users could easily navigate it, find what they wanted and retrieve it. Without careful cataloging, a data lake can easily turn into a data swamp.
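To make the cataloging idea concrete, here is a minimal, hypothetical sketch of what a catalog record and lookup might look like. A real deployment would use a dedicated catalog service, but the principle is the same: every dataset gets searchable metadata so it can be found and trusted later.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    path: str                 # where the raw data lives in the lake
    source: str               # who supplied the data
    data_format: str          # e.g. "json", "geotiff", "eml"
    description: str
    tags: list[str] = field(default_factory=list)

# Illustrative entries matching the kinds of data mentioned above.
catalog = [
    CatalogEntry("/raw/satellite/2024/", "vendor-a", "geotiff",
                 "Post-storm satellite imagery", ["tornado", "imagery"]),
    CatalogEntry("/raw/claims-email/", "internal", "eml",
                 "Follow-up emails on cyber claims", ["cyber", "email"]),
]

def find(term: str) -> list:
    """Return entries whose description or tags mention the search term."""
    term = term.lower()
    return [e for e in catalog
            if term in e.description.lower()
            or any(term == t.lower() for t in e.tags)]

print(find("tornado"))  # -> the satellite imagery entry
```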

Some 1,200 people across the company will be using the data lake by the end of the year, including analysts, sales managers and service architects looking for new services to launch. The company is encouraging users to put data into the data lake, because the more data it has, the more accurate its forecasts can be.

*um manages the data lake and provides support on an ongoing basis, allowing the company’s IT team to focus on business initiatives.

Data lake: powering effective analytics

The data lake enables the re-insurance company to store, prepare and analyze its big data more effectively, and to glean valuable insights from it. It acts as an on-demand data repository that gives users around the world secure access to the information they need to better serve customer requests. With no additional hardware investment or traditional licensing agreements, the re-insurance company has the added benefit of paying only for what it uses.

The company is also looking at launching new services on the back of the data lake that will enable insurance companies to differentiate themselves. At a basic level, this could include a service providing natural disaster damage images that would enable insurers to pay out on claims more quickly. The company believes there is huge potential for a host of innovative services and has asked employees and customers to volunteer their own ideas.

By putting your data lake in the cloud, you can store all your data in its native format for better accessibility and flexibility. Learn how to improve your analytics capabilities in our guide: Make Your Analytics More Agile.