Understanding Enterprise Risk through Modeling

Posted on: 30 May '12

Ten days after Japan's 2011 earthquake and tsunami, the Japanese government estimated a total national economic loss of USD 200-300 billion. The natural disaster brought about a ripple effect across the world: operations of multinational enterprises with manufacturing units in Japan stalled or slowed down as their supply chains took a hit.

Meanwhile, in the insurance world, a specialized process called catastrophe modeling was engaged to estimate the direct and indirect losses from the disaster. Catastrophe (CAT) modeling tools are statistical tools that quantitatively overlay natural hazards (floods, earthquakes, etc.) and man-made ones (terrorism) on man-made structures (buildings, railroads, dams, etc.) to assess property damage. They are used primarily by insurers to estimate the risks of insuring businesses, large or small.

CAT modeling is engaged early in the policy life cycle, when a customer approaches an insurance company to buy a policy for a property (say, an office building or a factory). An underwriter (who estimates the premium for the policy) evaluates the property by quantitatively accounting for its various parameters. These usually include:

  • Accurate location information
  • Physical characteristics of the property
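As a rough illustration, the parameters an underwriter collects might be captured in a record like the following Python sketch. The field names and values here are hypothetical, not the schema of any particular CAT tool:

```python
from dataclasses import dataclass

@dataclass
class PropertyRisk:
    # Location information (more precise geocoding yields better damage estimates)
    latitude: float
    longitude: float
    address: str
    # Physical characteristics of the property
    construction: str         # e.g. "reinforced concrete", "wood frame"
    occupancy: str            # e.g. "office", "factory"
    year_built: int
    num_stories: int
    replacement_value: float  # rebuild cost in USD

# A hypothetical office building submitted for underwriting
office = PropertyRisk(
    latitude=35.6762, longitude=139.6503, address="1-1 Example St, Tokyo",
    construction="reinforced concrete", occupancy="office",
    year_built=1998, num_stories=12, replacement_value=25_000_000.0,
)
```

In practice the submission carries many more attributes (soil type, roof geometry, sprinkler systems, and so on), but the shape is the same: location plus physical characteristics.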

Location accuracy depends on geography, since advanced economies have relatively precise geocoding resolution. For instance, the chances of placing a US address exactly on Google Maps are higher than for an address in an emerging economy. Accurate placement on a map gives the added advantage of better estimating catastrophe damage.
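One simple way a tool might account for geocoding resolution is to widen the loss estimate as the location match gets coarser. The resolution tiers and uncertainty multipliers below are invented for illustration, not taken from any real model:

```python
# Hypothetical geocoding resolutions, best to worst, with an assumed
# uncertainty multiplier applied to modeled damage estimates.
GEOCODE_UNCERTAINTY = {
    "rooftop": 1.00,      # exact coordinate match
    "street": 1.05,
    "postal_code": 1.15,
    "city": 1.30,         # coarse match, more common in emerging economies
}

def adjusted_loss(modeled_loss: float, resolution: str) -> float:
    """Inflate the loss estimate when the location is imprecisely geocoded."""
    return modeled_loss * GEOCODE_UNCERTAINTY[resolution]
```

A rooftop-level match passes the modeled loss through unchanged, while a city-level match pads it to reflect the extra uncertainty about which hazards the property is actually exposed to.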

Once a location is well defined, the CAT modeling tool applies the relevant catastrophe (mathematical) models, based on the coverage sought in the policy. The result is a set of statistics outlining probable loss – intimidating at first sight, and quite often the winning emotion is regret at not paying more attention in high school statistics class. From these results, the underwriter can calculate the risk the insurer would bear.
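The loss statistics step can be sketched in a few lines of Python. Everything here is illustrative – the event set, annual rates, and damage ratios are invented, and real CAT models use far richer hazard and vulnerability components – but it shows the two numbers underwriters commonly look at: the average annual loss and the probability of exceeding a given loss in a year.

```python
import math

# Hypothetical stochastic event set: (annual_rate, damage_ratio).
# Each event's loss is damage_ratio * insured value.
event_set = [
    (0.10, 0.02),   # frequent, minor shaking
    (0.02, 0.15),   # moderate earthquake
    (0.004, 0.60),  # rare, severe earthquake
]

insured_value = 25_000_000.0

def average_annual_loss(events, value):
    """Expected loss per year: sum of rate * loss over all events."""
    return sum(rate * ratio * value for rate, ratio in events)

def exceedance_probability(events, value, threshold):
    """Annual probability that at least one event causes a loss of
    `threshold` or more, assuming Poisson event arrivals."""
    rate_above = sum(rate for rate, ratio in events if ratio * value >= threshold)
    return 1.0 - math.exp(-rate_above)

aal = average_annual_loss(event_set, insured_value)          # expected yearly loss
ep = exceedance_probability(event_set, insured_value, 10_000_000.0)
```

The underwriter can then price the policy so the premium covers the average annual loss plus a margin for the tail risk captured by the exceedance curve.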

By integrating this process seamlessly into its daily operations, an enterprise can better measure the risk aggregation that results from adding new business to its existing portfolio. Advanced analytics enables a better understanding of existing risks at various levels, and understanding those risks alongside the firm's risk appetite empowers risk management to make informed capital and business decisions for future growth.
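The aggregation idea above can be sketched as a simple underwriting check: track modeled losses per region, and accept a new policy only if the regional aggregate stays within a stated risk appetite. The portfolio figures and appetite limits are hypothetical:

```python
# Hypothetical portfolio: region -> list of per-property average annual losses (USD)
portfolio = {
    "Tokyo": [185_000.0, 90_000.0],
    "Osaka": [40_000.0],
}

# Hypothetical risk appetite: maximum aggregated AAL the insurer
# is willing to carry in each region.
RISK_APPETITE = {"Tokyo": 300_000.0, "Osaka": 150_000.0}

def can_accept(book, region, new_aal, appetite):
    """Accept the new policy only if the region's aggregated average
    annual loss stays within the insurer's stated risk appetite."""
    current = sum(book.get(region, []))
    return current + new_aal <= appetite.get(region, 0.0)
```

A real implementation would aggregate correlated losses across perils and regions rather than simply summing them, but the decision structure – measure the marginal risk of new business against capacity – is the same.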

As they say – in God we trust; for everything else, there is data!


Gopikrishna Aravindan

Gopikrishna is a Senior Consultant with Mindtree's Enterprise Solutions Consulting Group. He previously worked in Deloitte's technology practice in Chicago, USA, and holds a master's degree in Information Systems from Carnegie Mellon University, USA.

  • Souri

    This problem can be resolved by linking web data and enterprise data.

    Provided the customer comes to the site online, enabling an enterprise search feature on the site means the user's exact location can be specified and the data fed into the CAT model.

    • I agree. This is an added advantage for emerging economies, which can leverage wide and collective mobile usage to offset the geocoding resolution challenges.

  • Kaarthik Hariharan

    Nice article. GIS mapping, geocoding and geofencing are the way forward. Historic data on a location would also aid in arriving at the risk factor involved in insuring an entity in locations prone to natural disasters.

    • Great point! Capturing and accounting for historic location data is a likely progression in the risk-modeling maturity ‘model’ :). It would be interesting to know how this data will be managed – I suspect there may be privacy concerns associated with a third party (like MSB) centrally storing such information. I also wonder whether enterprises today capture and store historical event information for their locations.
      Some car insurance companies offer their customers incentives for sharing information about their driving habits – perhaps an idea to borrow from the auto insurance world.

  • Kaarthik Hariharan

    Exactly. I think with the way technology is moving forward, it won't be long before the devices we use are programmed to automatically transmit data on usage patterns: 1) it would allow enterprises to know about their usage, and 2) it would help in arriving at a dynamic model. As you said, data is king! Exciting times ahead!