I.I.I.’s new California representative Janet Ruiz brings us this timely report from the insurance industry’s first philanthropic roundtable of the new year:

The first of three 2015 insurance industry philanthropic roundtables was held earlier this week at Farmers Insurance in Woodland Hills, CA, to discuss the landscape of philanthropy under the theme of disaster resilience.

Speakers at the meeting presented case studies of successes such as the partnership of Farmers Insurance with the Saint Bernard Project to rebuild Joplin, Missouri. The Insurance Information Institute (I.I.I.) discussed the role of catastrophe communications in getting important information out to media and consumers before, during, and after a catastrophe. Team Rubicon talked about their mission to bridge the gap for veterans and how they engage veterans, first responders and volunteers in rebuilding communities after a disaster.

The Insurance Industry Charitable Foundation (IICF) leads the philanthropic roundtables attended by member insurance companies involved in philanthropy and community giving. The foundation was born out of the passion of insurance professionals to make a positive community impact.

IICF’s Early Literacy Initiative, a partnership with Sesame Workshop, recently launched ‘Every Day is a Reading and Writing Day’ – working to give every American child the opportunity to read and write. As Melissa Duncan of the IICF Western Division says: “Early education makes true social progress.”

Bill Ross, CEO of the IICF, wrapped up the roundtable by reminding attendees of the industry’s impact: $1 billion annually in direct giving and sponsorships to charity.

It was a powerful session!

You can read more on the insurance industry’s contribution to community and charitable causes here.

Measures and methods widely used in the financial services industry to value and quantify risk could be used by organizations to better quantify cyber risks, according to a new framework and report unveiled at the World Economic Forum annual meeting.

The framework, called “cyber value-at-risk,” requires companies to understand key cyber risks and the dependencies between them. It will also help them establish how much of their value they could protect if they were victims of a data breach, and for how long their cyber protection can be assured.

The purpose of the cyber value-at-risk approach is to help organizations make better decisions about investments in cyber security, develop comprehensive risk management strategies and help stimulate the development of global risk transfer markets.

Among the key questions addressed by the cyber value-at-risk model concept are: How vulnerable are organizations to cyberthreats? How valuable are the key assets at stake? And who might be targeting them?

The proposed framework is part of a new report, Partnering for Cyber Resilience: Towards the Quantification of Cyber Threats, that was created in collaboration with Deloitte and the input of 50 leading organizations around the world.

As the report states:

The financial services industry has used sophisticated quantitative modeling for the past three decades and has a great deal of experience in achieving accurate and reliable risk quantification estimates. To quantify cyber resilience, stakeholders should learn from and adopt such approaches in order to increase awareness and reliability of cyber threat measurements.”

One potential option, it suggests, is to link corporate enterprise risk management models to perspectives and methods for valuing and quantifying “probability of loss” common to capital adequacy assessment exercises in the financial services industry, such as Solvency II and Basel III, albeit customized to recognize cyber resilience as a distinct phenomenon.
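To make the “probability of loss” idea concrete, here is a minimal sketch of a Monte Carlo cyber value-at-risk calculation in Python. The attack frequency, breach probability and loss-severity parameters are hypothetical placeholders, not figures from the WEF or Deloitte report, and a real model would also capture asset values, attacker profiles and the dependencies between risks that the framework calls for.

```python
import numpy as np

rng = np.random.default_rng(42)

# --- Hypothetical inputs (illustrative only, not from the WEF report) ---
attack_rate = 4.0          # expected attack attempts per year
breach_prob = 0.15         # chance an attempt becomes a damaging breach
loss_median = 2_000_000    # median loss per breach, in dollars
loss_sigma = 1.2           # spread of the lognormal severity distribution

n_years = 100_000          # number of simulated years

# Frequency: damaging breaches per simulated year
attempts = rng.poisson(attack_rate, n_years)
breaches = rng.binomial(attempts, breach_prob)

# Severity: lognormal loss per breach, summed within each year
annual_loss = np.array([
    rng.lognormal(np.log(loss_median), loss_sigma, size=k).sum() if k else 0.0
    for k in breaches
])

# Cyber value-at-risk: the annual loss exceeded in only 1 year in 20
var_95 = np.quantile(annual_loss, 0.95)
print(f"95% cyber value-at-risk: ${var_95:,.0f}")
```

Swapping in an organization’s own frequency and severity estimates, and correlating them across business units and vendors, is where the hard modeling work described in the report actually lies.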

The report points out that the goal is not to provide a single model for quantifying risk. Indeed, for cyber resilience assurance to be effective, it says, participants need to make a concerted effort to develop and validate a shared, standardized cyber threat quantification framework that incorporates diverse but overlapping approaches to modeling cyber risk:

A shared approach to modeling would increase confidence regarding organizational decisions to invest (for risk reduction), distribute, offload and/or retain cyber threat risks. Implicit is the notion that standardizing and quantifying such measures is a prerequisite for the desirable development and smooth operation of cyber risk transfer markets. Such developments require ERM frameworks to merge with insurance and financial valuation perspectives on cyber resilience metrics.”

 

As we look ahead to tonight’s State of the Union address, I.I.I. chief actuary Jim Lynch brings us a book review on the perennial issue of health insurance:

When Target wants to sell more shirts, it puts them on sale. The retailer knows that the less something costs, the more likely you are to buy it.

Health care is more complicated, in no small part because the customer is buying something he or she would rather not need. If your doctor halved the fee for open-heart surgery, for example, you wouldn’t submit to it twice.

For other procedures, the situation is murkier. Most people would submit to an extra blood stick to ensure they were disease-free, particularly if somebody else (read: the insurance company) paid the bill.

To an economist, the possibility that consumers run up a tab on health insurers is a moral hazard. Another moral hazard is the tendency of insured people to smoke and eat more, because someone else will pay for the resulting maladies. Both were important points in Moral Hazard in Health Insurance, a book culled from lectures at Columbia University in 2012. I reviewed the book in the latest issue of Contingencies, the magazine of the American Academy of Actuaries.

The main lecture, by respected MIT economist Amy Finkelstein, dissected a natural experiment that resulted from a funding shortage in Oregon. The state only had enough money to put 10,000 people on Medicaid, but it had far more people who qualified for the program.

The state held a lottery. Some people held the metaphorical winning tickets, and they got health insurance. The rest did not.

Though potentially tragic for the losers, the lottery created something social scientists prize: a randomized sample that let them study how the behaviors of the insured and the uninsured differ in the real world. They found that the insured did indeed consume more health care than the uninsured.
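For readers curious about the mechanics, here is a minimal sketch of the kind of comparison a randomized lottery makes possible: estimating the difference in average health care use between winners (insured) and losers (uninsured), with a confidence interval. The numbers below are simulated placeholders, not the actual Oregon study data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated placeholder data: annual doctor visits for lottery winners (insured)
# and lottery losers (uninsured). Illustrative values only, not the Oregon results.
insured = rng.poisson(lam=6.5, size=5000)
uninsured = rng.poisson(lam=5.5, size=5000)

# Because assignment was random, a simple difference in means estimates the
# causal effect of insurance coverage on utilization.
diff = insured.mean() - uninsured.mean()

# Standard error and 95% confidence interval for that difference
se = np.sqrt(insured.var(ddof=1) / insured.size + uninsured.var(ddof=1) / uninsured.size)
low, high = diff - 1.96 * se, diff + 1.96 * se

print(f"Extra visits per year attributable to insurance: {diff:.2f} "
      f"(95% CI {low:.2f} to {high:.2f})")
```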

This finding is important because it supports ideas long held in the insurance world that higher deductibles and other forms of cost sharing reduce losses by giving all participants “skin in the game.”

My review also noted that some medical professionals participate in their own variety of moral hazard.

To find out more about health insurance, check out this Facts and Stats item at the I.I.I. website.

How to balance the risks and rewards of emerging technologies is a key underlying theme of the just-released World Economic Forum (WEF) 2015 Global Risks Report.

The rapid pace of innovation in emerging technologies, from synthetic biology to artificial intelligence, has far-reaching societal, economic and ethical implications, the report says.

Developing regulatory environments that can adapt to safeguard the rapid development of these technologies and allow their benefits to be reaped, while also preventing their misuse and any unforeseen negative consequences, is a critical challenge for leaders.

John Drzik, president of Global Risk and Specialties at Marsh, says:

Innovation is critical to global prosperity, but also creates new risks. We must anticipate the issues that will arise from emerging technologies, and develop the safeguards and governance to prevent avoidable disasters.”

The growing complexity of new technologies, combined with a lack of scientific knowledge about their future evolution and often a lack of transparency, makes them harder for both individuals and regulatory bodies to understand.

But the current regulatory framework is insufficient, the WEF says. While regulations are comprehensive in some specific areas, they are weak or non-existent in others.

It gives the example of two kinds of self-flying aeroplane: the use of autopilot on commercial aeroplanes has long been tightly regulated, whereas no satisfactory national and international policies have yet been defined for the use of drones.

Even if the ramifications of technologies could be foreseen as they emerge, the trade-offs would still need to be considered. As the WEF says:

Would the large-scale use of fossil fuels for industrial development have proceeded had it been clear in advance that it would lift many out of poverty but introduce the legacy of climate change?”

Geopolitical and societal risks dominate the 2015 report. Interstate conflict with regional consequences is viewed as the number one global risk in terms of likelihood, with water crises ranking highest in terms of impact.

The report also provides analysis related to global risks for which respondents feel their own region is least prepared, as highlighted in this infographic:

[Infographic: global risks for which respondents feel their own region is least prepared]

The report was developed with the support of Marsh & McLennan Companies and Zurich Insurance Group and with the collaboration of its academic advisers: the Oxford Martin School (University of Oxford), the National University of Singapore, the Wharton Risk Management and Decision Processes Center (University of Pennsylvania), and the Advisory Board of the Global Risks 2015 report.

We’re reading that self-driving cars are no longer a thing of the future, but it’s in the subhead of this Time article – how long will it be before your car no longer needs you? – that the heart of the story lies.

Jason H. Harper writes of how he earned one of the first new driverless motor licenses – technically known as an “autonomous vehicle testing” permit – from the California DMV.

He then describes his chauffeured ride by a prototype Audi from Silicon Valley to Las Vegas for last week’s Consumer Electronics Show:

The car uses an array of sensors, radars and a front-facing camera to negotiate traffic. At this point, the system works only on the freeway and cannot handle construction zones or areas with poor lane markings. When the car reaches a construction zone or the end of a highway, a voice orders you to take the wheel back.”

Before taking the 550-mile road trip, Harper had to get special instruction on how not to drive, per California regulations:

The training included basics like turning the system on and off and learning the circumstances in which it could be used. The rest was about handling emergencies, such as making lane changes to avoid crashing.”

Harper says the training was far more difficult and involved than a regular driving test. However, average buyers will not need such training.

Why?

Because the rollout of this technology is gradual. Audi’s program, for example, would allow the car to self-drive in stop-and-go highway traffic, but when traffic clears the driver takes the wheel again.

It’s at the very end of the article that a voice from academia reminds us that this approach may be no bad thing as both technology and driver acceptance need time to mature.

Dr. Jeffrey Miller, an associate professor at the University of Southern California, tells Time that in his opinion licenses and drivers will never be obsolete because “the driver will always have to take over in case of a failure.”

It’s an interesting point. From the insurance perspective, too, while self-driving cars are definitely on the way, the implications for insurers are evolving. In its issue update Self-Driving Cars and Insurance, the I.I.I. notes:

Except that the number of crashes will be greatly reduced, the insurance aspects of this gradual transformation are at present unclear. However, as crash avoidance technology gradually becomes standard equipment, insurers will be able to better determine the extent to which these various components reduce the frequency and cost of accidents.”

And:

They will also be able to determine whether the accidents that do occur lead to a higher percentage of product liability claims, as claimants blame the manufacturer or suppliers for what went wrong rather than their own behavior.”

More on auto insurance here.

Hot off the press, the latest edition of the Insurance Information Institute’s flagship publication Insurance Fact Book is now available. I.I.I. chief actuary Jim Lynch reflects on this comprehensive resource:

It’s not important why, but the other day I needed to look up auto insurance written premiums for 1963.

My source: Insurance Facts 1964, an Insurance Information Institute publication that was the forerunner to the Insurance Fact Book, our one-stop property/casualty almanac whose 2015 edition went on sale this week.


I.I.I. has been printing some version of the Fact Book for more than 50 years, and we have earned a reputation for scrupulous accuracy.

This excursion showed me how well-deserved that reputation is.

Auto written premiums were $6.839 billion in 1963, according to Insurance Facts. I wanted to verify the number but was stumped for a minute – who else would have this bit of information?

First stop: the federal government. That’s the sort of minutiae that would fill up the Statistical Abstract of the United States, the Census Bureau’s collation of the nation’s vital signs. And it was there – but the government got the information from I.I.I. – that same Insurance Facts 1964. I shouldn’t have been surprised; we continue to provide information for the Statistical Abstract and similar works.

So I went back to where I.I.I. got the data all those years ago – A.M. Best’s Aggregates & Averages, another statistical compendium with a pedigree. (We get much of our data now from SNL Financial.)

In those days before the PC, Excel and Big Data, Aggregates & Averages was much simpler. For the most part, it resembled a bound computer printout, with most information divided among three types of insurers: stock companies, mutuals and reciprocals. To calculate an industry total, you had to pick out numbers from each section.

That’s what I did, 50 years after the fact. Sure enough, all Best’s parts added to $6.839 billion, just like our old Insurance Facts said it would.

I was reassured, but I shouldn’t have been surprised. I got to see firsthand how much double-checking and questioning every line of the 242-page book received. Our editing is scrupulous now, just as it was in 1964, when the first Mustang rolled onto the street and some band named the Beatles put out some records.

You can buy this year’s Fact Book at www.iii.org/store or by emailing publications@iii.org or calling (212) 346-5500.

The presence or absence of catastrophes is a defining factor in the financial state of the U.S. property/casualty insurance industry.

The 2014 Natural Catastrophe Year in Review webinar, hosted by Munich Re and the Insurance Information Institute (I.I.I.), showed just how defining the influence of catastrophes can be.

U.S. property/casualty insurers had their second-best year since the financial crisis in 2014 – 2013 was the best – according to estimates presented by I.I.I. president Dr. Robert Hartwig.

P/C industry net income after taxes (profits) is estimated at around $50 billion in 2014, down from 2013, when net income rose 82 percent to $63.8 billion on lower catastrophe losses and capital gains.

P/C profitability is subject to cyclicality and ordinary volatility, typically due to catastrophe activity, Hartwig noted.

In 2014, natural catastrophe losses in the United States totaled $15.3 billion, far below the 2000 to 2013 average annual loss of $29 billion, according to Carl Hedde, head of risk accumulation, Munich Re America.

Lower catastrophe losses helped p/c industry ROEs in 2013 and 2014, relative to 2011 and 2012, and helped the p/c industry finish 2014 in very strong financial shape, despite the impact of low interest rates on their investments, Dr. Hartwig noted.

[Chart: P/C industry ROE and the impact of major catastrophe events]

Overall industry capacity, as measured by policyholder surplus, is projected to have increased to $675 billion in 2014 – a record high.

The industry’s overall underwriting profit in 2014 is also estimated at $5.7 billion, on a combined ratio of 97.8.
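As a rough illustration of how those two figures relate: the combined ratio is essentially incurred losses plus underwriting expenses divided by premiums, and a value under 100 implies an underwriting profit. The sketch below uses assumed round numbers chosen only to land near a 97.8 ratio – they are not the industry’s actual 2014 statement values, and the statutory calculation (which divides expenses by written rather than earned premium and includes policyholder dividends) will not match this simplified version exactly.

```python
# Simplified underwriting arithmetic. All dollar figures are assumed round
# numbers for illustration, not the P/C industry's actual 2014 results.
net_premiums_earned = 500.0     # $ billions (assumed)
incurred_losses_lae = 345.0     # losses and loss adjustment expenses (assumed)
underwriting_expenses = 144.0   # acquisition and other underwriting expenses (assumed)

# Combined ratio: (losses + expenses) / premiums, expressed as a percentage
combined_ratio = (incurred_losses_lae + underwriting_expenses) / net_premiums_earned * 100

# Underwriting result: what's left of premium after losses and expenses
underwriting_result = net_premiums_earned - incurred_losses_lae - underwriting_expenses

print(f"Combined ratio: {combined_ratio:.1f}")                     # ~97.8, below 100
print(f"Underwriting result: ${underwriting_result:.1f} billion")  # an underwriting profit
```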

Underwriting results in 2014 and 2013 were helped by generally modest catastrophe losses, a welcome respite from 2012 and 2011 when the industry felt the effects of Hurricane Sandy and record tornado losses, Dr. Hartwig noted.

Matthew Sturdevant of the Hartford Courant has a good round-up of the other webinar presentations here.

With frigid temperatures and snow expected to fall around the New York City area and other parts of the United States this week, it’s a good time to review how winter storms can impact catastrophe losses.

For insurers, winter storms are historically very expensive and the third-largest cause of catastrophe losses, behind only hurricanes and tornadoes, according to the I.I.I.

Despite below-average catastrophe losses overall in 2014, insured losses from winter storms were significant. In fact, winter storms in the U.S. and Japan accounted for two of the costliest insured catastrophe losses of 2014.

According to preliminary estimates from sigma, extreme winter storms in the U.S. at the beginning of 2014 caused insured losses of $1.7 billion, above the previous 10 years’ average full-year winter storm loss of $1.1 billion.

And Aon Benfield’s 2015 Reinsurance Market Outlook notes that a multi-billion-dollar February winter weather insured loss event in Japan was one of the costliest ever for the country’s industry.

Sigma estimates the insured loss payout from that Japan winter storm at $2.5 billion and ranks it as the third costliest insured catastrophe loss of 2014.

What about the year prior?

Winter storms caused $1.9 billion in insured losses in 2013, up dramatically from $38 million in 2012, according to reports from Munich Re.

From 1994 to 2013 winter storms resulted in about $27 billion in insured catastrophe losses (in 2013 dollars), or more than $1 billion a year on average, according to Property Claim Services (PCS).

The good news is that NOAA’s U.S. Winter Outlook predicted early on that a repeat of last year’s winter of record cold and snow is unlikely.

In a release, NOAA’s Climate Prediction Center said:

Last year’s winter was exceptionally cold and snowy across most of the United States, east of the Rockies. A repeat of this extreme pattern is unlikely this year, although the Outlook does favor below-average temperatures in the south-central and southeastern states.”

As another year comes to an end, we thought it would be fun to take a look back at our most popular posts in 2014.

Our most-read posts here at Terms + Conditions ran the gamut from extreme weather to drones, Obamacare and cyber risk.

Perhaps not surprisingly, three of our top 10 posts during the year were on the topic of cyber risk and its impact on companies large and small.

In Latest Cyber Security Breach: 1.2B Passwords Stolen we reported on the largest known data breach to date, in which a Russian crime ring amassed billions of stolen Internet credentials, including 1.2 billion user name and password combinations and more than 500 million email addresses.

Our post Data Breaches Becoming More Damaging revealed that data breaches are now the greatest risk factor for identity fraud. In 2013, one in three consumers who received notification of a data breach became a victim of fraud, up from one in four in 2012, according to a report by Javelin Strategy & Research.

And in The Importance of Having a Cyber Liability Policy we highlighted that while companies hit by a data breach look to their insurance policies for coverage, recent legal developments indicate that reliance on traditional insurance policies is not enough.

In case you missed them the first time round, here’s a complete list of our top 10 posts:

1. NOAA: Extreme Cold and Snow Unlikely This Winter
2. Drones and Insurance
3. IRC: P/C Insurers Not Immune to Effects of Affordable Care Act
4. Cavalcade of Risk #209: Risk Assessment
5. Latest Cyber Security Breach: 1.2B Passwords Stolen
6. Poor Service, Not Price Drives Auto Insurance Customers to Shop
7. Data Breaches Becoming More Damaging
8. Sports, Concussion Risk and Liability
9. To Lie or Not To Lie
10. The Importance of Having a Cyber Liability Policy

Thanks for following and commenting. We wish all our readers a very happy new year!

December 26 marks the 10th anniversary of the Indonesian earthquake and tsunami, which killed more than a quarter of a million people in Indonesia, Thailand, Sri Lanka, India and other countries surrounding the Indian Ocean.

A decade later, it’s perhaps surprising to read that weaknesses remain in the tsunami warning system across the region.

Yet maybe the best protection for residents living in tsunami-vulnerable areas is to learn natural tsunami warning signals and which areas have the highest flood risk.

A gallery of tsunami protection lessons posted by Allianz cites three key signs from GeoHazards International’s Tsunami Preparedness Guidebook:

- Strong earthquake shaking, particularly shaking lasting longer than 30 seconds;
- Withdrawal of the sea to unusually low levels; and
- Loud roar from the ocean, similar to a jet airplane, explosion or sudden, intense rainfall.

Identifying evacuation routes — creating hazard and evacuation maps showing the quickest and safest routes to higher ground or other safe areas — is also a key recommendation. Allianz notes that it is critical to involve government and emergency responders when developing these maps.

Education and awareness among residents in tsunami-prone areas, then, can play as important a role as instrument-based tsunami warning systems.

In addition to the high mortality risk, earthquakes and tsunamis can cause significant insured property damage.

Insured losses from earthquakes and tsunamis amounted to just $45 million in 2013, far below the record $54 billion recorded in 2011, according to facts and statistics compiled by the I.I.I.

On March 11, 2011 a devastating tsunami hit the coast of northeast Japan, triggered by a powerful earthquake approximately 80 miles offshore. The quake and tsunami caused $35.7 billion in insured damages, according to Swiss Re.

Also, early in 2011, a powerful earthquake struck Christchurch, New Zealand, resulting in $15.3 billion in insured losses.

The Japan and New Zealand quakes are among the 10 costliest world earthquakes and tsunamis, based on insured damages, according to Munich Re.

[Chart: 10 costliest world earthquakes and tsunamis by insured losses]
