In this video, Sean Kevelighan, CEO of the Insurance Information Institute (Triple-I), talks about the Triple-I’s Resilience Hub that the organization began developing in 2019 in partnership with Aon and the Colorado State University Department of Atmospheric Science.
The Hub’s goal is to use data to help people visualize and understand the natural catastrophe risks they live with, as catastrophes become more severe and more people move into high-risk areas.
“We’re tracking hurricane paths all the way back to 1990 so that when we forecast with those relative years, people can better understand what the impact might be in today’s economy,” said Kevelighan.
The project also tracks public flood insurance take-up rates through the National Flood Insurance Program. The national average take-up rate for flood insurance is only 12 percent.
The Hub is part of the Triple-I’s overall insurance for resilience project, which aims to build a coalition that includes government agencies such as FEMA, private sector stakeholders such as Aon, and academic institutions such as the Wharton Risk Center to maximize impact. The Hub’s goal is to provide, in one location, easy-to-use content that empowers consumers to make data-driven decisions when it comes to managing their exposure to extreme weather events. “What we want to drive in the long run is behavioral change. We want people to think twice about where they are living and how they’re living so that they can be more resilient.”
It was like music to my ears to hear risk and resilience experts at Triple-I’s Joint Industry Forum, in a panel on extreme weather, talk so much about communication.
Moderator Charles Chamness, president and chief executive officer of the National Association of Mutual Insurance Companies (NAMIC), kicked off the session by asking Dr. Rick Knabb – on-air hurricane expert for the Weather Channel (TWC) – about the impact on disaster preparedness of tools like TWC’s “storm surge depth simulator,” which Chamness described as “somewhat terrifying.”
If you haven’t seen it, the simulator uses virtual reality technology to show viewers what different water depths could look like and the kind of damage they could generate (see video below).
“We’ve gotten a lot of feedback,” Knabb replied. “Some people tell us, ‘Wow, I didn’t know how bad water can be.’ Some people tell us, ‘You’re scaring me.’ And on some level, we’re trying to scare people just enough to respond and to prepare.”
Knabb added that he had no data to prove people who watch such simulations take immediate steps to improve their preparedness, “but we’re seeing the conversation change. Social media is one of the best ways I have to see that happening.”
The challenge remains, he said, to overcome “the positive bias” of people saying, “That looks really scary – but I don’t think it will ever happen to me.”
Francis Bouchard, Zurich’s group head of public affairs and sustainability, took the insurance industry to task for talking about risks in language customers don’t necessarily understand.
“We’re all risk elites here,” Bouchard said. “Our vernacular is not what normal people speak. And yet we insist on using our language to describe something that’s totally alien to most of the public.”
“At FEMA, we no longer speak in these technical terms like ‘a one in 100-year event,’” said FEMA’s Kaniewski – a phrase, he added, that “makes a homeowner who’s just purchased their home think they have 99 years before they have to worry.”
Prepare, Mitigate, Insure
“When we at FEMA talk about ‘resilience,’” Kaniewski said, “what do we mean? We mean preparedness. We mean mitigation. We mean insurance.”
Kaniewski cited evidence from FEMA’s annual household surveys indicating that people in disaster-prone states are “more risk aware and better prepared” than elsewhere in the nation.
“But it’s not enough,” he said. “They have to do so much more.”
Beyond physical preparedness, Kaniewski said, “we have to talk to people about being financially prepared. That means having cash on hand. That also means insurance. Insurance is the best resilience tool.”
“Demand flood insurance”
Knabb agreed, calling upon meteorologists around the world to “talk about insurance more.” He also called on insurance agents to discuss flood coverage with customers who aren’t in flood zones.
“If it can rain where you live,” he said, “it can flood where you live.”
He recounted buying a new home, asking his agent about flood insurance, and being told, “You don’t need it.”
“I told him, ‘Get it for me anyway,’” Knabb said. “And I’ve changed the graphics I use on The Weather Channel – from ‘Ask Your Agent If You Need Flood Insurance’ to ‘Demand Flood Insurance.’”
The panel discussion covered a range of topics, including insurers’ need to emphasize risk reduction and resilience and the “data fluency” of insurance regulators. You can watch the session below.
To distill the insights I collected would take far more than one blog post. Speakers, panelists, and attendees spanned the insurance “ecosystem” (a word that came up a lot!) – from CEOs, consultants, and data scientists to academics, actuaries, and even a regulator or two to keep things real. I’m sure the presentations and conversations I participated in will feed several posts in weeks to come.
Just getting started
Keynote speaker James Bramblet, Accenture’s North American insurance practice lead, “set the table” by discussing where the industry has been and where some of the greatest opportunities for success lie. He described an evolution from functional silos (data hiding in different formats and databases) through the emergence of function-specific platforms (more efficient, better organized silos) to today’s environment, characterized by “business intelligence and reporting overload.”
“Investment in big data is just getting started,” Jim said, adding that he expects the next wave of competitive advantage to be “at the intersection of customization and real time” – facilitating service delivery in the manner and with the speed customers have come to expect from other industries.
Jim pointed to several areas in which insurers are making progress and flagged one – workforce effectiveness – that he considers a “largely untapped” area of opportunity. Panelists and audience members seemed to agree that, while insurers are getting better at aggregating and analyzing vast amounts of data, their operations still look much as they always have: paper-based and labor-intensive. While technology and process improvement methodologies that could address this exist, several attendees said they found organizational culture to be the biggest obstacle, with one citing Peter Drucker’s observation that “culture eats strategy for breakfast.”
Lake or pond? Raw or cooked?
Paul Bailo, global head of digital strategy and innovation for Infosys Digital, threw some shade on big data and the currently popular idea of “data lakes” stocked with raw, unstructured data. Paul said he prefers “to fish in data ponds, where I have some idea what I can catch.”
Data lakes, he said, lack the context to deliver real business insights. Data ponds, by contrast, “contain critical data points that drive 80-90 percent of decisions.”
Stephen Mildenhall, assistant professor of risk management and insurance and director of insurance data analytics at the School of Risk Management, went as far as to say the term “raw data” is flawed.
“Deciding to collect a piece of data is part of a structuring process,” he said, adding that, to be useful, “all data should be thoroughly cooked.”
Practical advice was available in abundance for the 80-plus attendees, as was recognition of technical and regulatory challenges to implementation. James Regalbuto, deputy superintendent for insurance with the New York State Department of Financial Services, explained – thoroughly and with good humor – that regulators really aren’t out to stifle innovation. He provided several examples of privacy and bias concerns inherent in some solutions intended to streamline underwriting and other functions.
Perhaps the most broadly applicable advice came from Accenture’s Jim Bramblet, who cautioned against overthinking the features and attributes of the many solutions available to insurers.
“Pick your platform and go,” Jim said. “Create a runway for your business and ‘use case’ your way to greatness.”
It was a balmy 67-degree day in New York on March 15, which prompted the inevitable joke that since it was warm outside, climate change must be real. The wry comment was made by one of the speakers at the New York Academy of Sciences’ symposium “Science for decision making in a warmer world: 10 years of the NPCC.”
The NPCC is the New York City Panel on Climate Change, an independent body of scientists that advises the city on climate risks and resiliency. The symposium coincided with the release of the NPCC’s 2019 report, which found that in the New York City area extreme weather events are becoming more pronounced, high temperatures in summer are rising, and heavy downpours are increasing.
“The report tracks increasing risks for the city and region due to climate change,” says Cynthia Rosenzweig, co-chair of the NPCC and senior research scientist at Columbia University’s Earth Institute. “It continues to lay the science foundation for development of flexible adaptation pathways for changing climate conditions.”
“What you can’t measure, you can’t manage,” said Columbia University’s Klaus Jacob, paraphrasing Peter Drucker and making a concise case for the importance of the work the NPCC is doing.
The changes in temperature and precipitation that New Yorkers are experiencing are broadly tracking the climate change projections made by the NPCC in 2015. However, the 2019 report notes that such comparisons should be viewed with caution because of the role that natural variation plays in the short term.
William Solecki, co-chair of the NPCC, said, “Recent scientific advances have…helped the panel craft new sets of tools and methods, such as a prototype system for tracking these risks and the effectiveness of corresponding climate strategies.”
One such tool is the Antarctic Rapid Ice Melt Scenario, which the NPCC created to model the effects of melting ice sheets on sea level rise around NYC. The model predicts that under a high-end scenario, monthly tidal flooding will begin to affect many neighborhoods around Jamaica Bay by the 2050s and other coastal areas throughout the city by the 2080s.
The NPCC 2019 report recommends that the city establish a coordinated indicator and monitoring system to enable the city and its communities to better monitor climate change trends, impacts, vulnerability, and adaptation measures.
The report also notes the important role of insurance in support of climate change adaptation and mitigation. “Public–private partnerships are essential for facilitating infrastructure resilience, particularly for publicly owned infrastructure systems that often lack resources for resilience improvements. Coordination of insurance and finance is an important future direction to achieve comprehensive resiliency in infrastructure that reduces negative climate change consequences,” said the report.
The I.I.I.’s primer on climate change and insurance issues can be found here.