How often does a "one-in-a-hundred-year" storm occur? It's not a trick question, and it is being asked ever more often as the realities of climate change make themselves felt.
The meteorological models of the past cannot predict the extent, intensity and frequency of future extreme events, and this presents a problem. Both insurance and climate modelling are numbers games: you roll the dice and know the chances of any particular number coming up.
But when underwriting business assets in the 21st century, not only are the dice no longer square, they are probably weighted and possibly magnetised. And the dots have been obliterated. Fancy the odds now?
Impact of climate change
This conundrum is at the heart of the December 2016 ClimateWise report Investing for Resilience. ClimateWise, an initiative of insurers, brokers and service providers convened by the University of Cambridge Institute for Sustainability Leadership, estimates there is now a $100-billion annual "protection gap" in the insurance market as a result of climate change.
The frequency of weather-related catastrophes such as windstorms and floods has increased six-fold since the 1950s, it says. Put another way, 100-year events now have a repeat interval of around 17 years.
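That arithmetic is simple enough to sketch (illustrative figures only, not ClimateWise's own calculation): a T-year event has an annual exceedance probability of 1/T, so a six-fold rise in frequency cuts a 100-year return period to roughly 100 ÷ 6 ≈ 17 years, with a dramatic effect on the chance of seeing at least one such event over the life of an asset.

```python
# Illustrative return-period arithmetic, not ClimateWise's model:
# a T-year event has annual exceedance probability 1/T, and a k-fold
# rise in frequency shortens the return period to T/k.

def prob_at_least_one(return_period_years: float, horizon_years: int) -> float:
    """Probability of at least one exceedance over a planning horizon."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

baseline = 100.0          # the "one-in-a-hundred-year" storm
shifted = baseline / 6.0  # six-fold frequency increase: ~17 years

for label, period in [("historic", baseline), ("today", shifted)]:
    print(f"{label}: return period {period:.0f} yrs, "
          f"P(at least one event in 30 yrs) = "
          f"{prob_at_least_one(period, 30):.0%}")
```

Over a 30-year horizon, the chance of at least one "100-year" storm rises from roughly 26 per cent to around 84 per cent.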
This protection gap, between exposure to climate risk and insurance coverage, means the insurance sector must adapt or face the consequences, says Maurice Tulloch, chairman of Global General Insurance at Aviva and chair of ClimateWise. “The insurance industry’s role as society’s risk manager is under threat. Our sector will struggle if response is limited to avoiding, rather than managing, exposure to climate risk.”
ClimateWise advocates a more hands-on approach to organising resilience. An analogy might be the Thames Barrier: had it not been built, it would have made economic sense for the insurance industry to fund its construction, as the expense would be a small fraction of the pay-outs should London disappear beneath a storm surge.
Recommendations by ClimateWise are not quite so radical, calling for a realignment of asset management, risk management and underwriting to support greater resilience, and for the introduction of a resilience rating system.
Central to all this is the assessment of what the climate will actually be like in 20, 50 or 100 years. Just about the only thing that can be known with any confidence is that frequency analysis of past events is of less use than ever in predicting the future.
Innovative tech
A new industry is emerging to fill this information void. New sources of data are being tapped, new methods of interpretation developed and new techniques of prediction employed in a bid to get ahead of the curve on future climate risks.
Among the new sources is geospatial data, which has increased enormously as satellites and other Earth-monitoring technologies flourish. Financial software consultancy First Derivatives has recently signed a deal with Airbus to access historic and future satellite imagery to feed into its Kx number-crunching system, specifically to improve how climate-related risk is measured. The company says traditional relational databases are unable to cope with today's data explosion or the need for time-critical processing and analysis.
Environmental consultancy Ambiental specialises in flood modelling and forecasting, and with Landmark Information Group has developed a technology using probabilistic climate predictions to generate more realistic flood models. The FloodFutures project strengthens planning by improving understanding of how the risk profile changes over time. It includes the generation of future flood maps, predictions of erosion, and the risk to transport links and power supplies.
This “future-facing” data supports adaptation reporting requirements, says Ambiental chief executive Dr Justin Butler. “It represents a step-change in the way we view risk. The potential for improved planning, risk mitigation and adaptation is considerable,” he says.
As with any computer modelling, the quality of the output is only as good as the input and the techniques used to crunch the numbers. Both historic and contemporary data have appreciable value, and the insurance sector faces the challenge of gaining rapid access to top-quality information at a reasonable price. This is where the Oasis Loss Modelling Framework comes in.
Created as a not-for-profit company in 2011, Oasis aims to bring down the cost of modelling while improving the models themselves. Chief executive Dickie Whitaker says this enables decisions to be made at a granular level, taking into account actual developments at real-world locations. Oasis uses Monte Carlo simulation techniques to produce probability distributions of outcomes, including figures for financial loss.
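What that involves can be sketched in a few lines (a minimal illustration with made-up event frequencies and loss severities, not Oasis's actual models or API): simulate many possible years, draw a random number of catastrophe events in each, draw a loss for each event, then read figures such as the average annual loss and extreme quantiles off the resulting distribution.

```python
import math
import random
import statistics

# Minimal Monte Carlo loss sketch with hypothetical parameters,
# not Oasis's model: simulate many years of catastrophe losses
# and summarise the resulting distribution of annual financial loss.

EVENT_RATE = 0.35        # assumed average catastrophes per year
MU, SIGMA = 15.0, 1.2    # assumed lognormal loss severity (log scale)
N_YEARS = 100_000        # number of simulated years

def poisson(lam: float) -> int:
    """Draw an event count (Knuth's method, fine for small rates)."""
    threshold, count, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return count
        count += 1

annual_losses = sorted(
    sum(random.lognormvariate(MU, SIGMA) for _ in range(poisson(EVENT_RATE)))
    for _ in range(N_YEARS)
)

aal = statistics.mean(annual_losses)                 # average annual loss
loss_1_in_100 = annual_losses[int(0.99 * N_YEARS)]   # 99th percentile
print(f"average annual loss: {aal:,.0f}")
print(f"1-in-100-year annual loss: {loss_1_in_100:,.0f}")
```

Sorting the simulated years like this yields an exceedance-probability curve, the standard way to turn synthetic events into underwriting figures such as the 1-in-100-year loss.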
“I firmly believe that knowledge of risk is not something only the rich should have. I co-founded this open source loss modelling software to provide data in the format needed by the insurance industry,” says Mr Whitaker.
Oasis allows the effect of construction decisions on resilience to be modelled; a stronger roof may prove a better bet over the life of an asset and reduce insurance costs. Oasis is being used by the Potsdam Institute to build a new flood model of the Danube basin, generating a range of "synthetic events" to determine what can be done to reduce risk. Its value is also recognised with an endorsement from the G7-backed Global Innovation Lab for Climate Finance.
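The stronger-roof calculation reduces to a comparison of up-front cost against expected losses avoided, something like the following sketch (every figure here is hypothetical, and a real analysis would discount future losses and work from the full loss distribution rather than its mean):

```python
# Hypothetical mitigation trade-off: does a stronger roof pay for itself?
UPGRADE_COST = 40_000          # assumed one-off cost of the stronger roof
DAMAGE_REDUCTION = 0.4         # assumed fraction of storm losses avoided
EXPECTED_ANNUAL_LOSS = 5_000   # e.g. an AAL from a model like the one above
LIFETIME_YEARS = 30            # assumed remaining life of the asset

losses_avoided = DAMAGE_REDUCTION * EXPECTED_ANNUAL_LOSS * LIFETIME_YEARS
print(f"expected losses avoided: {losses_avoided:,.0f}")
print(f"upgrade cost:            {UPGRADE_COST:,}")
print("upgrade pays for itself" if losses_avoided > UPGRADE_COST
      else "upgrade does not pay for itself")
```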
Extreme weather events may still be known as acts of God, even though climate change is the result of acts of humans. God, after all, does not play dice.