The new science of wildfire risk
Insurers are adopting trailblazing solutions to predict the locations of large-scale infernos
As the frequency and severity of wildfire claims continue to climb, new advances in predictive analytics are helping P&C insurers understand and price this risk more accurately, competitively and profitably than ever before.
While these technologies can’t tell you when a wildfire will strike, they can tell you where one is likely to. For a sector pummelled by nearly $10bn in insured wildfire losses in just the past two years, higher visibility into wildfire risk couldn’t come at a more crucial moment.
Today, the total landmass susceptible to large-scale infernos continues to expand as wildfire season stretches across a larger portion of the calendar. According to a troubling new UN report, the risk of catastrophic wildfires is expected to increase by 50% in the coming decades as rising temperatures intensify a “global wildfire crisis”. And as we have seen this summer, Europe is no more immune than North America.
To counter this threat, insurers can add exclusions, increase rates or abandon perceived high-risk areas altogether. But the same regions most vulnerable to wildfire are also some of the fastest growing. Instead of giving up or pricing themselves out of the market, savvy insurers will leverage predictive analytics to their competitive advantage. To understand how, let’s look at the bigger picture.
Fighting a growing firestorm
Wildfires don’t just ravage remote locations anymore. They’re taking lives and shattering communities within a rapidly growing number of wildland-urban interfaces (WUIs) – regions where dense residential development borders wilderness areas with little to no fire clearance.
Today, there are nearly 50mn homes in 70,000 communities designated as WUIs at risk of wildfire, according to the US Fire Administration. As the UN study points out, this form of land use, combined with the effects of climate change, could lead to a 14% increase in properties exposed to wildfire risk by 2030.
For insurers, wildfire risk has been growing exponentially costlier. From 1964 through the 1990s, US insurers paid an average of $100mn a year in wildfire losses. Beginning in 2000, that figure rose to $600mn per year. But from 2011 to 2018, annual losses skyrocketed to nearly $4bn, with no signs of slowing down.
Dialling down wildfire risk
With climate change acting as an accelerant to wildfires, some insurers now offer only E&S lines – with many increasing rates by as much as 20%. Others restrict or exclude coverage for entire zip codes based on data pertaining to only a limited number of high-risk properties.
But insurers can only push prices up so far. And giving up isn’t exactly a growth strategy. As a result, access to modern sources of property-level hazard data has become critically important.
Top predictive analytics providers translate vast amounts of geospatial data from an array of high-quality sources into instant, accurate risk scores for any US property, delivered via API in seconds. The key is sourcing data that goes beyond nonpredictive elements such as slope and aspect: today’s most robust options also factor rainfall and vegetation burn points into risk scores.
Proximity to fuel load is also vital, as embers sent aloft can travel as far as 2.4km, endangering homes along the way. And because wildfires tend to cluster in the same places over time, acquiring data on previous burn perimeters is essential.
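To make the idea concrete, the factors above can be sketched as a toy scoring function. This is purely illustrative, not any provider’s actual model: the feature names, weights, and thresholds are assumptions, with only the 2.4km ember-travel distance taken from the figures cited here.

```python
from dataclasses import dataclass

EMBER_TRAVEL_KM = 2.4  # embers can travel roughly this far (figure cited above)

@dataclass
class PropertyFeatures:
    slope_deg: float           # terrain slope in degrees (fire spreads faster uphill)
    annual_rainfall_mm: float  # lower rainfall means drier fuels
    vegetation_density: float  # 0.0 (bare ground) to 1.0 (dense vegetation)
    fuel_load_km: float        # distance to nearest heavy fuel load
    burn_perimeter_km: float   # distance to nearest historical burn perimeter

def risk_score(p: PropertyFeatures) -> float:
    """Return a 0-100 wildfire risk score. Weights are illustrative only."""
    score = 0.0
    score += min(p.slope_deg / 45.0, 1.0) * 15                    # steeper terrain
    score += max(0.0, 1.0 - p.annual_rainfall_mm / 1000.0) * 20   # drier climate
    score += p.vegetation_density * 25                            # on-site fuel
    if p.fuel_load_km <= EMBER_TRAVEL_KM:                         # within ember range
        score += 25 * (1.0 - p.fuel_load_km / EMBER_TRAVEL_KM)
    score += max(0.0, 1.0 - p.burn_perimeter_km / 10.0) * 15      # fires cluster
    return round(score, 1)

# A hillside home in a dry area, near heavy fuels and a past burn scar:
print(risk_score(PropertyFeatures(30, 400, 0.8, 1.0, 2.0)))
```

In practice these scores come from far richer geospatial models, but the structure is the same: several hazard signals, each bounded and weighted, rolled up into a single per-property number an underwriter can act on.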
While data sources in other countries can sometimes lack the precision of their US counterparts, efforts are underway to make these tools available globally.
After all, as the frequency and severity of wildfire-related claims escalate worldwide, insurers and the communities they serve will need all the help they can get.