Chapter 7 – Reinsurance Event Attributed Carbon Tax (REACT)

Richard H. Clarke FIChemE, Director (Research) Predict Ability Ltd.

The basis for a real, global carbon price

Clear evidence of climate change in disaster loss data enables the insurance industry to effectively price the excess damage and provide a carbon price signal to businesses and governments worldwide.

Event Attribution

The motivation for having an insurance-led response to climate change came from Prof. Myles Allen’s 2011 inaugural lecture, in which he focused on the emerging science of probabilistic event attribution (PEA), described in Chapter 7.

Much progress has been made with PEA, and in a few years the science and the extraordinary computing power needed may be sufficient to provide near real-time values of x: the extent to which climate change damage can be attributed to man-made (anthropogenic) CO2 emissions.

If we focus on disasters and catastrophes, it soon becomes clear that there are many reasons for these major loss events.  Finding and predicting these reasons is Predict Ability Ltd’s raison d’être.  It is also evident that climate change is rarely the fundamental reason why weather-related loss events occur, i.e. x << 1.

The data Predict Ability Ltd relies on to make these assertions comes from historical and well-established databases: that of the reinsurance company Munich RE[1] and CRED EM-DAT[2], an initiative aimed at rationalising decision making for disaster preparedness.  All these databases have flaws, of course, because ultimately what constitutes a disaster is a matter of human judgement and circumstances.

By examining all the results of Predict Ability Ltd’s PALgamma model we find that, compared to the early 1960s, the total number of weather-related loss events worldwide now exceeds the expected number by around 12 per cent, or x ~0.12.  In the case of disasters and catastrophes, however, the figure is nearer 20 per cent (x ~0.2).

When averaged across a range of studies, this is consistent with the magnitude of attribution (11-19 per cent) predicted by PEA[3], as discussed in Chapter 7.  It could be a coincidence that the disaster excess observed by Predict Ability Ltd is similar but, as we shall see, it is almost certainly not.

If all other factors held steady, the IPCC expects the number of disasters to rise by about 3 per cent per decade, as the global temperature anomaly (global warming) continues to increase.  There is a link between temperature anomaly and disaster numbers.

Probabilistic event attribution science is focused on specific climate-related events, such as the Russian Federation (Moscow) heat wave of 2010[4] or the North of England (Storm Desmond) floods 2015[5].  Yet there is a pressing need for a global attribution methodology today – even if it is inevitably an approximate one.  As PEA progresses, it could – and it should – provide the detailed pieces of the jigsaw, enabling an even more exact picture of loss and damage to emerge[6] over time.

Global Attribution

Predict Ability Ltd has investigated the global attribution question.  The approach taken was to identify the year-by-year extent of disaster under-prediction in PALgamma.  Predict Ability Ltd’s model takes into account the factors needed to accurately predict the detailed form of the disaster numbers graph published annually by Munich RE[7].  This level of modelling was required to attain a 96 per cent correlation[8] (the p-value is 10⁻¹⁸)[9].  What has been achieved is shown below; the solid red line includes the effect of climate change, the dotted red line does not, i.e. it represents the ‘counterfactual’[10], the ‘what if there were no climate change’ model. (The fact that counterfactual thinking is a concept of psychology and risk is no accident – our core problem is being able to accept climate change.)

PALgamma and Munich Re disaster and catastrophe loss events

Comparison of Predict Ability Ltd’s PALgamma model (with and without climate change; dotted line) with Munich RE’s disaster and catastrophe data.  The disaster categories (meteorological etc.) are all-encompassing terms for storms, droughts, earthquakes and floods[11].

For reference, a linear regression of the raw data yields a 93 per cent correlation.  The PALgamma prediction, however, is completely independent of the previous history of disaster numbers and yet it is also more accurate. (PALgamma is based on a mass of richly textured, public domain data with huge potential for expansion and granularity enhancement.  If that is of interest, please visit the Predict Ability Ltd website here[12].)

A normalisation[13] of the Munich RE and PALgamma data provides a key insight.  The PALgamma algorithm does not, by itself, incorporate the effects of climate change.  Thus, if there is any climate effect on the number of disasters, it will be apparent in the normalised disasters and catastrophes ratio, NDC, that was calculated for each year i = 1980 to 2014:

NDC_i = ( disasters_i + catastrophes_i ) / PALgamma_i
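The normalisation is a simple per-year ratio. The following sketch shows the arithmetic with made-up yearly counts (these are placeholders, not Munich RE data):

```python
# Illustrative sketch of the NDC normalisation; the counts below are
# hypothetical placeholders, not Munich RE figures.
disasters    = [580, 610, 650]   # hypothetical yearly disaster counts
catastrophes = [40, 45, 50]      # hypothetical yearly catastrophe counts
palgamma     = [600, 620, 640]   # hypothetical PALgamma predictions

# NDC_i > 1 means more loss events occurred that year than the
# climate-free PALgamma model predicts.
ndc = [(d + c) / p for d, c, p in zip(disasters, catastrophes, palgamma)]
```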

NDC is plotted, below, as a blue line for the period 1980 to 2014 – the 35 years for which there is Munich RE data.  The slopes of the straight lines are about 5 per cent per decade, slightly higher than the IPCC’s estimate.

Correlation between disasters and PALca

The correlation between normalised disasters and the year-by-year warming trend.

Compared to the early 1980s, this chart shows there are about 15 per cent more disasters than can be explained by the underlying factors that drive the PALgamma model.  Although there is residual noise in the NDC ratio, there is a clear upward trend.  New data, not shown here, indicates that the trend has continued during the last few years.  If earthquakes and volcanoes (both geological events) are removed, the correlation improves further still.

The red line, PALca, is another ratio.  It represents the impact of temperature anomaly and thus climate change – more on PALca[14] shortly.  Through PALca, the red and blue trend lines were scaled to coincide in order to reveal something remarkable.  There is a significant underlying relationship between the normalised disasters (NDC) and the year-by-year warming trend (PALca).  Comparing the two lines “by eye” shows a definite correlation.  This is the “smoking gun” confirming that part of the increase in global disaster numbers is related to climate change.

For the statistically minded, there are a number of ways to test such a claim.  The p-value is 0.0052; p-values are used extensively in many sciences, and give the probability of seeing an apparent correlation this strong by chance if the null hypothesis (no real correlation) were true.  A p-value of 0.005 therefore gives very good confidence that the correlation is genuine[15].  Finally, if we doubly normalise the data, i.e. take the blue and red data trends and divide them by their linear trend lines, there is a significant level of correlation: the ‘coefficient of determination’[16] R² is about 0.13.  But if the Tanomaly data (the NOAA temperature anomaly data), and hence PALca, are shifted by ±1 year, any correlation completely disappears (R² ~ 0).  The link between the temperature fluctuations and disaster numbers is not random.
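The double-normalisation and ±1-year shift test can be sketched as follows. The two series here are synthetic stand-ins for the NDC and PALca data (a shared fluctuation plus independent noise, with assumed magnitudes); the real test uses the 1980-2014 series:

```python
import numpy as np

# Synthetic stand-ins for NDC (blue) and PALca (red): a common yearly
# fluctuation plus trend and independent noise. All magnitudes here are
# assumptions for illustration only.
rng = np.random.default_rng(0)
years = np.arange(1980, 2015)
common = rng.normal(0, 0.03, years.size)          # shared fluctuation
ndc   = 1.0 + 0.005 * (years - 1980) + common
palca = 1.0 + 0.004 * (years - 1980) + common + rng.normal(0, 0.03, years.size)

def detrend_ratio(series, years):
    """Divide a series by its own linear trend line (double normalisation)."""
    slope, intercept = np.polyfit(years, series, 1)
    return series / (slope * years + intercept)

def r_squared(a, b):
    """Coefficient of determination between two series."""
    return np.corrcoef(a, b)[0, 1] ** 2

a, b = detrend_ratio(ndc, years), detrend_ratio(palca, years)
print(r_squared(a, b))            # correlation at zero lag
print(r_squared(a[1:], b[:-1]))   # correlation after a one-year shift
```

Shifting one series by a year destroys the shared fluctuation, so the lagged R² collapses towards zero while the unlagged R² survives.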

PALca – Predict Ability Ltd (Lightning) Claims Algorithm

In Science magazine, one of the leading US research journals, there is a comprehensive overview of the evolving impact of climate change on the insurance industry by Evan Mills[17], a senior scientist at LBNL (Lawrence Berkeley National Laboratory).  Mills is a respected chronicler of the insurance industry’s action on climate change.  One graph stands out.  It shows data obtained by a long-established US insurance firm, the Hartford Steam Boiler Inspection and Insurance Co.  Along the y-axis was temperature (°F) and on the x-axis was the number of claims filed in several north-eastern US states for lightning strikes.  Several years’ data from the mid-1990s lay on the general trend, a logarithmic curve (inset).  Here, below, the data has been transposed and temperatures converted to °C.  The log-linear plot shows there is an exponential relationship with temperature.  It can be used to estimate the relative number of claims there will be as the temperature anomaly increases.

PALca and Lightning claims recorded by Hartford Boiler Insurance

Plot of lightning claims recorded by Hartford Boiler Insurance Co. (inset = original)

This is how the PALca algorithm works.  Suppose the global, pre-industrial temperature, T, was 16 °C – actually, the precise value does not matter.  According to the algorithm, there would have been 23.5 hypothetical claims.  If the world is now 0.8 °C warmer i.e. the globally averaged Tanomaly = 0.8 °C, there would be 27.1 claims.
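The exponential relation can be sketched as below. The growth rate is inferred here purely from the two quoted values (23.5 claims at 16 °C, 27.1 claims at 16.8 °C); it is an illustration of the shape of the algorithm, not the PALca fit itself:

```python
import math

T_PRE = 16.0                      # assumed pre-industrial temperature, degC
K = math.log(27.1 / 23.5) / 0.8   # per-degree exponential rate, inferred

def claims(temp_c):
    """Hypothetical claim count at a given global mean temperature."""
    return 23.5 * math.exp(K * (temp_c - T_PRE))

print(round(claims(16.0), 1))   # 23.5
print(round(claims(16.8), 1))   # 27.1
```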

Of course, for a simpler correlation a linear equation would suffice; but it might under-predict the effect on claims of an increasing temperature anomaly.

The 2013 IPCC climate change science report[18] defines Tanomaly, and data from the US National Oceanic and Atmospheric Administration (NOAA) provide us with an excellent and coherent source of yearly, monthly and daily (imagery) data[19].  NOAA’s ‘land and sea’ temperature has been found to be the most suitable.

In Chapter 3 the benefits and drawbacks of using temperature anomaly as a proxy for the effects of climate change were discussed.  Whatever the concerns, Tanomaly is recognised worldwide and the ‘2 °C target’ is one of the IPCC’s clear goals[20].

As explained in Chapter 7, we define the extent to which climate change damage can be attributed to man-made (anthropogenic) CO2 emissions using x, a term that can now be defined as follows:

x = PALca{ Tnow } / PALca{ Tpre-industrial } – 1

where

Tnow = Tanomaly (now) + Tpre-industrial and

Tpre-industrial = T before global warming began.

This is the average fraction of losses that is attributable to man-made (anthropogenic) warming (assuming that natural variations in temperature have been fully taken into account).  In the case of 0.8 °C of warming, x = 27.1/23.5 – 1 = 0.15.  By extension, 15 per cent of today’s weather-related losses need to be attributed to their cause: the producers of CO2 – all of them, from Newcomen[21] in 1712 right through to the Big Energy, Big Cement, Big Land and Big Everything companies of today.  Pragmatically, though, the 1960s make a good baseline.  But more of that later!
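The worked example above can be checked directly, using the hypothetical claim counts quoted in the text (23.5 pre-industrial, 27.1 at +0.8 °C):

```python
def attribution(claims_now, claims_pre):
    """x = PALca{Tnow} / PALca{Tpre-industrial} - 1."""
    return claims_now / claims_pre - 1

# Hypothetical claim counts from the text's 0.8 degC warming example.
x = attribution(27.1, 23.5)
print(round(x, 2))   # 0.15
```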

Returning briefly to the question of lightning, in 2014 a major study was published in Science magazine by David M. Romps[22] et al.  Their algorithm for lightning strike prediction links several mechanisms that are likely to be significantly affected by increasing Tanomaly.  Using a number of leading climate prediction models well suited to their methods, Romps et al. determined that the number of lightning strikes (in the continental US) will increase by 12 ± 5 per cent per °C.  The PALca prediction lies within those bounds.  Romps et al.’s results are illustrated below.

Mean of CAPE, precipitation, CAPE times precipitation, lightning flashes.

Romps et al.[23] ‘Projected increase in lightning strikes in the United States due to global warming’ (Science, November 2014) predicted flashes bottom left, actual flashes bottom right.

Disaster numbers, disaster dollar losses and the global total dollar losses

To determine the actual number of disasters, or the disaster dollar losses, or the global total of dollar losses, the PALgamma algorithm first has to be multiplied by the appropriate climate change global attribution factor PALca, thus:

Total number of events (disasters and catastrophes) = PALgamma × PALca

From the total number of events, Predict Ability Ltd has formulated methods to determine the disaster and catastrophe overall dollar losses.  They have been compared with data provided by Munich RE as follows.

Disaster and catastrophe overall dollar losses = k1 × (insured disaster and catastrophe losses) + k2

where k1 ~ 2.67, i.e. the ratio of (overall disaster and catastrophe dollar losses) / (insured disaster and catastrophe dollar losses), and k2 ~ $40 billion, as shown in the highly (87 per cent) correlated trends below.
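The linear relation above is trivial to apply; the sketch below uses the quoted constants, with an illustrative (not actual) insured-loss figure:

```python
K1 = 2.67    # ratio of overall to insured disaster+catastrophe losses
K2 = 40e9    # offset, dollars

def overall_losses(insured_losses):
    """Estimate overall disaster and catastrophe dollar losses from insured losses."""
    return K1 * insured_losses + K2

# e.g. $50 billion of insured losses (an illustrative figure):
print(overall_losses(50e9) / 1e9)   # 173.5 (billion dollars)
```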

Total and insured disaster and catastrophe losses (data courtesy: Munich Re)

Total and insured disaster and catastrophe losses (data courtesy: Munich RE).

Global, weather related, total dollar losses = k3 × (overall disaster and catastrophe dollar losses)

The constant k3 is derived as follows.  In the paper An Insurance-Led Response to Climate Change, submitted to the journal Climatic Change by Anthony Webster and Richard Clarke[24], the authors estimate that the insurance industry has a premium volume of around $4 trillion per annum (5.63 per cent of global GDP) and that ⅓ of that is connected with weather-related losses.  In 2014 global GDP was $77.3 trillion.  From this we can estimate the loss ratio k3:

k3 ~ ( ⅓ × 5.63 per cent × $77.3 trillion × k1) / ( overall disaster and catastrophe losses – k2)

thus k3 ~ 27 for 2014.

With k3 in place, we can calculate the weather-related, global total dollar losses.

Reinsurance Event Attributed Carbon Taxation (REACT)

Now we are close to determining the price of carbon.  There is one more term required.  From Chapter 7, we have the basic carbon pricing formula

y = attributable losses / carbon emissions

= 1/C × ∑ ( L_i × x_i )

where

y = carbon price ($/tonne CO2), C = global carbon emissions (tonnes/year), L_i = the global, weather-related, total loss for event i, x_i = the per-event attribution ratio, and the subscript i applies to each loss event.

If the global attribution method discussed in this chapter is applied then y can be approximated by

y = L × x / C

In 2014, L ~ ( ⅓ × 5.63 per cent × $77.3 trillion × k1 ) ~ $3.9 trillion.  If x ~ 0.15 and C ~ 37 billion tonnes CO2 then y ~ $15.70 per tonne CO2.  That is a credible carbon price and, if imposed broadly, would impact consumer prices by less than 0.5 per cent, as a new LSE study indicates[25].
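The 2014 estimate can be reproduced step by step from the figures quoted above:

```python
# Reproducing the 2014 carbon-price estimate from the quoted figures.
premium_share = 0.0563   # insurance premiums as a share of global GDP
gdp           = 77.3e12  # 2014 global GDP, dollars
k1            = 2.67     # overall/insured loss ratio from earlier
x             = 0.15     # global attribution fraction
emissions     = 37e9     # global CO2 emissions, tonnes per year

L = (1 / 3) * premium_share * gdp * k1   # weather-related total losses, ~$3.9 trillion
y = L * x / emissions                    # carbon price, dollars per tonne CO2
print(f"{y:.2f}")   # 15.70
```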

Predict Ability Ltd’s methodology creates a much higher resolution picture of both risk and pricing.  This we term ‘fine gamma’.  The resulting carbon price is shown on Predict Ability Ltd’s website and is updated as new insight and information becomes available.

Our carbon legacy must be paid for

So there it is, finally, the price of carbon.  It is meaningful and realistic and, if need be, an instantaneous (tradable?) carbon price can be calculated.  Alternatively, an integrated carbon price can readily be generated to capture both the cumulative nature of emissions and the damage they cause.  This is shown in the diagram below.  The variation in prices is partly related to variations in Tanomaly, but PALgamma values also change from year to year.

Predict Ability Ltd.’s historic, current and future carbon price projections

Predict Ability Ltd’s historic, current and future carbon price projections.

In essence, Predict Ability Ltd’s carbon pricing system captures a fundamental aspect of mankind’s interaction with Nature and our growing civilisations within it.  Two major questions arise:

  • Until recently there has been no attempt to put a price on carbon – on CO2 and other GHG emissions – although belatedly a number of ‘cap and trade’ carbon markets have evolved[26].  But because we, collectively, have failed to address what those in authority are alleged to have known[27] since the 1980s – that CO2 emissions cause climate change – there is now a gigantic debt.  That will be our carbon legacy for future generations.  Should we – could we – pay down that debt?  Or will it be written off?
  • Let us make no mistake, carbon fuels and hydrocarbons have enabled a huge human transformation in the late 20th and early 21st Centuries.  That is our carbon inheritance[28].  But putting a price on carbon without considering the long term is unwise.  If carbon markets proceed as they are today, then a really good idea for tackling climate change is in danger of being squandered.  That is what Chapter 10 (Carbon Intensity Weighting) seeks to address.  We must tax energy of all kinds in a fair and progressive way, not just the CO2 emissions.  Why?  Because the CO2 we have emitted (and will yet emit) will persist in the atmosphere for a very long time until the biosphere or the oceans and rocks can absorb it.

The persistence of CO2 in the atmosphere

How long will anthropogenic (man-made) CO2 remain in the atmosphere?  It is an easy question to ask.  There is no simple answer, although ‘forever’ is a good and somewhat terrifying approximation.  However, it partly depends on how fast we emit the CO2 and at what concentration we stop emitting it.  “Equilibration among the various carbon reservoirs of the atmosphere, the ocean and the terrestrial biosphere occurs on timescales of a few centuries” say David Archer et al. at the University of Chicago[29].

There is considerable variation amongst the mathematical models available.  Archer et al. note that “a sizeable fraction [20-35 per cent] of the CO2 remains in the atmosphere, awaiting a return to the solid earth by much slower weathering processes and deposition of CaCO3 [calcium carbonate[30]]”.  Or, to put it another way, a recent paper in Nature[31] suggests that mankind has effectively cancelled the next one and maybe two ice ages.  Our carbon footprint already extends hundreds of thousands of years ahead of us!

Simulation of the effect of a 1 trillion tonne pulse of CO2 (which in effect is pretty much what we are doing today) yields a form of decay that illustrates the two mechanisms Archer et al. describe.  The predictions are shown below.

Simulating our 1 trillion tonne pulse of CO2 into the atmosphere

Simulating our 1 trillion tonne pulse of CO2 into the atmosphere (Archer et al, 2009).

These decay curves were approximated by Predict Ability Ltd to produce a function of CO2 with time, following the sudden (artificial) cessation of emissions.  The predicted CO2 concentrations were then converted into estimated temperature anomalies[32] to see how long the loss and damage caused by the emissions would last.  The answer, in practical terms, is for thousands of years.  Possibly one third to one half of the CO2 that was produced by Newcomen’s engine in 1712 is still up there somewhere in the atmosphere.
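A minimal sketch of such an approximation is given below: a fast ocean/biosphere uptake over a few centuries plus a long-lived airborne fraction awaiting slow weathering. The 25 per cent long-lived fraction and the 300-year timescale are assumptions chosen to sit within the 20-35 per cent range quoted above, not fitted values from the Archer et al. curves:

```python
import math

LONG_LIVED_FRACTION = 0.25     # assumed fraction awaiting slow weathering
FAST_TIMESCALE_YEARS = 300.0   # assumed ocean/biosphere equilibration timescale

def airborne_fraction(years_after_pulse):
    """Fraction of a CO2 pulse still airborne after a given number of years."""
    fast = (1 - LONG_LIVED_FRACTION) * math.exp(-years_after_pulse / FAST_TIMESCALE_YEARS)
    return LONG_LIVED_FRACTION + fast

print(round(airborne_fraction(0), 2))      # 1.0
print(round(airborne_fraction(1000), 2))   # 0.28
```

Even a thousand years on, this toy function leaves over a quarter of the pulse airborne, which is the qualitative point of the Archer et al. simulations.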

REACT in a nutshell

The main ideas introduced in this chapter were gathered together into a spreadsheet model called ‘REACT in a nutshell’.  It shows the impact of temperature anomaly on losses for (a) the total losses (PALca_body) and (b) the disaster and catastrophe losses (PALca_tail).  The concept of ‘body’ and ‘tail’ was introduced in Chapter 3.  Disasters and catastrophes are governed by the statistics of the probability curve’s tail, whereas the main body of the curve dictates the likely influence of climate change on the total losses and hence carbon prices.

Losses continue for centuries after the abrupt emissions stop.  Moreover, the painfully slow rate of fall of temperature anomaly means that the expected growth in PALgamma becomes the dominant term.  Losses are expected to continue to increase through at least the first half of the 21st Century.  The summary page from the spreadsheet is shown below.  It illustrates a 2 °C global warming scenario.

Loss and damage pathways for two classes of loss: total losses and disasters and catastrophes (dis+cat) for a 2 °C global warming target.

Loss and damage pathways for two classes of loss: total losses and disasters and catastrophes (dis+cat) for a 2 °C global warming target.  Even 1000 years after emissions stop, much of the emitted CO2 remains in the atmosphere causing loss and damage.

We now have the basis for a realistic carbon price.  It is directly linked to loss and damage and that is increasing all the time.  Even after the last gigatonne of CO2 is emitted, whether there is an abrupt cessation of emissions or not, the CO2 lingers in the atmosphere for a very long time.  This means there must be a mechanism for fairly transferring the costs of these emissions to current and future generations, even if the damage already done is written off, as seems to be inevitable.  But we cannot go on ignoring this damage.  If we do, then as Chapter 4 suggests, we may be ‘written off’ by Nature!

[13] Normalisation involves creating a set of ratios of raw data divided by what the model predicts. In a perfect model all members of the set would have a value of 1.

[14] PALca = Predict Ability Ltd’s (Lightning) Claims Algorithm; it uses insurance claims data from the Hartford Steam Boiler Inspection and Insurance Co.

[26] Cap and trade schemes have one specific job: to reduce carbon emissions; they are not designed to put a price on carbon.

[32] CO2 concentration (ppm) ~ 316 × exp( Tanomaly / 3.61 )

Copyright © 2015 Predict Ability Limited. All rights reserved.
