When the IPCC outlined its so-called risk management framework in section 2.3 of AR5 WG3, Chapter 2, it drew a distinction between a descriptive analysis of decision making and a normative analysis. The former, the subject of Part 3 of this series of articles, makes a great deal of the shortcomings of intuitive thinking when applied to the climate change problem. In contrast, a more deliberative approach was advocated, drawing upon an array of tools used to assist decision making under risk and uncertainty. These are the supposedly normative techniques, and they are the subject of section 2.5 of AR5 WG3, Chapter 2. However, upon reading section 2.5, it becomes apparent that the IPCC's division between flawed intuition, on the one hand, and deliberation as a normative standard, on the other, is a misleading one. In practice, most of the intuitive errors can be made within the context of employing one or more of the deliberative and supposedly normative tools.

After reading section 2.5 of AR5 WG3, Chapter 2, I have to admit that I had difficulty determining the motive behind its writing. It comes across as a student dissertation, and not a terribly good one at that (remember, it was written by the same people who, in section 2.4, chose not to distinguish between risk aversion and uncertainty aversion). Far from a presentation covering cutting-edge developments in decision analysis, it is a rambling, flawed and incomplete account that could have been compiled by anyone with access to Wikipedia. There is nothing particularly groundbreaking in it and there is plenty that is plain wrong. If there is a discernible purpose to the account, then it seems to be that it introduces each of the deliberative methods that are available, before highlighting a susceptibility to intuitive reasoning (implying limited application to the climate change problem). It then finally settles upon the IPCC's deliberative tool of choice, i.e. the precautionary principle. Add to this an appreciative account of climate model ensembles and the virtues of structured expert judgement, and you have a reasonably accurate description of the IPCC's conception of normative decision making.

Be that as it may, whilst the IPCC may deem the precautionary approach to be the most appropriate for the climate change problem, it is actually the most intuitive and least deliberative of the normative approaches on offer in section 2.5. It also just so happens to be the one that comes closest to vindicating the declaration of a climate change emergency, so one cannot be too surprised to see it promoted as normative.

Confusion Reigns

One of the main difficulties and confusions to be found in section 2.5 stems from its incorrect use of terminology. The section itself is titled 'Tools and Decision Aids for Analysing Uncertainty and Risk'. Nevertheless, a more succinct title might have been 'Decision Analysis', since this is the term universally recognised as referring to the application of formal and systematic analysis in support of decision making under uncertainty and risk. As such, decision analysis covers a wide range of analytical methods, embracing, for example, expected utility (E(U)) theory, decision tree analysis, multi-criteria decision analysis (MCDA), cost-benefit analysis (CBA) and cost-effectiveness analysis (CEA). The IPCC, however, despite covering most of the above, chooses not to use the term 'decision analysis' as the collective term, preferring instead to use it in reference to yet another analytical technique which, when detailed, appears to be indistinguishable from the previously described E(U) theory! At the same time, fundamentally important techniques such as decision trees and influence diagrams do not get a mention. For a group of experts trying to give the impression that they know all there is to know about the application of deliberative decision-making techniques, they don't do a particularly good job.

Deliberation Under Uncertainty

It is difficult to escape the conclusion that this is a subject that the authors of Chapter 2 have read about, and perhaps even studied as academics, but they have never earned a living from implementing it. Even so, credit should be given for the fact that a number of the widely recognised limitations of the various methods of decision analysis are properly identified by the IPCC document. For example:

“At the same time, the limitations of E(U) must be clearly understood, as the procedures for determining an optimal choice do not capture the full range of information about outcomes and their risks and uncertainties.”  
“In the standard E(U) model, each individual has his / her own subjective probability estimates. When there is uncertainty on the scientific evidence, experts’ probability estimates may diverge from each other, sometimes significantly.”  
“For example, the uncertainty surrounding the potential impacts of climate change, including possible irreversible and catastrophic effects on ecosystems, and their asymmetric distribution around the planet, suggests CBA may be inappropriate for assessing optimal responses to climate change in these circumstances.”  
“A strong and recurrent argument against CBA (Azar and Lindgren, 2003; Tol, 2003; Weitzman, 2009, 2011) relates to its failure in dealing with infinite (negative) expected utilities arising from low-probability catastrophic events often referred to as ‘fat tails’. In these situations, CBA is unable to produce meaningful results, and thus more robust techniques are required.”  

These problems all basically boil down to the fact that any method that evaluates risk in order to identify the correct course of action cannot do so if there is insufficient information to reliably determine the probabilities involved. Furthermore, such decisions are often made with uncertainty in mind rather than risk. This is particularly problematic when dealing with low probability, high-consequence events, in which the avoidance of the worst case scenario imagined becomes the prime objective. But what does the IPCC have in mind when it says that 'more robust techniques are required'?
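The fat-tail objection can be made concrete with a toy simulation. The sketch below is purely illustrative (the Pareto damage distribution and its parameters are my own stand-ins, not anything taken from AR5 or Weitzman's papers): when the shape parameter drops below one, the distribution's mean is infinite, and the running average of simulated damages never settles down, which is precisely why an expected-utility or cost-benefit calculation has nothing stable to latch onto.

```python
import random

random.seed(42)

def pareto_damage(alpha, xm=1.0):
    """Draw one damage value from a Pareto distribution.
    With alpha <= 1 the mean is infinite: the 'fat tail' said
    to defeat cost-benefit analysis."""
    u = random.random()
    return xm / (u ** (1.0 / alpha))

def running_means(alpha, n):
    """Track the running average of n simulated damages."""
    total, means = 0.0, []
    for i in range(1, n + 1):
        total += pareto_damage(alpha)
        means.append(total / i)
    return means

thin = running_means(alpha=3.0, n=20000)  # finite mean (= 1.5): converges
fat = running_means(alpha=0.9, n=20000)   # infinite mean: never settles

# The thin-tailed running mean stabilises; the fat-tailed one is
# repeatedly jolted upward by rare, enormous draws.
print(thin[-1], fat[-1])
```

The thin-tailed case converges towards its theoretical mean of 1.5; the fat-tailed case does not converge towards anything, no matter how many samples are taken.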

The Precautionary Principle to the Rescue

Having recognised that decision analysis depends upon relatively complete information in order that risky alternatives may be evaluated, a climate science friendly solution to the problem is then offered by the IPCC:

“The precautionary principle allows policymakers to ban products or substances in situations where there is the possibility of their causing harm and / or where extensive scientific knowledge on their risks is lacking.”  

A lot has already been said about the precautionary principle and I do not wish to add too much here. The IPCC clearly felt the same way, and so they deemed it sufficient in section 2.5 to simply point out that:

“An influential statement of the precautionary principle with respect to climate change is principle 15 of the 1992 Rio Declaration on Environment and Development: “where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation.”  

After that bold declaration, all that remained in section 2.5 was to indicate how the precautionary principle could be seen as the ‘more robust technique’ called for in order to address the limitations of CBA, etc. The IPCC therefore makes the following connection:

“Robust decision making (RDM) is a particular set of methods developed over the last decade to address the precautionary principle in a systematic manner.”  

‘Robust’ is such a lovely word, and who wouldn’t want it to apply to their favourite approach? Unfortunately, there is nothing robust about the precautionary principle, and RDM was most certainly not developed for its benefit; the ‘R’ in RDM does not stand for ‘precautionary’. In fact, the RDM approach is precautionary in only one sense: it enables the decision maker to construct a strategy that keeps options open for as long as possible, thereby minimising the regret function. Where standard decision analytics seek to optimise between a number of options, the RDM approach turns this on its head and seeks to satisfice. Nevertheless, RDM is still an approach in which alternatives are considered and compared, albeit in the context of deep uncertainty. The precautionary principle, on the other hand, is the focusing effect writ large.
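The regret-minimising logic at the heart of RDM is easy to demonstrate. The sketch below uses entirely hypothetical policy names and cost figures of my own invention, simply to show the mechanics: under deep uncertainty there are no probabilities to weight the futures, so rather than optimising an expected value, one picks the option whose worst-case regret (its shortfall against the best available option in each future) is smallest.

```python
# Cost table: rows = policy options, columns = plausible futures.
# Deep uncertainty: no probabilities are attached to the futures.
# All names and numbers are hypothetical, for illustration only.
costs = {
    "do_nothing": [0, 10, 100],
    "hedge":      [5, 12, 30],
    "crash_prog": [40, 40, 40],
}

def minimax_regret(costs):
    """Return the option with the smallest worst-case regret,
    i.e. the satisficing choice rather than the optimising one."""
    futures = range(len(next(iter(costs.values()))))
    best_per_future = [min(c[j] for c in costs.values()) for j in futures]
    regret = {
        name: max(c[j] - best_per_future[j] for j in futures)
        for name, c in costs.items()
    }
    return min(regret, key=regret.get), regret

choice, regret = minimax_regret(costs)
print(choice, regret)
```

Note that the chosen option is never the cheapest in any single future; it is simply the one that cannot be regretted too badly whichever future turns up. That is the sense in which RDM keeps options open, and it is a very different logic from the precautionary principle's fixation on the single worst outcome.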

RDM was not developed to ‘address the precautionary principle in a systematic manner’ – indeed, the RDM paper cited by the IPCC fails to mention the precautionary principle even once. Instead, RDM approaches decision analysis in a manner that addresses deep uncertainty, thereby providing a plausible alternative to the precautionary principle. What RDM says is that you don’t have to resort to the precautionary principle when confronted with deep uncertainty, and that if you do it may lead to regret. So unless by ‘to address the precautionary principle in a systematic manner’ the IPCC means ‘to highlight the shortcomings of the precautionary principle’, I have to say that, once again, it has got its basic facts wrong.

From Decision Analysis to Uncertainty Analysis

Having somewhat mauled the subject of decision analysis, section 2.5 of AR5 WG3, Chapter 2 then turns its attention to the subject of uncertainty analysis. Once again, I am left wondering what the IPCC’s motives for doing so were, other than to take the opportunity to extol the virtues of structured expert judgement and climate model ensembles. Uncertainty is a difficult concept to tackle and a comprehensive treatment of the philosophical difficulties involved would have seemed appropriate. But this appears to have been either beyond the authors’ ability or outside their area of interest. Instead, the IPCC contents itself with the assertion that uncertainty is not the sceptics’ friend before leaving it at that. In fact, according to Lewandowsky, uncertainty is ‘actionable knowledge’. So anything that establishes a high level of uncertainty, whilst doing nothing about it, has to be viewed approvingly by the climate activist.

Uncertainty as Opinion

Structured expert judgment has been around for some time (it originated in the nuclear power industry) and, as the IPCC enthuses, it is gaining employment in many sectors:

“As attested by a number of governmental guidelines, structured expert judgment is increasingly accepted as quality science that is applicable when other methods are unavailable (U. S.  Environmental Protection Agency, 2005).”  

The IPCC then eagerly draws attention to how such ‘quality science’ has found application in climate science:

“Structured expert judgments of climate scientists were recently used to quantify uncertainty in the ice sheet contribution to sea level rise, revealing that experts’ uncertainty regarding the 2100 contribution to sea level rise from ice sheets increased between 2010 and 2012 (Bamber and Aspinall, 2013).”  

This is all looking very good for the climate activist and very bad for the sceptic. Basically, the more experts are thrown at a problem, the worse things seem. But therein lies the key problem with structured expert judgment. Whilst it may be better to place more credence in those experts who have a better track record of estimating uncertainty (and that, when all is said and done, is all there is to it), one cannot escape the fact that structured expert judgment is just the science of consensus measurement. As such, it is more about modelling opinion than it is about modelling a physical system. It isn’t aleatory uncertainty that is being measured but the epistemic uncertainty regarding aleatory uncertainty. This is worth keeping in mind before one gets too excited about contributions to sea level rise.
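To see why structured expert judgment amounts to modelling opinion, consider the mechanics in miniature. The sketch below is a drastic simplification of the performance-weighting idea behind Cooke's classical model (the expert distributions and weights are invented for illustration): each expert supplies a probability distribution, the weights reflect their track record on past calibration questions, and the output is nothing more than a weighted blend of opinions.

```python
def performance_weighted_pool(distributions, weights):
    """Linear opinion pool: combine experts' discrete probability
    distributions using normalised performance weights."""
    total = sum(weights)
    norm = [w / total for w in weights]
    n_bins = len(distributions[0])
    return [
        sum(norm[e] * distributions[e][i] for e in range(len(distributions)))
        for i in range(n_bins)
    ]

# Hypothetical: two experts assign probabilities to three outcome bins;
# expert A has the better calibration record, so gets triple the weight.
expert_a = [0.2, 0.5, 0.3]
expert_b = [0.6, 0.3, 0.1]
pooled = performance_weighted_pool([expert_a, expert_b], weights=[3.0, 1.0])
print(pooled)
```

The pooled result is still a valid probability distribution, but note what it measures: the weighted spread of expert belief, not any physical property of ice sheets or sea levels. Widen the experts' disagreement and the pooled uncertainty widens with it.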

Scenarios and Climate Model Ensembles

The IPCC concludes its survey of normative decision making by outlining the theory behind Representative Concentration Pathways (RCPs) and climate model ensembles. This is pretty standard stuff and so I won’t waste any time going into the detail. However, it would be remiss of me not to point out that this is another of the areas where credit is actually due to the authors. It would have been easy for them to have presented the uncertainty analysis afforded by such scenarios and ensembles as being straightforward and conclusive, but they don’t. Despite having failed to explain the limitations of probabilistic approaches, despite having failed to mention alternatives such as possibility theory, despite having failed to clarify the importance of distinguishing between epistemic and aleatory components of uncertainty before attempting its propagation, the IPCC is still commendably candid when it comes to the limitations of the scenario and ensemble approach. In its own words:

“On the downside, it is easy to read more into these analyses than is justified. Analysts often forget that scenarios are illustrative possible futures along a continuum. They tend to use one of those scenarios in a deterministic fashion without recognizing that they have a low probability of occurrence and are only one of many possible outcomes. The use of probabilistic language in describing the swaths of scenarios (such as standard deviations in Figure 2.4) may also encourage the misunderstandings that these represent science-based ranges of confidence.”  

Actually, I don’t think I could have put it any better myself. And this limitation is such a shame, seeing as how important a probabilistic interpretation of ensemble output is to extreme weather event attribution. Epistemic uncertainty is key here, and one should never forget the extent to which the validity of extreme weather attribution depends upon the uncertainties contained in the models upon which it is based.

And Finally, the Elephant not in the Room

In surveying the tools and methods available to the deliberative decision maker, the IPCC is guilty of a number of significant omissions, some of which I have already mentioned. It would be unfair, however, to have expected everything to be covered. Even so, there is one glaring omission that is particularly noteworthy. Given how much was made in section 2.4, regarding the intuitive thinker’s flawed evaluation of weather events, one would have thought that the IPCC would take particular care to mention that a deliberative approach to the evaluation of extreme weather risk had already been developed in the guise of Detection and Attribution (D&A). It’s not as if AR5 fails to cover D&A, since a whole chapter is devoted to the subject (Chapter 10). Section 10.6 of Chapter 10, in particular, describes how D&A techniques have been used to address extreme weather event attribution. Nevertheless, when it mattered, those AR5 authors tasked with the discussion of risk perception and how it should be managed failed to make that crucial connection in their own section.

Could it be that back in 2014, when AR5 was written, the IPCC was still more concerned with the precautionary principle and the promotion of the plausibility of future disaster scenarios? Was it still more concerned with establishing the collective credibility of the experts that formulated them? Is this an indication that the value of using D&A to create a narrative of emergency based upon present-day impact had not yet taken root? Indeed, had the IPCC produced the manifesto for establishing a climate emergency policy without even fully realising it at the time?

To explore that intriguing possibility one has to turn one’s attention to the final element of the IPCC’s risk management framework, in which the impact of uncertainty on the formulation of climate change policy is discussed. Accordingly, I shall cover the subject in the next, and final, article within this series.


  1. John, I have mentioned before my belief in the necessity of us knowing who it was that wrote Chapter 2 of AR5, their backgrounds, and your evaluation of their competency. I gather that your criticisms commonly imply that the authors are scholars rather than practitioners and that this explains much of what they omit or give insufficient credit to. This would add another dimension to your study.


  2. Now that I have published the last article in the series, I will look into the background of the Chapter 2 authors to see what I can find. I will post my findings as a comment.


  3. I once looked briefly at authors to find out how many of them cross-correlated with those scientists who are propagating climate catastrophe narrative. There are lists of the authors by chapter etc on the IPCC website. There’s a heck of a lot of them. At the working group level, they are pretty much all scientists, as far as I recall. At the SPM level, there’s only a minority of scientists. As a proportion of the former, those who propagate catastrophe narrative, in searchable English at least on the web, is very small indeed.


  4. John, I’m just catching up, but I’m pretty sure that in the first 4 parts of your series of articles, I’ve seen words to this effect on several occasions:

    “This is particularly problematic when dealing with low probability, high-consequence events, in which the avoidance of the worst case scenario imagined becomes the prime objective. ”

    Can I just be clear, here – can you help? Is it the case that the IPCC is suggesting that “catastrophic climate change” is a low-probability event?


  5. Mark,

    I think that the IPCC believes there to be a continuum of possible scenarios, going from the likely to the unlikely. Some of the unlikely scenarios also happen to be very high consequence. What they are pointing out is that if the consequence is so high as to be unacceptable, no matter what, then there is little consolation in the probability being very low. In other words, utility calculations are not very useful when very low numbers are multiplied by very high numbers. Some people refer to these as catastrophe scenarios but I am not aware if that term is used by the IPCC in AR5. Also, keep in mind that many believe that even the least consequential scenarios are still above the acceptability threshold, and they carry high probabilities, though I think I’m on pretty safe ground saying that the IPCC does not class these as catastrophic. Let’s wait to see what they say in AR6 though.


  6. John:
    “Some people refer to these as catastrophe scenarios but I am not aware if that term is used by the IPCC in AR5. Also, keep in mind that many believe that even the least consequential scenarios are still above the acceptability threshold, and they carry high probabilities, though I think I’m on pretty safe ground saying that the IPCC does not class these as catastrophic.”

    Some time ago I searched the entire AR5 for the context behind every single mention of catastrophe and catastrophic and all the equivalent terms, plus all terms like tipping-point, cascade, and others. There are hundreds of mentions, which include an extremely wide applicability, like say periodic century level catastrophic river flooding (which occurs with or without man-made CC), or the catastrophe a hurricane can visit upon a local population (which again occurs naturally anyhow, whether or not they increase), or ‘catastrophe modelling’ to probe the area, or within the economics chapter normal domain usage of ‘catastrophic financial events’ which amount to a potential recession or a national default whatever. But across all these, nowhere do they ever say anything even slightly close to an interpretation that there is any significant chance of imminent global catastrophe from climate change. No mentions could possibly be recruited for medium or high probability scenarios in any global sense, notwithstanding that increased flooding on xyz coast or drying on mno wetland or SL rise for abc island, is sometimes described as a potential local catastrophe. So I’d say you are dead right that they do not class medium / high probability events (or even low, really, for such things as are tractable to probability) as in any way catastrophic in the sense that the catastrophe narrative does, i.e. meaning a global and civilizational / existential sense. Notwithstanding some nation might lose a treasured wetland or whatever in some scenarios.


  7. @John – thanks for this series – IPCC on “Risk”

    seem to remember a few years back some climate doomsters complaining the IPCC were not “doom” enough, or am I making that up? can’t find a link!!!


  8. Hunter: “seem to remember a few years back some climate doomsters complaining the IPCC were not “doom” enough”

    You remembered rightly. The public don’t generally know that the IPCC science is not full-on doom anyway, nor most politicians and influencers, so such complaints don’t typically come from them. But there is a very small group of scientists, and a larger group of advocates who tend to be more domain knowledgeable, who have called out the IPCC as being too group-think but in the *opposite direction* to the group-think that sceptics complain about. Or even politically motivated / corrupted. Generally, they hang these accusations upon an inappropriate underestimation, in their opinion, of the tipping point considerations. They are on record in public press with these complaints. The public don’t know these individuals are not mainstream, so get confused and think that the doom already predicted must be even more doom-full, if that’s possible; or perhaps with a bit of luck realise it’s all b*ll*cks. I mention some of this group in a post at Climate Etc I think: https://judithcurry.com/2018/11/26/cagw-a-snarl-word/ .


  9. O/T found this from 2020 – https://alumni.berkeley.edu/california-magazine/summer-2020/michael-mann-on-climate-denial-and-doom

    snippet – “including novelist Jonathan Franzen, who wrote in The New Yorker, “The goal [of cutting greenhouse gas emissions] has been clear for thirty years, and despite earnest efforts we’ve made essentially no progress toward reaching it.” Mann, who was contacted by a New Yorker fact checker for the story, said Franzen played fast and loose with facts and misrepresented the spread of possible warming trajectories “in a way that plays to the alarmist narrative.” He sees stories like these as practically “a new genre” in journalism, one which leads to the same place as denialism: disengagement and inaction. And he’s concerned it’s contagious.”

    too late Mike, but good try.


  10. John – I see, the precautionary principle in action, I suppose.

    Thanks for that.

    Personally I consider it rational, rather than intuitive, not to buy a house insurance policy, if the policy costs more than it would cost to re-build my house. I’d rather spend a smaller amount of money on fire safety precautions, flood defences, etc.


  11. @ Dougie: Mann’s new schtick is against what he has christened “inactivists”, folk whose attitude for whatever reason weakens measures to “fight” climate change. The sceptics, er deniers, are enemy number one, but the doom-mongers are apparently now on the same side as the sceptics because they leave the public with the impression that it’s too late to do anything anyway so why bother.

    So on the one hand, you have the sceptics with “it’s not as bad as we thought” and on the other the doom-mongers with “it’s too late to do anything.” The second group one could argue are a function of the barking of alarmist scientists themselves, i.e. Mann and friends.


  12. This article includes the following statement:

    “It is difficult to escape the conclusion that this is a subject that the authors of Chapter 2 have read about, and perhaps even studied as academics, but they have never earned a living from implementing it.”

    I have received a request that I provide information relating to the backgrounds of the lead authors. This I provide below via a collection of links. By way of a summary:

    a) As you would expect, all of the authors have strong academic backgrounds, specializing in a number of relevant areas. It is relatively easy to identify each author with their likely contribution.

    b) In all cases, the individual concerned appears to have little or no practical or commercial experience of implementing the precepts of decision analysis in support of their own decision making: Their understanding is almost entirely theoretical. That said, many of them have acted as consultants to industry or commerce (as is often the case with academics).

    c) Many have a professional background in economics rather than science. The two Coordinating Lead Authors, in particular, are professors of economics.

    d) Omissions in Chapter 2 probably reflect the fact that each of the contributors has restricted themselves to covering their own areas of research interest. No one seems to have stood back to see if there are any resulting gaps in coverage.

    e) At least two of the authors now refer to themselves as Nobel Prize winners because of their involvement in the IPCC.

    The authors concerned are as follows:

    Prof. Howard Kunreuther: https://oid.wharton.upenn.edu/profile/kunreuth/

    Prof. Shreekant Gupta: Shreekant Gupta – Professor – Department of Economics, Delhi School of Economics, Delhi University | LinkedIn

    Prof. Valentina Bosetti: http://faculty.unibocconi.eu/valentinabosetti/

    Prof. Roger M. Cooke: https://www.rff.org/people/roger-m-cooke/

    Ass. Prof. Varun Dutt: https://faculty.iitmandi.ac.in/~varun/

    Prof. Minh Ha-Duong: Minh Ha Duong – Directeur de Recherche – Centre National de la Recherche Scientifique | LinkedIn

    Prof. Dr. Hermann Held: https://www.fnu.uni-hamburg.de/en/staff/held.html

    Juan Llanes Regueiro Phd: https://uwispace.sta.uwi.edu/dspace/handle/2139/12066

    Prof. Dr. Anthony Patt: https://usys.ethz.ch/en/people/profile.anthony-patt.html

    Ass. Prof. Ekundayo Shittu: https://www.seas.gwu.edu/ekundayo-shittu

    Prof. Elke Weber: https://psych.princeton.edu/person/elke-weber

