Judith Curry has posted an interesting article on the mischaracterisation of climate risk, and I recommend that you read it. I was tempted to post a comment but experience has highlighted the perils of commenting on a website when one cannot influence the moderation policy. I shall therefore limit myself to passing on my main observation here.
Dr Curry remarks that:
“Many social scientists have argued that the disciplinary constrictions imposed by the IPCC and UNFCCC have neglected many important insights arising from a wide range of expert and unaccredited sources.”
However, you don’t have to be a social scientist to appreciate that the usual treatment of risk in relation to the climate change problem leaves a lot to be desired. I suspect that anyone outside of climate science who has a background in risk management would be equally concerned, should they be willing to pay close enough attention. The concerns I have are numerous, but I would suggest that the most basic errors commonly made in the climate change debate include:
- A failure to appreciate that whilst risk is predicated upon uncertainty it is not a function of uncertainty. When probabilities are insufficiently known for the purposes of risk assessment, decisions are necessarily driven instead by a consideration of uncertainty, and yet in such circumstances the climate science community still claims to be making risk-based decisions.
- A failure to appreciate the extent to which scientific uncertainty can be both aleatoric (due to variability) and epistemic (due to incertitude). A preoccupation with the objectivity afforded by an aleatoric analysis leads to the false claim that scientists alone know how to properly analyse uncertainty. By contrast, the layperson is supposed to suffer from a misguided focus upon the epistemic, resulting in an unscientific subjectivity. This viewpoint can be seen, for example, in the writings of Peter Gleick. In truth, there is no such thing as a scientist’s conception of uncertainty that may be distinguished from the unscientific.
- A failure to adequately distinguish the epistemic and aleatoric components of uncertainties prior to their analysis and propagation. This can lead to inappropriate analytical methods being employed. For example, there is the mistake of using aleatoric approaches for the aggregation of climate model ensemble outputs even though this has no epistemic validity.
- A failure to understand the interplay between epistemic and aleatoric uncertainty. This can lead to errors such as that made by Stefan Lewandowsky when he maintains that an ordinal approach to risk analysis demonstrates that increased uncertainty regarding ECS values necessarily increases risk (and, hence, uncertainty can be viewed as ‘actionable knowledge’). His ‘proof’ fails to appreciate that a decrease in epistemic uncertainty can accompany both a widening and a narrowing of a probability distribution, or indeed leave it unaltered. The same is true for an increase in epistemic uncertainty.
- A failure to appreciate the interplay between evidential weight and consensus when attributing confidence levels. This error is particularly disconcerting since it can be found in the guidance provided by the IPCC to its authors to help them decide the confidence to be attributed to statements. In particular, the method prescribed allows for high confidence levels to be attributed to situations where good consensus prevails in the face of poor evidence. Equally, according to the guidance, low confidence can be inferred when low consensus accompanies good evidence. Neither situation should appertain, and the guidance suggests an insufficient understanding of evidence theories and a worrying failure to take account of the many factors that drive the development of consensus within a social enterprise.
- Failure to employ a well-defined risk management framework. The IPCC claims to have introduced a ‘unified risk management framework’ within AR6, but a close inspection reveals that it has done nothing of the sort. Instead, there is nothing more than a review cycle to enable continuous monitoring of the IPCC’s success or otherwise in persuading communities to accept and act upon its policies. A risk management lifecycle should be defined in accordance with best practice within other disciplines such as safety management.
- A disturbing preoccupation with the psychology of risk perception and the benefits of exploiting cognitive bias in a way that facilitates the implementation of IPCC policy. This was first evident in AR5, WG3, Chapter 2, and the fruition and deployment of the ideas can be seen in AR6. Too much attention is paid to how cognitive bias may feature in the acceptance of IPCC policy and not enough is paid to the role it may have played in its development.
- Insufficient interest in techniques employed in other fields, e.g. causal modelling and non-probabilistic approaches to uncertainty analysis.
- A failure to establish basic principles of risk acceptance policy, such as the principle that net risk shall not increase as a result of a given risk management intervention. This can be seen in the failure to properly appraise transition risk. Transition risk is too easily characterised as an acceptable and predictable price to pay to avoid an uncertain, non-ergodic climate risk, when in fact the transition risks are often equally uncertain and non-ergodic in their nature. No risk management framework that fails to place this principle at its heart is worthy of the name.
- A failure to appreciate the concept of risk efficiency and its central importance in risk management. Too often the risks are treated as potentially catastrophic and to be avoided at all costs, without regard for the need to seek the most cost-effective solutions. Risk efficiency considerations may lead to a preference for adaptation rather than a costly avoidance of risk.
- A failure to understand that the robust decision is not the one that leads to the greatest reduction in risk; it is the one that remains valid for the widest range of possible futures. It is all about minimising the regret function and, in this regard, it is sometimes the sufficient that matters more than the optimal.
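The point about widening distributions can be put in toy numbers. The following is a sketch of my own (not Lewandowsky’s actual analysis), assuming a hypothetical impact threshold and normal distributions, where ‘risk’ is taken as the probability of exceeding the threshold:

```python
# Illustration: widening a probability distribution does not necessarily
# increase risk. Here risk = P(impact > threshold) for a normal distribution.
# The threshold and the means/spreads below are hypothetical values.
import math

def p_exceed(mean, sd, threshold):
    """P(X > threshold) for X ~ Normal(mean, sd)."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

THRESHOLD = 3.0  # hypothetical 'dangerous' impact level

# Mean below the threshold: widening the distribution raises the tail risk.
print(p_exceed(2.0, 0.5, THRESHOLD) < p_exceed(2.0, 1.5, THRESHOLD))  # True

# Mean above the threshold: widening the distribution lowers the tail risk.
print(p_exceed(4.0, 0.5, THRESHOLD) > p_exceed(4.0, 1.5, THRESHOLD))  # True
```

So whether greater spread means greater risk depends on where the distribution sits relative to the threshold, which is why an ordinal argument from uncertainty alone cannot settle the matter.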
Most of the above errors could be avoided if the community seeking to manage climate risk were more open to the views of “a wide range of expert and unaccredited sources”. In the meantime, I have only one verdict to offer the supposed experts who characterise climate risk as a no-brainer requiring an accelerated transition to Net Zero:
Must try harder.
John: Very helpful. Judith would I’m sure welcome a comment from you. Pointing here, for example.
Should “resulting in an unscientific subjectively” end with a noun, like subjectivity?
I will probably take your advice, and you are correct, that was a typo. Auto correct can be a right pain.
I tend to be more interested in why the failures occur, rather than what they are. However, this list seems to me about as fundamental as it could get. From both points of view, I find it tempting to summarize by lopping the last word off your own 3-word verdict.
I have no deep insights to offer as to why the mistakes are made, other than to reiterate that there seems to be an overconfidence within the climate science community. They seem to think they have nothing to learn from looking at how others do things. In the meantime, those who could help have to line up with all the others whom the media would brand as upstart laymen who dare to challenge the experts. 😦
John, my eyes glaze over with too many references to aleatoric and epistemic risk. I know I should be able to cope, but I have a blind spot here. Nevertheless, my weakness notwithstanding, your final three bullet points really resonated with me.
John – when you say (pretend I’m thick as a brick) –
“A failure to appreciate that whilst risk is predicated upon uncertainty it is not a function of uncertainty. When probabilities are insufficiently known for the purposes of risk assessment, decisions are necessarily driven instead by a consideration of uncertainty, and yet in such circumstances the climate science community still claims to be making risk-based decisions.”
can you elaborate or give examples?
Take the game of Russian roulette. It is a game predicated upon uncertainty in which the risk is very easy to calculate (assume a six shot revolver), i.e. there is a one sixth chance of certain death. Although the game is predicated upon uncertainty, it is the probability that is used to calculate the risk. The reason I say this is because you can imagine a modified version of the game that would be more representative of real life, in which the player is not allowed to know the number of bullets in the revolver. In this version of the game an uncertainty can be calculated which reflects the range of possible probabilities (i.e. a risk profile can be constructed by positing the full range of possibilities ranging from zero to six bullets). But the actual risk being taken is not known precisely.
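The modified game can be sketched numerically. This is just a toy illustration of the description above, assuming a six-chamber revolver and an unknown bullet count from zero to six:

```python
# Modified Russian roulette: the player does not know how many bullets
# (0 to 6) are in a six-chamber revolver. Each possible bullet count gives
# a different death probability; the spread of those probabilities is the
# epistemic uncertainty about the risk actually being taken.

CHAMBERS = 6

# Death probability for each possible bullet count (the 'risk profile').
risk_profile = {bullets: bullets / CHAMBERS for bullets in range(CHAMBERS + 1)}

for bullets, p in risk_profile.items():
    print(f"{bullets} bullets -> P(death) = {p:.3f}")

# The classic game (exactly 1 bullet) pins the risk down at 1/6; the
# modified game only tells us the risk lies somewhere between 0 and 1.
print(f"Classic game risk: {risk_profile[1]:.3f}")
```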
In the general case, there will be a probability distribution, the spread of which is a measure of the uncertainty. The greater the uncertainty, the less is known about the actual risk. The important thing to appreciate is that the uncertainty and the risk are two different things. For example, when considering a range of values of differing impact, one can imagine a wide probability distribution centred on the low impact values or a narrow probability distribution centred upon high impact values. The former is a high uncertainty / low risk scenario, and the latter is a low uncertainty / high risk scenario. In the extreme case when even the shape of the curve is not known, the precautionary principle is applied. Some would also apply this principle even if the curve is known but it has a fat tail that encompasses very high impact. People still say they are taking a risk based decision even when the risks are a matter of complete speculation.
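The wide-but-low versus narrow-but-high contrast can also be put in toy numbers (the impact values below are purely illustrative):

```python
# Illustration: uncertainty (spread) and risk (expected impact) are two
# different things. Two hypothetical impact distributions, given as
# equally likely outcome values:
import statistics

# Wide distribution centred on low impacts: high uncertainty, low risk.
wide_low = [0, 1, 2, 3, 4, 5, 6]
# Narrow distribution centred on high impacts: low uncertainty, high risk.
narrow_high = [9, 10, 11]

for name, dist in [("wide/low", wide_low), ("narrow/high", narrow_high)]:
    print(name,
          "expected impact:", statistics.mean(dist),
          "spread:", round(statistics.pstdev(dist), 2))
```

The first distribution has the greater spread (uncertainty) but the smaller expected impact (risk); the second, the reverse.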
Hope this helps.
PS. I could summarise by saying that there are two questions one should ask of oneself:
1) How much risk do I want to take?
2) How certain do I need to be about the risk I am taking?
People instinctively feel that an unknown risk is necessarily a greater risk, but this is not the case. The effect is known as Ellsberg’s Paradox, which I explored in this article:
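The Ellsberg setup can be sketched as follows. This is a toy version, assuming the standard two-urn framing (a known 50/50 urn versus an urn of unknown composition, with every composition treated as equally likely):

```python
# Ellsberg-style illustration: a 'known' urn with 50 red / 50 black balls
# versus an 'unknown' urn with an unspecified red/black mix of 100 balls.
# Averaged over all possible mixes, the chance of drawing red from the
# unknown urn equals that of the known urn, yet people typically prefer
# the known urn (ambiguity aversion).

known_p_red = 50 / 100

# Unknown urn: assume (for illustration) each composition of 0..100 red
# balls is equally likely, and average the resulting probabilities.
compositions = range(101)
unknown_p_red = sum(k / 100 for k in compositions) / len(compositions)

print(known_p_red, unknown_p_red)  # both 0.5
```

The two bets carry the same expected probability of winning; the preference for the known urn reflects aversion to epistemic uncertainty, not a genuinely greater risk.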
Also, I should have pointed out that the spread of the probability distribution is only a provisional measure of uncertainty since one can expect it to change shape as more data is collected. This is a reflection of the extent to which a probability distribution does not fully capture the uncertainty at any given time (the epistemic component is not fully captured). But do not make Lewandowsky’s error and assume that removing data will necessarily spread the probability distribution.
People assume tails will get fatter, but it seems the opposite is happening. Extremes can increase while the harmful extremes decrease. ‘Extreme’ is just a statistical bin. The shape (and change) of the distribution matters. E.g. is flooding decreasing despite an increase in extreme precipitation?
Thank you for your comment. I tend not to get too involved in examining statistical trends; there are those on this website who do such a better job than could I. However, the example you give is important because it draws attention to the distinction to be made between a weather event (rainfall) and a social impact event (flooding). The former is a causal factor for the latter but, as Judith points out, if one is to properly characterize the risks for the latter, one will need a much more sophisticated causal model than the one usually promoted by the media.
You can’t determine risks for other people.
Determining risk for the entire world is, simply, insanity.
I propose that no one knows what global warming is.
And I propose that Venus, at Earth’s distance, would be colder than Earth is.
I give some clues.
Earth is in an Icehouse climate.
Peak Holocene was about 8,000 years ago.
Venus, at Venus’s distance from the Sun, absorbs and emits less sunlight than Earth does;
therefore, at Earth’s distance, it would absorb and emit even less.