I was doing a failure mode and effects analysis on my past life the other day and I can tell you the results were not reassuring. They seemed to suggest that my curriculum vitae owed a lot to a faulty career choice in my formative years. Instead of plumping for the obvious attractions of an education in high-integrity systems development, I opted to take a degree in physics; and the rest, as they say, is a chronologically ordered causal narrative. I assure you physics seemed a good decision at the time, but the virtually non-existent impact of my academic efforts appears to be a grim and fossilising testimony to my lack of good judgement. Indeed, it was only relatively late in my career that I finally happened upon my true calling. At last I discovered safety-systems engineering and, in a desperate bid for temporal reparation, I found myself saying, ‘Cut! Can we start that again from the top please, but this time can we leave out the physics bit? Oh, and try putting a bit more fame and fortune into it this time’.
Alas, it is all a bit late now and I’m long since retired. If you find me at all, you will find me reflecting that my life had been a movie for which I hadn’t even passed the audition.
So what can I do now to fashion the myth of filmic success from the dismal reality? How about if I were to start by telling you what my safety-systems engineering career taught me about the eminently more successful Dr Daniel Ellsberg, and what his discoveries mean for climate change policy?
The quite interesting Daniel Ellsberg
Daniel Ellsberg, Ph.D., is quite an interesting character who, despite being an influential thinker in the fields of decision theory and economics, is better known for being the Julian Assange of the 1970s. By releasing the so-called ‘Pentagon Papers’1 to the New York Times, he revealed to the American public that the Johnson administration had systematically lied about the motives for taking the country to war against North Vietnam. However, it is not Ellsberg’s whistle-blowing that demands the attention of the good readers of CliScep – it is his studies regarding the psychology of decision-making. These studies2 posed a paradox that, in its simplest form, can be expressed thus:
Having been told that a bag contains equal numbers of red and black balls, you are asked to stake £100 on the colour of a single ball drawn from it (you win £100 if the ball is your chosen colour but lose your stake if it is not). The exercise is then repeated; this time, however, you are told only that the bag contains red and black balls – you are not allowed to know how many of each. Are you as keen to take the gamble?
In repeated experiments,3 the answer obtained when people were asked was a resounding ‘No!’ It seems that the average Joe is freaked out by not knowing the ratio of the balls, and takes this to represent a bigger gamble. Yet, logically, the subjects had been presented with the same decision. Faced with total uncertainty regarding the ratio, they should treat the gamble as a fifty-fifty choice – which is exactly what they had when they knew that there were equal numbers of each colour. In fact, it is the ambiguity implicit in the set-up of the second gamble that spooks people, and this leads to what is known as ambiguity aversion.4
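The equivalence of the two gambles can be checked with a little arithmetic. Here is a minimal sketch, assuming (as the argument above does) that total ignorance about the unknown bag is modelled as a uniform prior over every possible composition of a 100-ball bag – the so-called principle of indifference:

```python
from fractions import Fraction

STAKE = 100  # £100 staked; win £100 on the chosen colour, lose £100 otherwise

def expected_value(p_win):
    """Expected payout given probability p_win of drawing the chosen colour."""
    return p_win * STAKE - (1 - p_win) * STAKE

# Gamble 1: the ratio is known to be fifty-fifty.
known = expected_value(Fraction(1, 2))

# Gamble 2: the ratio is unknown. Averaging over all compositions of a
# 100-ball bag, each treated as equally likely, the chance of red is 1/2.
n = 100
p_red = sum(Fraction(k, n) for k in range(n + 1)) / (n + 1)
unknown = expected_value(p_red)

print(known, unknown)  # both 0 - the two gambles are logically equivalent
```

The bag size of 100 is an arbitrary illustrative choice; any size gives the same average probability of one half, which is the point of the paradox.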
I tried this out on my wife last night by asking her what she thought was the bigger gamble (choosing with or choosing without knowledge of the number of balls). She knew ‘for a fact’ it was the latter and took my insistence that they were logically equivalent as further evidence that she had married the village idiot. My counter proposal that she had, in fact, married a genius led to an interesting debate, but the matter remains unresolved.
But why is this a paradox?
Thus far, you may be wondering why I am referring to this phenomenon as a paradox (if you are not, you can safely skip this section).
Well, with a slightly more elaborate version of the game, you can set up a situation in which an individual’s aversion to ambiguity can lead to the acceptance of a lesser pay-out, even though it would only require the sense that God gave geese to work out what would have been the better decision. Even more puzzling, it leads people to express self-contradictory preferences that violate utility theory (and the economists’ faith in the truth of utility theory, it appears, is strong enough for them to see any violation as a paradox). Consider the following:
You have a bag containing 30 red balls and 60 other balls that are either black or yellow (you don’t know how many yellow balls there are or the number of black balls, but you do know that the total number of black and yellow balls equals 60). You are now given a choice between two gambles that a gambler should treat as equivalent:
Gamble A: You win if you draw a red ball
Gamble B: You win if you draw a black ball
Subsequently, you are given the choice between a further two gambles that should also be treated as equivalent:
Gamble C: You win if you draw a red or yellow ball
Gamble D: You win if you draw a black or yellow ball
As with Gambles A and B, there is nothing in the set-up that should suggest to the gambler that Gambles C and D are anything other than equivalent (notwithstanding the gambler’s personal theory). However, when asked to state a preference, most people strictly prefer Gamble A to Gamble B whilst then preferring Gamble D to Gamble C. This, once again, demonstrates the average individual’s aversion to those gambles with the greater component of epistemic uncertainty. The paradox lies in the fact that having a consistent personal theory of the expected utilities for each gamble (which would at least be a logical position to take) means that a preference for Gamble A over Gamble B should actually be accompanied by a preference for Gamble C over Gamble D.5
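The inconsistency described in footnote 5 is easy to verify arithmetically. A minimal sketch, assuming only that a consistent gambler assigns some subjective probability p to drawing a black ball (everything else follows from the known composition of 30 red and 60 black-or-yellow balls):

```python
from fractions import Fraction

P_RED = Fraction(1, 3)  # 30 red balls out of 90

def preferences(p_black):
    """Return (EV(A) - EV(B), EV(C) - EV(D)) for subjective P(black) = p_black."""
    p_yellow = Fraction(2, 3) - p_black  # black + yellow must total 2/3
    ev_a = P_RED                  # Gamble A: win on red
    ev_b = p_black                # Gamble B: win on black
    ev_c = P_RED + p_yellow       # Gamble C: win on red or yellow
    ev_d = p_black + p_yellow     # Gamble D: win on black or yellow (always 2/3)
    return ev_a - ev_b, ev_c - ev_d

# Whatever probability the gambler assigns to black, the two differences
# are identical: preferring A to B logically entails preferring C to D.
for k in range(7):
    p = Fraction(k, 9)            # sample p_black across its range 0..2/3
    diff_ab, diff_cd = preferences(p)
    assert diff_ab == diff_cd
```

Both differences reduce to 1/3 minus the assumed probability of black, so no single assignment of probabilities can justify the commonly observed pattern of A over B together with D over C.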
So what has Ellsberg’s Paradox got to do with climate change?
If Ellsberg’s Paradox teaches us anything, it is that our aversion to uncertainty makes us do stupid things. We like to think we are rational creatures, deciding upon courses of action that promise the greatest benefit with the minimum of risk. But sadly this is not true. Most people who claim to be making a risk-based decision are doing nothing of the sort – they are, instead, being ambiguity averse. They seek the course of action that involves the least ambiguity and they wouldn’t have a clue what the risks truly are.
Ambiguity aversion is what we sceptics stand accused of when we insist on the removal of uncertainties before committing to climate change policies. In contrast, those demanding action presuppose that the delays incurred will heighten the risk, and so a precautionary approach is advocated.6 However, those who advocate such precaution are running just as scared of ambiguity as the sceptics are. They would much rather go for the devil they know than accept the psychological torment that comes with the uncertainty of catastrophe. At the end of the day, this is just Pascal’s wager.
Removing the ambiguity
Of course, the other way of avoiding having to deal with ambiguity is to convince oneself it doesn’t exist – a strategy that seems to lie behind much of the rhetoric that increasingly dominates the climate change debate. It seems that the possibility of future catastrophe is no longer motive enough to take evasive action; what we need instead is the certainty of such catastrophe. Surely, according to the rhetoric du jour, there is no uncertainty and it is not a matter of risk. In fact, we are not even talking about the future. Just look at what is happening right now! And yet, in the midst of all of this, we still have the conspiracist merchants of doubt, seemingly unaware that there is no longer any room for their merchandise. Why aren’t these people in jail already?
Well, this might be the sort of self-assured narrative that suits those who do not like ambiguity, but it certainly doesn’t help when the reality is that we are attempting a risk-based decision in the face of deep uncertainties that obscure the levels of risk. One can speak of climate change deniers, but there is nothing more denialist than the construction of a fantasy world bereft of ambiguity, just because one desperately seeks a gamble that is free from it.
There is no denying that the ambiguity of the climate change evidence is a bad thing, basically because it can lead to the miscalculation of risk on both sides of the debate. Both sides are averse to the ambiguity and both have their strategies for dealing with it: the advocates for action seek to exploit ambiguity or to ignore it; the sceptics want to see it reduced. There are potential consequences either way, but if the sceptics prove to be misguided, then at least it will be an honest mistake. Those who deny that the ambiguity exists have no need of such honesty because they have God on their side.
Robust decisions in a fragile world
The potential effects of failure can indeed be difficult to analyse, and history can be a very harsh judge. When I look back upon my career-defining decisions it is tempting to regret those that appeared to delay any form of ultimate fulfilment. But this misses the point. My decision to study physics was taken at a time in my life when experience was at a premium and the evidence for the best route forward was replete with ambiguity. In such circumstances it made sense not to commit too early to too focused a course of action. The decision to study physics was a robust one because it remained valid for the widest range of possible futures. It is easy for me in my dotage to scoff at my younger self but, by doing so, I give myself no credit for pragmatism and adaptability.
As with career decisions, so with climate policy. The aim should not be to minimise, avoid or deny ambiguity; it should be to embrace it by employing a robust decision-making strategy. Fittingly, it is the RAND Corporation that is leading the way here with its Robust Decision-Making (RDM) tools and methodologies. Speaking on behalf of the corporation, Professor R. J. Lempert writes:
“We agree that significant uncertainty exists regarding the future impacts of climate change and the costs of avoiding those impacts, that it is dangerous to ignore or downplay that uncertainty, and that acknowledging these uncertainties can provide a strong foundation for dialogue… To help decision makers, planners and investors make decisions in uncertainty, various methods and tools exists to identify typically robust or adaptive plans.”
Indeed, various methods and tools do exist to identify robust and adaptive plans, but you wouldn’t think so listening to those who prefer the unambiguous pathway. Which is a shame, because the powers that be, clamouring for solutions in a fit of Torschlusspanik, run the risk of killing the patient to cure the disease. This is not a recipe for success. It is a recipe for regret.
1 These were top secret documents to which he had access as a member of the RAND Corporation.
2 Ellsberg, D. (1961). “Risk, Ambiguity, and the Savage Axioms”, Quarterly Journal of Economics 75 (4): 643–669.
3 Ellsberg’s Paradox was originally conceived as a thought experiment but it has since been confirmed empirically many times over.
4 Another name for this is uncertainty aversion. Whatever you call it, it does appear to be a real phenomenon; fMRI scanning has demonstrated that the human brain does indeed behave very differently when forced to make decisions in the face of ambiguity (ref. Hsu, M. and C.F. Camerer (2004), ‘Ambiguity Aversion in the Brain’, Academy of Behavioral Finance & Economics).
5 For those of you who are interested in the technicalities of decision theory, the preference for Gamble D over Gamble C involves a violation of a dominance principle known as L. J. Savage’s Sure-thing principle.
6 Actually, if the actions proposed were not so obviously self-destructive, I might have some sympathy with such logic.