If there is one thing guaranteed to draw the attention of a climate sceptic it is the production of half-baked psychological analysis fabricated to convince the broader public that scepticism has no intellectual or rational basis. It seems at times that the whole of the psychology profession is committed to the cause, but there are three luminaries in particular who shine out in this galaxy of dedicated ne’er-do-wells: John Cook, Stephan Lewandowsky and Sander van der Linden. All three, in their way, are guilty of adding to the pile of dodgy ‘best practice’ within the field of climate denial debunking, a pile that was started in 2011 when Cook and Lewandowsky published the first incarnation of their Debunking Handbook. Its introduction made clear the nature of the challenge:
Refuting misinformation involves dealing with complex cognitive processes. To successfully impart knowledge, communicators need to understand how people process information, how they modify their existing knowledge and how worldviews affect their ability to think rationally. It’s not just what people think that matters, but how they think.
This introductory insight is all very well, but it was followed by a set of guidelines focused entirely upon countering a so-called backfire effect, an effect in which the very mention of the myth to be debunked tends to reinforce the subject’s misinformed belief. Good stuff, one might think, until one discovers that all subsequent efforts to empirically confirm the existence of the backfire effect have failed – a fact that had to be sheepishly acknowledged in a highly revised incarnation of the handbook. It was a severely embarrassing exposure of two so-called experts peddling snake oil, but in the world of climate science advocacy no-one seemed to notice. On the contrary, fast forward to 2024 and we now find one of the two authors, John Cook, in search of the “holy grail of fact-checking”, taking it to the next level by claiming “effective, corrective interventions…deployed at scale and faster than misinformation can spread”. Cook et al have dubbed their great new idea generative debunking:
This paper presents efforts towards the completion of this “holy grail” by synthesising generative AI with past research on climate contrarian claim classification and fallacy detection, in an approach we call generative debunking.
Central to this strategy is the notion of the ‘truth sandwich’. The paper’s abstract explains:
Automatic detection and correction of misinformation offers a solution to the misinformation problem. This study documents the development of large language models that accept as input a climate myth and produce a debunking that adheres to the fact-myth-fallacy-fact (“truth sandwich”) structure, by incorporating contrarian claim classification and fallacy detection into an LLM prompting framework.
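For those unfamiliar with how such systems are bolted together, the ‘prompting framework’ amounts to little more than a fill-in-the-blanks template handed to a chatbot. The sketch below is my own illustration, not the authors’ code; the template wording and function names are invented for the purpose:

```python
# A minimal sketch (mine, not the paper's) of a "truth sandwich" prompt.
# The template text and function names are hypothetical illustrations.

TRUTH_SANDWICH_TEMPLATE = """Debunk the climate myth below using the
fact-myth-fallacy-fact structure:
Fact: open with the relevant factual point.
Myth: restate the myth briefly.
Fallacy: name the reasoning error (e.g. {fallacy_label}) and explain it.
Fact: close by repeating the factual point.

Myth: {myth}
"""

def build_prompt(myth: str, fallacy_label: str) -> str:
    """Fill the template with an input myth and a pre-assigned fallacy label."""
    return TRUTH_SANDWICH_TEMPLATE.format(myth=myth, fallacy_label=fallacy_label)

if __name__ == "__main__":
    print(build_prompt("CO2 is just plant food", "cherry picking"))
```

The structure is fixed before any model is consulted; whatever the LLM returns is simply poured between the two slices of bread.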
Now, you might have thought that John Cook’s track record of latching on to notions that don’t pass muster in the laboratory would mean that no-one would want to involve themselves with him again. Unfortunately, the psychology profession seems incapable of learning. So, after the debacle of the backfire effect we now find ourselves being invited to place our trust in the idea of a ‘truth sandwich’. But what is the evidence for the practical value of this epistemological delicacy? What, for example, does a study ominously titled “The truth sandwich format does not enhance the correction of misinformation” have to say in its abstract?
The “truth sandwich” correction format, in which false information is bookended by factual information, has frequently been presented as an optimal method for correcting misinformation. Despite recurring recommendations, there is little empirical evidence for enhanced benefits.
Surprise, surprise! Not content with the ‘backfire effect’ fiasco, John Cook is now pushing an idea, which he describes in his paper as being “structured and psychologically grounded”, but which turns out to lack empirical grounding. It’s just garbage, bereft of evidential support. The fact-myth-fallacy-fact structure is a sandwich but it is one with little nutritional value. This fact is exposed by the data. Unfortunately for Cook, whilst he is very hot on psychological speculations such as backfire effects and truth sandwiches, he is not so interested in laboratory confirmation. Consequently, he has the habit of placing false conjecture at the heart of his strategies, and that is not a good thing. But it gets worse – much worse.
Anyone who knows the basics of LLMs will know that their output is only as reliable as the training data. So, it matters a great deal to know upon which diet Cook and company placed their LLMs. The paper reassures us as follows:
Specifically, we build upon the CARDS (Computer Assisted Recognition of Denial & Skepticism) classifier which was developed to detect specific contrarian claims about climate change (Coan et al., 2021; Rojas et al., 2024), and the FLICC model (Zanartu et al., 2024) that detects fallacies in climate misinformation, such as Fake experts, Logical fallacies, Impossible expectations, Cherry picking, and Conspiracy theories (Cook, 2020).
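To see how those two components are meant to slot together, here is a rough sketch of the wiring. To be clear, this is my own back-of-an-envelope illustration: classify_claim and detect_fallacy are stand-ins for the CARDS and FLICC models, stubbed with placeholder outputs, and the prompt is generic rather than the authors’ own:

```python
# A back-of-an-envelope sketch of the pipeline described above. The stubs
# below stand in for the CARDS claim classifier and FLICC fallacy detector;
# their outputs are placeholders, not the real taxonomy labels.

def classify_claim(text: str) -> str:
    """Stub for a CARDS-style contrarian-claim classifier."""
    return "CO2 is beneficial"          # placeholder claim category

def detect_fallacy(text: str) -> str:
    """Stub for a FLICC-style fallacy detector."""
    return "cherry picking"             # placeholder fallacy label

def generate_debunking(myth: str, llm) -> str:
    """Chain claim classification and fallacy detection into an LLM prompt."""
    claim = classify_claim(myth)
    fallacy = detect_fallacy(myth)
    prompt = (f"Claim category: {claim}\n"
              f"Detected fallacy: {fallacy}\n"
              f"Write a fact-myth-fallacy-fact debunking of: {myth}")
    return llm(prompt)                  # llm is any callable returning text

if __name__ == "__main__":
    echo = lambda p: p                  # placeholder in lieu of a real model
    print(generate_debunking("CO2 is just plant food", echo))
```

Note what never appears anywhere in that chain: any check on whether the generated ‘facts’ are themselves true.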
You have probably come across the FLICC model before here on Climate Scepticism, in which case you already know what I think of it. In case you were wondering about the CARDS dataset, the paper explains as follows:
Additionally, we have developed a dataset of gold-standard truth sandwich debunkings and gold fallacy labels for 62 instances of misinformation, referred to as the CARDS-examples dataset. The debunkings were created by a misinformation expert who has taught climate debunking in a Massive Open Online Course that has received over 51,000 enrolments, and is a co-author of this paper.
Gold-standard? Climate misinformation expert? Remember, this is Cook writing about himself and the stuff he helped produce (alongside Michael Mann and Travis Coan, as it happens), so maybe we ought to suspend our judgement and look at how the so-called generative debunking performed under test. Such an evaluation is discussed in the paper but, unfortunately, the testing appears to be an exercise in which the authors marked their own homework and, try as I might, I can’t fathom exactly what they were hoping to achieve by this. Amongst all of the conclusions drawn, perhaps the most significant was:
Considering the propensity of LLMs to generate fluent and seemingly plausible, but incorrect facts, this presents a major challenge for evaluating climate misinformation debunkings.
Yes, you could say that. But the way I would put it is that the LLMs are only mindlessly constructing responses that are faithful to the supposedly fluent and plausible debunkings they are trained upon. The use of LLMs doesn’t really add anything to the equation other than taking advantage of automation bias, a cognitive bias in which the unaware place too much trust in the output of computers. Even the Cook et al paper seems to recognize this in its concluding remarks:
Our results point to major challenges in automatic debunking and concrete directions for future work, including an improvement of generated facts in specificity and relevance as well as the challenge of validating debunking systems with non-expert annotators.
The problem is that debunking is a problematic idea in the first place and no amount of automation is going to address this. Yes, automation will enable the debunking to keep pace with the spread of misinformation, but that is no improvement when the ‘debunking’ itself can be part of that misinformation. The masterstroke here is that psychology experts, who, it would appear, are not even properly aware of the findings in their own field, have managed to negotiate a position in which their own opinions on the rights and wrongs of climate science are treated as an axiomatic set of truths that can be used to sandwich and debunk any contrary view. It’s a situation where an individual only has to write a paper referring to himself as a ‘misinformation expert’ capable of ‘gold-standard’ framing of the issues, and the rest of the world is expected to cede to him the privileged role of master of LLM bias. This paper, and its notion of generative debunking, is not a serious scientific study; it is a potentially damaging ego trip taken by individuals who can’t even be bothered to take the first steps towards verifying the psychological grounding of their debunking strategies.
Backfire effect indeed! Truth sandwich indeed! Generative debunking indeed! Will this nonsense ever end?
It’s difficult to know what Cook et al are up to. AI has great potential, both for good and for harm. As you have already established, some versions of it dismiss Cliscep as a contrarian website, unworthy of being taken seriously or bothered with. Perhaps we are, perhaps we’re not. It all depends on your point of view. Ultimately there are precious few hard facts, many legitimate (and illegitimate) opinions, and many shades of grey.
So what can LLMs add to all this? Very little, if anything, I suspect. I can’t help feeling that this is the psychologist’s version of rapid weather attribution. They regard our views, which keep popping up with annoying frequency and growing support, as a game of whack-a-mole. AI is to be used to whack the contrarian moles with ever greater speed.
Fortunately it won’t work.
Mark,
I can’t help feeling that this is the psychologist’s version of rapid weather attribution.
I had exactly the same feeling, and I was sorely tempted to draw that comparison in the article. Cook et al obviously feel they are in a shouting match, and they are keen to gain possession of the biggest loud hailer they can get hold of. It’s all part of the same psyops that the Institute of Strategic Dialogue is involved in, and — dare I say it? — the BBC. The name of the game is to get in there first and make sure you hold the keys to the Ministry of Truth.
I guess it’s a bit like Marianna Spring being a “Disinformation Specialist” for the BBC. A sort of poacher turned… erm, poacher!
https://spectator.com/article/bbc-disinformation-correspondent-accused-of-lying-on-her-cv/
Lost for words that Cook can still turn out drivel & get it published.
1 quote/example from the paper –
“As far as green plants are concerned, CO2 is not a pollutant, but part of their daily bread like water, sunlight, nitrogen, and other essential elements.

Fact – While increased CO2 levels can enhance plant growth, they also have negative impacts on ecosystems, such as promoting the spread of invasive species and increasing the severity of plant diseases. Moreover, elevated CO2 levels contribute to climate change, leading to record high temperatures, ocean heat, and sea level rise.

Myth – Green plants require CO2 for growth, similar to how humans need water and food. Plants thrive at higher CO2 levels, which existed in the past and are reproduced in some greenhouses, resulting in improved growth and yields.

Fallacy – This argument oversimplifies the complex relationship between CO2 and plant growth, ignoring the negative impacts of increased CO2 levels on ecosystems and the overall climate system. While it’s true that CO2 is essential for plant growth, artificially elevating its levels in greenhouses does not account for the broader consequences of climate change, such as heat stress, drought, and extreme weather events, which have negative impacts on agriculture and ecosystems.

Fact – Although higher CO2 levels boost plant growth, they also facilitate invasive species and disease spread, and exacerbate climate change effects, including temperature, ocean heat, and sea level rise.

Figure 1: An example input myth (top, dark gray) and fact-myth-fallacy-fact (“truth sandwich”) debunking generated by our model (bottom)”
He needed a model to regurgitate nothing we haven’t heard before.
Mark Hodgson’s comment that it is difficult to know what Cook et al are up to was easy to answer, I thought, and DF Hunter’s comment on examples demonstrated that.
Note that Cook’s “sandwich model” requires a statement previously determined as sceptical to allow a “popup” refutation. Then, all that’s needed is to misrepresent some sceptical criticism so it’s easily refuted. In other words, the old straw man technique – which is exactly what Cook does here.
A myth that is, in fact, a fact.
In fact, the fact seems to be rather more sandwiched in than the myth in this instance. More CO2 leads to invasive species? Show me a plausible mechanism for this that is not absolutely overwhelmed by factors with a thousand times as much power. This fails the sniff test – or should I say, I engage in instinctive rejection of this idea that threatens my worldview because I’m a denialist?
I have rather glossed over a number of relevant issues in my article simply by saying debunking is ‘problematic’. However, the article has already attracted a number of comments that point towards the nature of the problems, ranging from strawman arguments to using untruths in an attempt to create a ‘truth sandwich’. But the brazen deviousness of the debunker doesn’t end there. The psychologists are so anxious regarding exposure to ‘myths’ that they have even gone so far as to propose techniques that avoid such exposure altogether. What they propose is just naked censorship, although to give it ‘psychological grounding’ they have termed it ‘critical ignoring’:
https://journals.sagepub.com/doi/10.1177/09637214221121570
The basic concept behind critical ignoring is that of the ‘reliable source’. By knowing in advance which sources are reliable or unreliable, one can avoid reading the ‘myths’ altogether. According to the advocates of critical ignoring, this has the advantage of rendering critical thinking redundant. In fact, if one attempts to apply critical thinking to evaluate what one is reading, the game is already over because one’s brain can’t help having been infected by the bad stuff.
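Reduced to its essentials, the recipe could hardly be simpler. The following is my own illustrative sketch (the source list and function name are invented), but it captures the move: content is accepted or discarded on its source label alone, before a word of it has been read:

```python
# An illustrative sketch (mine, not the proponents') of "critical ignoring"
# reduced to code: material is kept or discarded on its source label alone,
# before any of its content is read or evaluated. The allowlist is invented.

APPROVED_SOURCES = {"ipcc.ch", "nasa.gov"}

def critically_ignore(source: str, content: str):
    """Return content only if its source is pre-approved; otherwise discard it."""
    if source not in APPROVED_SOURCES:
        return None                      # never read, never evaluated
    return content

if __name__ == "__main__":
    print(critically_ignore("cliscep.com", "Some sceptical argument..."))  # None
```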
I don’t need to point out to anyone here just how dangerous and pernicious this idea is. But the evidence would seem to suggest that it is already well and truly programmed into the workings of the LLMs. Only recently, an LLM told me this:
Yes, you read that correctly: “regardless of the author’s stated intentions”.
…to the advocates of critical ignoring, this has the advantage of rendering critical thinking redundant…
I assume, John, that those words represent your paraphrasing of the issue. However, if your summary is anywhere close to being a fair representation of the argument, then we are getting into very dangerous territory.
Critical thinking is surely at the heart of science. Critical thinking must be encouraged, rather than discouraged. The idea that what is good for society is a mass of people who shun critical thinking in favour of accepting an approved narrative from “safe and trusted” locations, and only from such locations, is quite appalling. If that’s what Cook et al are up to, then it’s profoundly worrying.
Mark,
You’re quite right, it is my paraphrasing, and it isn’t one that the proponents of critical ignoring would accept, hence the following title of an article in The Conversation:
When critical thinking isn’t enough: to beat information overload, we need to learn ‘critical ignoring’
https://theconversation.com/when-critical-thinking-isnt-enough-to-beat-information-overload-we-need-to-learn-critical-ignoring-198549
However, upon reading the article it is clear that the proponents are advocating that critical ignoring be used to undertake a role that should be covered by critical thinking, and they offer two reasons for this:
So yes, they talk about critical ignoring as being additional to critical thinking, but I say that if one can’t, or indeed shouldn’t, use critical thinking to determine the valuable information, then what role is left for it? At the very least, I would argue that by employing critical ignoring they are rendering critical thinking redundant in the very role for which it is most required. Okay, admittedly it may be rational to ignore sources that have been labelled as unreliable, but that is just abdicating your critical thinking to others, i.e. those who determined the labelling. That is not what we sceptics are about.
Just to add to the above, I should have acknowledged that critical thinking still has a role in helping one decide the basis upon which to employ critical ignoring. For example, when Cook described himself as a misinformation expert, and described his work as gold standard, I had the option of automatically accepting his advice in preference to advice received from sources lacking such endorsement. But I didn’t. Instead, I applied critical thinking to determine that such self-endorsement is likely to be unreliable, and instead looked closely at his output to critically evaluate it for myself.
That’s why we infuriate people: we don’t automatically accept the label ‘expert’, particularly when self-applied. That’s also why we are seen as irrational. Not only are we sceptical of the label ‘expert’, we will examine material even when the experts have told us to ignore it. In fact, such advice can make us even more inclined to resort to our own evaluation. To many this is seen as Dunning-Kruger in action, except…
Only other thing I would add to John’s comment above. Seems (to me) that most sceptics are overall older people with long memories & have usually read widely over the years, so we are not inclined to give so called ‘experts’ the free pass some expect.
ps – “gold standard” & “expert” reminded me of my Aerospace engineering days. Back then I worked on composite/carbon fibre structures for aeroplanes & my subcontract company wanted to label me “composite/carbon fibre expert” in future bids. I refused to be called an expert, knowing my limitations & we settled on “many composite/carbon fibre projects completed successfully for major aero customers”.
Critical ignoring? Didn’t that used to be called ‘dismissing’?
Mike,
Didn’t that used to be called ‘dismissing’?
Yes, I think it did. And, as such, it’s something we all get up to. In fact, on the face of it, it makes complete sense. We are all subject to information overload and so we all need a basis upon which to decide what we should and should not be spending our time reading. The devil is not in the general concept so much as the proposed methods for implementing it. For example, if one were to critically ignore in accordance with Lewandowsky and Cook’s recommendations, one would end up uncritically accepting all the expert, peer-reviewed ‘psychological grounding’ for their ‘backfire effect’ and ‘truth sandwich’ nonsense, whilst simultaneously dismissing out of hand the unexpert, non-peer-reviewed debunking that can be found on sites such as Climate Scepticism. That’s very convenient for them, don’t you think? As Mark says, “The idea that what is good for society is a mass of people who shun critical thinking in favour of accepting an approved narrative from “safe and trusted” locations, and only from such locations, is quite appalling.”
Only somebody as thick as a brick sandwiched between two short planks could come up with the idea of a “truth sandwich”.
And if Ofcom or any of its small army of internet snitches is watching this site, then please stick that in your ‘hate speech’ pipe and smoke it.
So sick of this grindingly depressing, mind-numbingly inane and banally evil war of attrition on logic, reason and free-thinking.
Jaime,
I don’t think it is over the top to refer to all of this as being mind-numbingly inane. If anything, it is worse than I have made out here. It isn’t just the mere mention of the ‘myth’ that is supposed to backfire, it’s the mere mention of the facts. Yes, according to the psychologists, the more factual the debunking, the less effective it is. It’s something to do with the way sceptics think, apparently. God give me strength.