Engber on Fake News, Debunking and Lewandowsky

 

There’s a long and interesting article by Daniel Engber at Slate about Fake News and how to correct it; the Boomerang or Backfire Effect; echo chambers and so on, giving a useful historical review of research going back to the Second World War. He recounts in detail the efforts of psychologists to replicate the science, with a view to establishing whether we are really living in a post-fact age.

The title and subtitle give an idea of his general scepticism:

LOL Something Matters: We’ve been told that facts have lost their power, that debunking lies only makes them stronger, and that the internet divides us. Don’t believe any of it.

Engber exhibits a healthy scepticism throughout his well-researched article, for example when he says:

As I poked around these and other studies, I began to feel a sort of boomerang effect vis-à-vis my thinking about boomerangs: Somehow the published evidence was making me less convinced of the soundness of the theory. What if this field of research, like so many others in the social sciences, had been tilted toward producing false positive results?

For decades now, it’s been commonplace for scientists to run studies with insufficient sample sizes or to dig around in datasets with lots of different tools, hoping they might turn up a finding of statistical significance. It’s clear that this approach can gin up phantom signals from a bunch of noise. …A scientist who found an upside-down result might go on to make a novel and surprising claim, such as: If you tell people one thing, they’ll believe the opposite; or facts can make us dumber; or debunking doesn’t work. Since editors at top-tier scientific journals are often drawn to unexpected data, this mistake might then be published as a major finding in the field, with all the press reports and academic accolades that follow.
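
(Engber’s point about phantom signals is easy to check for oneself. Here is a minimal simulation — my own sketch in Python, not anything from his article — that runs a thousand small “studies” comparing two groups drawn from the same distribution, so that any “effect” it finds is pure noise:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies, n_per_group = 1000, 15  # many small, underpowered "studies"

significant = 0  # null studies that reach p < 0.05 anyway
upside_down = 0  # "significant" effects pointing the wrong way

for _ in range(n_studies):
    # Both groups come from the SAME distribution: there is no effect to find.
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(0.0, 1.0, n_per_group)
    t, p = stats.ttest_ind(treated, control)
    if p < 0.05:
        significant += 1
        if t < 0:
            upside_down += 1

print(f"{significant / n_studies:.1%} of pure-noise studies came out 'significant'")
print(f"{upside_down} of those {significant} pointed in the opposite direction")
```

By chance alone, about five per cent of these null studies cross the conventional p < 0.05 threshold, and since there is no real effect, roughly half of those point in the “wrong” direction — exactly the kind of upside-down result that, once published, could become a finding like “debunking doesn’t work”.)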

Naturally in an article on debunking, he cites Lewandowsky:

In November 2011, a pair of cognitive psychologists in Australia, Stephan Lewandowsky and John Cook, published an eight-page pamphlet they called “The Debunking Handbook,” on the “difficult and complex challenge” of correcting misinformation. They cited work from Schwarz and Skurnik, among others, in describing several ways in which debunkings can boomerang or backfire. Arriving when it did, in the middle of the post-fact panic, their handbook satisfied a pressing need. Chris Mooney, author of The Republican War on Science, called it “a treasure trove for defenders of reason.” The liberal website Daily Kos said it was a “must read and a must keep reference.” …

The existence of backfire effects has “emerged more and more over time,” Lewandowsky told Vox in 2014. “If you tell people one thing, they’ll believe the opposite. That finding seems to be pretty strong.”

Engber cites experiments that show pretty clearly that Lewandowsky’s statement above is wrong, and quotes with approval the revised conclusions of Lewandowsky’s own team:

There have been other failed attempts to reproduce the Skurnik, Yoon, and Schwarz finding. For a study that came out last June, Briony Swire, Ullrich Ecker, and “Debunking Handbook” co-author Stephan Lewandowsky showed college undergrads several dozen statements of ambiguous veracity (e.g. “Humans can regrow the tips of fingers and toes after they have been amputated”). The students rated their beliefs in each assertion on a scale from 0 to 10, then found out which were facts and which were myths. Finally, the students had to rate their beliefs again, either after waiting 30 minutes or one week. If Skurnik, Yoon, and Schwarz were right, then the debunkings would cause their answers to rebound in the wrong direction: If you tell people one thing, they’ll believe the opposite. But the new study found no sign of this effect. The students’ belief in false statements dropped from a baseline score of 6 down to less than 2 after 30 minutes. While their belief crept back up a bit as time went by, the subjects always remained more skeptical of falsehoods than they’d been at the start. The labels never backfired.

A second study from Ecker and Lewandowsky (along with Joshua Hogan), also out last June, found that corrections to news stories were most effective when they repeated the original misinformation in the context of refuting it. This runs counter to the older theory, that mere exposure to a lie … makes it harder to unseat. The authors noted that the traditional logic of “effective myth debunking may thus need to be revised.”

In other words, at least one variation of the end-of-facts thesis—that debunking sometimes backfires—had lost its grounding in the data. “I’ve tried reasonably hard to find [this backfire effect] myself, and I haven’t been able to,” Ecker told me recently. Unless someone can provide some better evidence, it may be time to ask if this rule of thumb from social science could represent its own variety of rumor: a myth about how myths spread.

It’s nice to see Lewandowsky’s co-author admitting that they might be wrong about something, and even nicer to see a science journalist at Slate researching a subject and expressing scepticism, instead of simply quoting a press release. But there, I’ve exhausted my quota of niceness for today, and so let’s finish with some comments on one of the many examples of fake news which Engber quotes: climate change denialism.

In the sentence: “In those days of phantom Iraqi nukes, anti-vaxxer propaganda, and climate change denialism, reality itself appeared to be in danger”, the link to “climate change” goes to:

https://en.wikipedia.org/wiki/An_Inconvenient_Truth

where you can read:

On October 10, 2007, Justice Michael Burton … ruled that An Inconvenient Truth contained nine scientific errors and thus must be accompanied by an explanation of those errors before being shown to school children. The judge said that showing the film without the explanations of error would be a violation of education laws.

Maybe Engber didn’t read that bit.

His statement:

…in 2015, the idea [of the boomerang] had grown in scope … Papers had by then been published showing that the facts could boomerang when Republicans were told that Obamacare’s “death panels” didn’t exist or that climate change could lead to more disease.

links to an article by Hart and Nisbet: “Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization About Climate Mitigation Policies.”

Here’s the relevant quote from the paper’s Methodology:

In the two stimulus conditions, participants read a simulated news story about climate change … designed to be “nonpolitical” … and focused on the potential health impacts of climate change, an increasingly salient and important aspect of climate change. The story discussed the potential for climate change to increase the likelihood that diseases such as West Nile virus will infect individuals who spend a lot of time working outdoors, like farmers. The news story was generated explicitly for the experiment but was based on facts reported by the Associated Press… The two experimental conditions varied by manipulating the identity of the potential victims and story exemplars into conditions of relative low and high social distance by altering the story… In the low social distance condition, the potential victims of climate change were described as being located in the general locality of where the experimental participants resided (upstate New York). In the high social distance condition the potential victims were located either in the state of Georgia or the country France…

and the abstract states triumphantly:

we found the influence of identification with potential victims was contingent on participants’ political partisanship. This partisanship increased the degree of political polarization on support for climate mitigation policies and resulted in a boomerang effect among Republican participants.

That’s right. Most Republicans are too stupid to believe that West Nile disease might hit upstate New York, a “fact” that the researchers invented.

But the most intriguing quote linking the debate to climate denial is this:

Even as new facts accumulate in the science of post-facts, the field will likely be slow to change its course. Norbert Schwarz, for one, has been a vocal critic of the replication movement in social psychology, comparing those who question old ideas to global warming denialists: “You can think of this as psychology’s version of the climate-change debate,” he told Nature in 2012, when doubts emerged about research into social priming. “The consensus of the vast majority of psychologists closely familiar with work in this area gets drowned out by claims of a few persistent priming skeptics.”

And Engber follows up with this:

Skeptics of the boomerang effect have also run afoul of consensus thinking in their field. Guess and Coppock sent their study to the same journal that published the original Lord, Ross, and Lepper paper in 1979, and it was rejected. Then it was passed over four more times. “We’ve reframed it over and over,” Coppock says. “It’s never rejected on the evidence—they don’t dispute the data…” As a result, their work remains in purgatory, as a posted manuscript that hasn’t made its way to print…

Wood and Porter’s study also faced a wall of opposition during the peer review process; after two rejections, it was finally accepted by a journal just last week.

I asked Coppock: Might there be echo chambers in academia, where scholars keep themselves away from new ideas about the echo chamber? And what if presenting evidence against the backfire effect itself produced a sort of backfire? …if his findings were correct, then wouldn’t all those peer reviewers have updated their beliefs in support of his conclusion? He paused for a moment. “In a way,” he said, “the best evidence against our paper is that it keeps getting rejected.”

Engber draws no conclusion about the resemblance to the climate debate.

Here’s the point. Engber seems an admirable science journalist, researching his subject, looking at both sides, expressing scepticism, and asking scientists awkward questions. Yet when it comes to claiming that reality is in danger from climate change denialism, he quotes as his authority Al Gore. When citing a paper which examines attitudes to the claim that climate change could lead to more disease, he fails to notice that the researchers made the claim up. And when a scientist claims that denial of replicability in the social sciences is akin to climate denialism, Engber provides evidence of non-consensus views in the social sciences being kept out of the peer-reviewed literature, and then…

And then nothing.

12 thoughts on “Engber on Fake News, Debunking and Lewandowsky”

  1. Lew loves to quote the backfire effect
    ‘Look see if the skeptics read our Climate report they get the backfire effect and their own naughty beliefs are reinforced’
    Doh, if Lew and co put out 10 Climate reports full of flaws, then of course we believe that their case is weaker than we originally thought.


  2. This is incredibly useful Geoff. Thank you for putting such work in, with necessary limitation of niceness.

    Are we facing here the demonising that transcends even sincere scepticism? Pretty disturbing.


  3. It is a very interesting report. I doubt there’s a predictable social response in aggregate but some individuals “dig in” if challenged on something. They might actually agree with the new claim but to publicly accept it is seen by some (narcissists, I think) as a weakness.


  4. Engber raises the question of the reproducibility and therefore reliability of studies in cognitive psychology, pointing out that studies are often poorly designed, with small samples and ropey analysis. At one point he refers to the avalanche of “facts don’t matter” articles, linking to, among others, this article by Elizabeth Kolbert at the New Yorker:
    https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds which begins:

    In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones. Some students discovered that they had a genius for the task…Others discovered that they were hopeless…As is often the case with psychological studies, the whole setup was a put-on. … the scores were fictitious…
    In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.
    “Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

    On the basis of this, and similar evidence from cognitive psychology, the author concludes that we are irrational.

    Note the nature of the evidence: an experiment in which the experimenters lie to the subjects about a serious subject (suicide), then reveal that they’ve been lying, then lie again, then deduce the “fact” that their subjects have been behaving irrationally.

    Maybe so. Maybe the only rational response from a subject would be to suppress the desire to punch the psychologists in the face and walk out. That probably counts as a “don’t know.”

    Psychology used to be an exploration of the complexity of the human psyche. Then along came the cognitionists who said: “Forget about complexity. Take forty first-year students, ask them some tricky questions, do some stats, and you’ll see what a bunch of silly billies we all are.”

    After some more of the same kind of “just fancy that” science, Kolbert asks “How did we come to be this way?” and answers:

    …cognitive scientists Mercier and Sperber take a stab at answering this question. [They] point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context. Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate…

    While I have every respect for my ancestors on the African savannas, and am grateful for the bipedalism and all the rest, this sounds to me like: “We cognitive psychologists don’t have a clue. Let’s do sociology instead. Or evolutionary biology.”

    This is retro-reductionism gone mad, or scientific pass the parcel. Back in the days when psychologists were first trained as medical doctors, they knew some Latin and Greek. They’d heard of Aristotelean logic and the Socratic dialogues. Many of the best were German, and knew a bit about Schopenhauer and his respect for Eastern philosophical thought. In other words, they were educated members of the human race with some knowledge of what mankind had achieved since he left the African savannah. They weren’t thick as a brick in an ivory tower. And they didn’t lie to their students.


  5. Geoff. You quote from Lewandowsky:
    “The existence of backfire effects has emerged more and more over time. If you tell people one thing, they’ll believe the opposite. That finding seems to be pretty strong.”
    and then proceed, using Engber, to debunk it.
    I’m not convinced that you and Engber are always right about this “backfire effect”. Look at what has happened here, and at other blogs, with respect to the Harvey et al., 2017 paper. A backfire effect if ever I have seen one.
    One of the important qualifications that needs to be added to the Lewandowsky statement concerns the quality of the “thing” you tell people. A poorly argued “thing” might well produce backfire, whereas a well argued “thing”, especially in the absence of alternatives, might not.

    I think Lewandowsky is probably also right in believing that the backfire effect has increased over time. People are becoming increasingly sceptical and less willing to believe in experts.


  6. Ego encourages us to believe what we want to believe. But not believing what is false must be a survival trait for which evolution will select. No wonder we have disagreements.
    Maybe (in the interest of not increasing opposition to what he sees as an important truth) Lewandowsky will stop emphasising the catastrophe of global warming, so as not to add to the confidence of opponents?


  7. Osseo. Unfortunately, if Lew reads your magnificent corollary (something we can only dream about) in preparation for his next opus, it will cause a simply massive backfire and he will be encouraged to continue as before. It’s the perfect Catch-22.


  8. Alan Kendall
    The debunking is not by me, but by Lewandowsky’s co-author Ecker, based on research by him and Lew. The backfire effect is not just a matter of suspicious-minded sceptics not believing stuff. Cognitive psychologists are, in theory, not supposed to be researching individual beliefs like belief in global warming, but believing in general, along with knowing, remembering, doubting, and all the other mental faculties.

    What I see in the Engber article is a good journalist politely but firmly expressing his scepticism about the whole field. I’m tempted to go much further, and dismiss cognitive psychology as a pseudo-science. I’ve read too many papers based on an opinion poll conducted among a few dozen of the researcher’s students, yet I never see a researcher discuss what they mean by an opinion. How does it differ from a belief, or a feeling, or a conviction? Cognitive psychology is not interested in this kind of nitpicking detail. What they want is some simple data they can do a significance test on. It has very little to do with the human mind, as far as I can see. How can those Stanford researchers be so thick as not to see that if you make people jump through mental hoops, then tell them you’ve been lying to them, it’s going to falsify any results? People are not just box-ticking machines. They assess, believe, doubt, often using unconscious processes that the cognitionists don’t dream of. Cognitive psychology is based on a series of narrow assumptions about the way we are. Anyone who’s ever had a dream or read a novel or fallen in love knows it’s bollocks.

    I suspect that the cognitionists are (unconsciously) aware of this, which is why they push everything back to the African savannah, out of sight, out of mind.


  9. Geoff thank you for your wide-ranging response, most of which I have no dispute with. I merely wished to point out that Lew sometimes (by mistake?) says something vaguely reasonable. I must admit that cognitive psychology comes close to the theory of science in my list of least favourite subjects to read about. Lisa’s proliferation of new terms and adoption of common words in highly specialist settings cause much irritation. When I struggle to gain meaning from a short Lew paragraph containing a dozen specialist/technical words, I sometimes wonder if some of my own geological writings were similarly abstruse – although I once experimented with writing a technical paper entirely in basic English – quite difficult.


  10. Geoff:

    Maybe the only rational response from a subject would be to suppress the desire to punch the psychologists in the face and walk out. That probably counts as a “don’t know.”

    Just wanted to repeat that.


  11. AK, what you’ve observed isn’t the backfire effect, it’s the crying wolf effect or maybe the sheep in wolf’s clothing effect. Scientists have been gifted a lot of trust and respect simply because they’re clever. While individual scientists may behave more honestly than the average person, part of it is because they have more to lose if they’re caught. It doesn’t mean they’re all honest. Smart people lie better.

    The public is more educated than it was, and many people have seen a scientific claim (and other expert claims too) turned into an oops moment over the years. Science used to be very slow getting to the public ear, but now the ink is barely dry and errors are proliferating. As are journals, which leads to a need for weaker and weaker papers. Too many studies either have pitifully small sample sizes or are little more than opinion pieces. Science has been exaggerating its skills for some time now. Climate science is guilty of all these sins. Less intelligent the public may be, but eventually they notice. What do people normally do when they realise that they’ve been lied to or observe serial errors? They trust less (or punch people in the face LOL). Sometimes they assume the opposite, but often they just run every subsequent pronouncement through their own filters and think ‘screw you, lying experts’.

    Dr Lew and many like him studying the human psyche are essentially hampered by a lack of empathy for people very different from them. He can’t devise questions to get truly representative opinions from people like us because we don’t fit his model of how to think. Because his field tries to tease out answers from our deepest, darkest motives, they miss the bleeding obvious that we’d tell them if they just asked. Like Freud’s theories, Lew’s questions say more about him than they do about us. Wouldn’t many of us come across as warmists on a Lew-inspired questionnaire? Could he devise questions to understand all the things we have an issue with over the climate bandwagon? Could he even tell wind-up answers, given because we were annoyed one day, from reasoned answers given on a day when we were feeling helpful?

    Those that scratch their heads about climate scepticism need to stop treating us as movie villains with no back story and start talking to us. Stop with the desk studies and get out into the field. If we were a tribe in some remote forest, Dr Lew’s approach would be laughed at. ‘So you’ve never actually seen any of this tribe you write about? You’ve never talked to them, and your work is based purely on sending them surveys and the opinions of another tribe that competes for the same resources?’

    Of course he doesn’t want to understand, he wants to discredit us and pocket a lot of money to achieve nowt.


  12. Lew and gang know exactly what they are doing, and why.
    And as has been documented both in the skeptic community and by more than a few of their peers, it has little to do with science.

