There’s a long and interesting article by Daniel Engber at Slate about Fake News and how to correct it; the Boomerang or Backfire Effect; echo chambers and so on, giving a useful historical review of research going back to the Second World War. He recounts in detail the efforts of psychologists to replicate the science, with a view to establishing whether we are really living in a post-fact age.
The title and subtitle give an idea of his general scepticism:
LOL Something Matters: We’ve been told that facts have lost their power, that debunking lies only makes them stronger, and that the internet divides us. Don’t believe any of it.
Engber exhibits a healthy scepticism throughout his well-researched article, for example when he says:
As I poked around these and other studies, I began to feel a sort of boomerang effect vis-à-vis my thinking about boomerangs: Somehow the published evidence was making me less convinced of the soundness of the theory. What if this field of research, like so many others in the social sciences, had been tilted toward producing false positive results?
For decades now, it’s been commonplace for scientists to run studies with insufficient sample sizes or to dig around in datasets with lots of different tools, hoping they might turn up a finding of statistical significance. It’s clear that this approach can gin up phantom signals from a bunch of noise. …A scientist who found an upside-down result might go on to make a novel and surprising claim, such as: If you tell people one thing, they’ll believe the opposite; or facts can make us dumber; or debunking doesn’t work. Since editors at top-tier scientific journals are often drawn to unexpected data, this mistake might then be published as a major finding in the field, with all the press reports and academic accolades that follow.
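As an aside, the mechanism Engber describes above is easy to illustrate. The following Python sketch is mine, not anything from his article: it simulates a researcher running many underpowered comparisons on pure noise, and the sample size, number of comparisons and significance threshold are arbitrary choices made purely for illustration.

```python
# A minimal sketch (not from Engber's article) of the mechanism he describes:
# run enough underpowered tests on pure noise and some of them will come out
# "statistically significant" by chance alone.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
n_per_group = 15        # "insufficient sample size"
n_comparisons = 200     # "lots of different tools" / many outcome measures
alpha = 0.05            # the conventional significance threshold

false_positives = 0
for _ in range(n_comparisons):
    # Both groups are drawn from the same distribution: there is no real effect.
    control = rng.normal(0, 1, n_per_group)
    treatment = rng.normal(0, 1, n_per_group)
    _, p = ttest_ind(control, treatment)
    if p < alpha:
        false_positives += 1

print(f"{false_positives} of {n_comparisons} comparisons were 'significant'")
# Roughly one comparison in twenty will be, even though every effect is noise.
```

Run it and about one comparison in twenty comes out "significant", which is all it takes to gin up a surprising, publishable claim from nothing.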
Naturally in an article on debunking, he cites Lewandowsky:
In November 2011, a pair of cognitive psychologists in Australia, Stephan Lewandowsky and John Cook, published an eight-page pamphlet they called “The Debunking Handbook,” on the “difficult and complex challenge” of correcting misinformation. They cited work from Schwarz and Skurnik, among others, in describing several ways in which debunkings can boomerang or backfire. Arriving when it did, in the middle of the post-fact panic, their handbook satisfied a pressing need. Chris Mooney, author of The Republican War on Science, called it “a treasure trove for defenders of reason.” The liberal website Daily Kos said it was a “must read and a must keep reference.” …
“The existence of backfire effects” has “emerged more and more over time,” Lewandowsky told Vox in 2014. “If you tell people one thing, they’ll believe the opposite. That finding seems to be pretty strong.”
Engber cites experiments showing pretty clearly that Lewandowsky’s statement above is wrong, and quotes with approval the revised formulation from Lewandowsky and his co-authors:
There have been other failed attempts to reproduce the Skurnik, Yoon, and Schwarz finding. For a study that came out last June, Briony Swire, Ullrich Ecker, and “Debunking Handbook” co-author Stephan Lewandowsky showed college undergrads several dozen statements of ambiguous veracity (e.g. “Humans can regrow the tips of fingers and toes after they have been amputated”). The students rated their beliefs in each assertion on a scale from 0 to 10, then found out which were facts and which were myths. Finally, the students had to rate their beliefs again, either after waiting 30 minutes or one week. If Skurnik, Yoon, and Schwarz were right, then the debunkings would cause their answers to rebound in the wrong direction: If you tell people one thing, they’ll believe the opposite. But the new study found no sign of this effect. The students’ belief in false statements dropped from a baseline score of 6 down to less than 2 after 30 minutes. While their belief crept back up a bit as time went by, the subjects always remained more skeptical of falsehoods than they’d been at the start. The labels never backfired.
A second study from Ecker and Lewandowsky (along with Joshua Hogan), also out last June, found that corrections to news stories were most effective when they repeated the original misinformation in the context of refuting it. This runs counter to the older theory, that mere exposure to a lie … makes it harder to unseat. The authors noted that the traditional logic of “effective myth debunking may thus need to be revised.”
In other words, at least one variation of the end-of-facts thesis—that debunking sometimes backfires—had lost its grounding in the data. “I’ve tried reasonably hard to find [this backfire effect] myself, and I haven’t been able to,” Ecker told me recently. Unless someone can provide some better evidence, it may be time to ask if this rule of thumb from social science could represent its own variety of rumor: a myth about how myths spread.
It’s nice to see Lewandowsky’s co-author admitting that they might be wrong about something, and even nicer to see a science journalist at Slate researching a subject and expressing scepticism, instead of simply quoting a press release. But there, I’ve exhausted my quota of niceness for today, and so let’s finish with some comments on one of the many examples of fake news which Engber quotes: climate change denialism.
Engber’s authority here is Al Gore’s An Inconvenient Truth, about which you can read:
On October 10, 2007, Justice Michael Burton … ruled that An Inconvenient Truth contained nine scientific errors and thus must be accompanied by an explanation of those errors before being shown to school children. The judge said that showing the film without the explanations of error would be a violation of education laws.
Maybe Engber didn’t read that bit.
Engber writes:
…in 2015, the idea [of the boomerang] had grown in scope … Papers had by then been published showing that the facts could boomerang when Republicans were told that Obamacare’s “death panels” didn’t exist or that climate change could lead to more disease.
The phrase “climate change could lead to more disease” links to an article by Hart and Nisbet: “Boomerang Effects in Science Communication: How Motivated Reasoning and Identity Cues Amplify Opinion Polarization About Climate Mitigation Policies.”
Here’s the relevant quote from the paper’s Methodology:
In the two stimulus conditions, participants read a simulated news story about climate change … designed to be “nonpolitical” … and focused on the potential health impacts of climate change, an increasingly salient and important aspect of climate change. The story discussed the potential for climate change to increase the likelihood that diseases such as West Nile virus will infect individuals who spend a lot of time working outdoors, like farmers. The news story was generated explicitly for the experiment but was based on facts reported by the Associated Press…The two experimental conditions varied by manipulating the identity of the potential victims and story exemplars into conditions of relative low and high social distance by altering the story… In the low social distance condition, the potential victims of climate change were described as being located in the general locality of where the experimental participants resided (upstate New York). In the high social distance condition the potential victims were located either in the state of Georgia or the country France…
and the abstract states triumphantly:
we found the influence of identification with potential victims was contingent on participants’ political partisanship. This partisanship increased the degree of political polarization on support for climate mitigation policies and resulted in a boomerang effect among Republican participants.
That’s right. Most Republicans are too stupid to believe that West Nile disease might hit upstate New York, a “fact” that the researchers invented.
But the most intriguing quote linking the debate to climate denial is this:
Even as new facts accumulate in the science of post-facts, the field will likely be slow to change its course. Norbert Schwarz, for one, has been a vocal critic of the replication movement in social psychology, comparing those who question old ideas to global warming denialists: “You can think of this as psychology’s version of the climate-change debate,” he told Nature in 2012, when doubts emerged about research into social priming. “The consensus of the vast majority of psychologists closely familiar with work in this area gets drowned out by claims of a few persistent priming skeptics.”
And Engber follows up with this:
Skeptics of the boomerang effect have also run afoul of consensus thinking in their field. Guess and Coppock sent their study to the same journal that published the original Lord, Ross, and Lepper paper in 1979, and it was rejected. Then it was passed over four more times. “We’ve reframed it over and over,” Coppock says. “It’s never rejected on the evidence—they don’t dispute the data…” As a result, their work remains in purgatory, as a posted manuscript that hasn’t made its way to print…
Wood and Porter’s study also faced a wall of opposition during the peer review process; after two rejections, it was finally accepted by a journal just last week.
I asked Coppock: Might there be echo chambers in academia, where scholars keep themselves away from new ideas about the echo chamber? And what if presenting evidence against the backfire effect itself produced a sort of backfire? …if his findings were correct, then wouldn’t all those peer reviewers have updated their beliefs in support of his conclusion? He paused for a moment. “In a way,” he said, “the best evidence against our paper is that it keeps getting rejected.”
Engber draws no conclusion about the resemblance to the climate debate.
Here’s the point. Engber seems an admirable science journalist, researching his subject, looking at both sides, expressing scepticism, and asking scientists awkward questions. Yet when it comes to claiming that reality is in danger from climate change denialism, he quotes as his authority Al Gore. When citing a paper which examines attitudes to the claim that climate change could lead to more disease, he fails to notice that the researchers made the claim up. And when a scientist claims that denial of replicability in the social sciences is akin to climate denialism, Engber provides evidence of non-consensus views in the social sciences being kept out of the peer reviewed literature, and then…
And then nothing.