Lew on Truth, Conspiracy Theorising, and Everything

Bristol University’s news site is announcing the publication of a “book” (their description) by Professors Lewandowsky and Cook on conspiracy theorising. The link provided by Bristol University to download the “book” isn’t working, but Mirage News, which reproduces the University puff in its entirety, has a link here:

Author Professor Stephan Lewandowsky explains: “Conspiracy theories attempt to explain events as the secretive plots of powerful people. While conspiracy theories are not typically supported by evidence, this doesn’t stop them from blossoming… While actual conspiracies do exist they are rarely discovered through the methods of conspiracy theorists. Rather, real conspiracies get discovered through conventional thinking—healthy skepticism of official accounts while carefully considering available evidence and being committed to internal consistency. This handbook helps explain why conspiracy theories are so popular, and how to identify the traits of conspiratorial thinking, and lists effective debunking strategies.”

Got that? A real conspiracy is one that is discovered through healthy skepticism, while a false one is one that is discovered by, er, the methods of conspiracy theorists. Unhealthy scepticism, probably. With a “c.” Like what we do.

The “book,” shorn of covers and title page, is nine pages long. In this very short guide to debunking conspiracy theories, there is very little on real conspiracy theories and no fewer than seven statements concerning climate change scepticism:

1. Exposure to conspiracy theories decreases people’s intentions to engage in politics or to reduce their carbon footprint.

2. Conspiracy theories may be deployed as a rhetorical tool to escape inconvenient conclusions. The rhetoric of climate denial is filled with incoherence…

3. Incoherence is one attribute of conspiratorial thinking, but it does not follow that climate denial is irrational—on the contrary, denialist rhetoric is an effective political strategy to delay climate action by undermining people’s perception of the strength of scientific evidence. In confirmation, people selectively appeal to a conspiracy among scientists to explain away a scientific consensus when their political ideology compels them to do so—but not when the scientific consensus is of no relevance to their politics.

4. Rejecting the scientific consensus that humans are causing global warming is often the result of conspiratorial thinking rather than a careful weighing of scientific evidence.

5. When climate deniers are presented with information about climate change, their most common response is conspiratorial in nature.

6. However, climate denial isn’t just associated with climate-themed conspiracy theories—rather, people who deny climate science are more likely to endorse conspiracy theories in other topics as well.

7. An ounce of prevention is worth a pound of cure… For example, sharing of conspiratorial climate-denial posts on Facebook was reduced by a simple intervention that encouraged people to ask four questions about material before sharing it.

Points 2, 3, 4, and 6 are supported by references to papers by Lewandowsky et al. Forget 9/11 and the Kennedy and Martin Luther King assassinations. Conspiracy theorising is a niche cottage industry for Lewandowsky and Cook, intimately linked to climate denial. “Persecuted Victim,” “Nefarious Intent” – all their carefully assembled characteristics of conspiratorial thinking are there, repeated word for word from their “Moon Hoax” and “Alice in Wonderland” papers. Forget the Skripals, the DNC emails / Crowdstrike / FBI story and the Cambridge professor on a million-dollar CIA retainer who just couldn’t help bumping into people who happened to be working for the Trump election campaign. It’s all about climate denial.

Occasionally a new fact is allowed a look in. Under the heading “Immune to Evidence” we get this:

Conspiracy theories are inherently self-sealing—evidence that counters a theory is re-interpreted as originating from the conspiracy. [supported by references to papers from 1999, 2007 and 2009] This reflects the belief that the stronger the evidence against a conspiracy (e.g., the FBI exonerating a politician from allegations of misusing a personal email server), the more the conspirators must want people to believe their version of events (e.g., the FBI was part of the conspiracy to protect that politician).

Here’s where Lewandowsky’s precept that “real conspiracies get discovered through conventional thinking—healthy skepticism of official accounts while carefully considering available evidence” comes in handy. The FBI didn’t exonerate Hillary; they simply decided not to prosecute, possibly because of a certain bias in her favour, highlighted by a senior FBI official asserting that “We” would do everything to prevent Trump from being elected. You’d have to be “immune to evidence” to believe that the FBI was not involved in a conspiracy.

And of course Lewandowsky and Cook are immune to evidence, on this and on many other subjects. For example, they seem to be immune to the fact that the world is not agog to hear their recipe for countering the plague of conspiracy theorising. Lewandowsky had his chance in chapter ten of “Conspiracy Theories and the People Who Believe Them” to present the truth about conspiracy theorising, instead of which he recounted the conspiracy (by Steve McIntyre, Paul Matthews, Barry Woods, me, and others) to get his “Recursive Fury” paper retracted.

There’s one novelty in this little pamphlet (which Bristol University describes as a “book”), and that’s point 3 mentioned above:

Incoherence is one attribute of conspiratorial thinking, but it does not follow that climate denial is irrational—on the contrary, denialist rhetoric is an effective political strategy to delay climate action by undermining people’s perception of the strength of scientific evidence.

And the reference here is to:

Lewandowsky, S. (2020). “Hannah Arendt and the contemporary social construction of conspiracy theorists.” Manuscript Submitted for Publication.

After four papers demonstrating that climate denialists are irrational – two degrees short of a secular trend – Lew has come up with a paper demonstrating the opposite: that they may be motivated by the perfectly rational observation that their “rhetoric” is an “effective political strategy.” The paper is, in its methodology and structure, a carbon copy of “Moon Hoax,” which demonstrated that climate denial was caused by irrational psychological defects such as feelings of persecution and an inability to think straight.

The abstract starts by presenting a grandiose programme for retooling society (or shaping tomorrow’s world):

Exposure to conspiracy theories can have considerable adverse impact on society. I argue that scholars therefore have a responsibility to combat conspiracy theories and misinformation generally. Exercising this responsibility requires an understanding of the varied rhetorical roles of conspiracy theories. Here I focus on instances in which people reject unequivocal scientific evidence and invoke conspiracy theories, or radical anti-institutional positions, based on ideological imperatives. I argue that those positions do not always reflect true attitudes. Instead, people may deploy extreme rhetoric as a pragmatic tool of political expression. I investigate this possibility by focusing on the role of conspiracy theories in the rejection of science. Conspiracist cognition and rhetoric violate the epistemic standards that underpin science. Ironically, this violation of epistemic standards renders conspiracy theories useful as a rationally deployed tool that serves political purposes. I present a study that confirms that conspiracy theories can be deployed to support worldview-motivated denial of science. I provide suggestions how scholars can debunk or defang conspiratorial rhetoric.

This is followed by one quote from Hannah Arendt and two from Donald Trump. Hannah gets a mention in the title of the paper and Donald doesn’t, so you know who Lew’s rooting for – the philosopher who analysed Nazism and anti-semitism, and not the elected president of the United States. The lines of battle have been drawn up. By Lew. In a scientific paper. It’s Us (anti-Nazis) versus Him.

[We all have our favourite Trump quotes. Mine is “If I’d listened to John Bolton, we’d be on World War Six by Now.” What a pity Obama didn’t make a similar observation about Hillary.]

The early pages of the paper (from page 3) are about Lewandowsky’s political journey from reasonable leftist scepticism about the justifications for the Iraq war to a far-right wholesale acceptance of the CIA version of the downing of Malaysian flight MH17. Pages 6-11 discuss postmodernism and post-truth, culminating in this observation:

The idea that the very notion of evidence and truth itself may be compromised by shock and chaos is supported by public opinion data, such as a Pew poll (July 2017) that showed that a majority of Republicans, by a 58% to 36% margin, considered colleges and universities to have a negative effect on the way things are going in the U.S. Among Democrats, opinion was split in reverse, with a 72% (positive) to 19% (negative) margin.

How does this banal finding that Democrats and Republicans disagree about the value of a university education “support” Lew’s opinion that “the very notion of evidence and truth itself may be compromised”? Easy. Republicans are wrong to think that universities have a negative effect. Lew knows that they’re wrong – he works at one, for Gaia’s sake. Half the people think that folks like Lew are having a negative effect, therefore truth itself may be compromised by shock and chaos. It follows as night follows day, because the only people qualified to examine the question are people like Lew, who work at universities and therefore have the monopoly of opinion in “the literature” as to what effect universities have, etc.

And here’s how he does it (page 13). By interviewing 195 people on-line for ten minutes (on average).

Bollocks. In the far off days when I did market research, we might have considered 200 a minimum sample size to determine whether house persons preferred soap A to soap B – just. But they were real human beings, interviewed in the street on their way to the shops to buy soap A or soap B. And their responses weren’t expected to reveal the truth about Life, Truth, Shock, Chaos, and Everything.

Lew’s professional respondents, on the other hand, were recruited by Amazon for a measly $1.10 to answer questions on-line (in ten minutes) on:

…basic demographics (age and gender) then, on a seven-point response scale ranging from “Strongly disagree” to “Strongly agree”, with the midpoint “Neither agree nor disagree”:

(a) Political attitudes measured by a subset of 5 items from a scale developed by Thomas Scotto and Jason Reifler for their ESRC project “Public Opinion and the Syrian Crisis in Three Democracies”

(b) The presumed knowability of truth measured by presenting 3 quotes from public figures who questioned that truth or that facts could be unequivocally ascertained. The first two statements were made by Katie Hopkins, a columnist for UK tabloids and the third statement was made by Donald Trump’s attorney, Rudy Giuliani. Participants indicated their agreement or disagreement with each statement. A further two items queried the presumed knowability of truth directly.

(c) Conspiracism was measured using 5 items taken from Imhoff and Bruder (2014). These items do not target belief in specific conspiracies but probe a broader, likely dispositional, tendency to engage in conspiracist cognition.

(d) Reliance on sources of knowledge was measured by 4 items developed by my team.

(e) Need for chaos (“Need”) was measured using 4 items from the scale developed by Petersen et al. (2018).

(f) Reliance on intuition as a source of knowledge was measured using 5 items developed by my team.

The questionnaire additionally examined two [surely three?] aspects of scientific consensus. People first indicated their perceived scientific consensus (using a percentage scale) for the link between HIV and AIDS, the link between CO2 and climate change, and the safety and efficacy of vaccinations. At the end of the questionnaire participants were presented with accurate information about the scientific consensus (e.g., “Virtually all medical scientists agree that HIV causes AIDS”), followed by the question “How much do you think each of the following reasons contributes to this scientific agreement?” The question was accompanied by the 6 response options in Table 2. Options were presented together on the same screen and participants could choose any number of options on a 5-point scale ranging from “Not a reason” to “The only reason”.

After all items in Table 1 had been presented, participants were again asked to indicate their age, followed by a question probing how much attention they paid. Any participants who indicated that they were not “paying much attention” or did not want their data to be used for other reasons would have been eliminated (none did).

Of course they didn’t, you berk. Otherwise they wouldn’t have got their $1.10.

One dollar and ten fucking cents. That’s what Professor Lew paid his informants to reveal the truth about the nature of conspiracy theorising, their need for chaos, the presumed knowability of truth, and hence the scientific path towards eliminating misinformation in public discourse, all in ten minutes. The rest of the paper (pp 15–24) is devoted to demonstrating, on the basis of significant-at-the-.05-level correlations, that if you ask 195 normal people a series of stupid questions about need for chaos, scientific agreement, a dispositional tendency to engage in conspiracist cognition, and whether they agree with Katie Hopkins and Rudy Giuliani, all on-line in an average of ten minutes, you can get any results you want, with correlation coefficients of 0.2-0.3 or so.
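Just to spell out what those numbers mean, here’s a minimal sketch (my own illustration, not Lew’s code or data) of how little it takes for a correlation to count as “significant at the .05 level” with 195 respondents, and how little of the variance such a correlation actually explains:

```python
# Illustrative only (not from the paper): significance threshold and variance
# explained for Pearson correlations with a sample of n = 195.
from math import sqrt
from scipy.stats import t as t_dist

n = 195
for r in (0.14, 0.20, 0.30):
    t_stat = r * sqrt(n - 2) / sqrt(1 - r**2)   # t statistic for Pearson's r
    p = 2 * t_dist.sf(t_stat, df=n - 2)         # two-tailed p-value
    print(f"r = {r:.2f}: p = {p:.3f}, variance explained = {r**2:.1%}")
```

Anything much above r = 0.14 clears the .05 bar; r = 0.2 comes out at roughly p = 0.005 while accounting for a princely 4% of the variance.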

This shitty pre-published article can be stopped. The supplemental materials seem to be here

including data and R code

coda:

Lew has contributed to a comment on a paper by our old friend Neil Levy, “Is Conspiracy Theorising Irrational?”, here. How does he find time to teach cognitive psychology?

32 Comments

  1. [Read Richard’s first. This one is boring.]

    I would never call a fellow Cliscepr’s work boring (following Flickr, Tumblr and Grindr there, without taking a full part in the latter) but I do think I have a length advantage. Plus two scapegoats if it goes horribly wrong.

    There again, Dr Lew really is boring. It’s astonishing and inspiring that you’ve shed so much light on something so rotten, again and again. I’ll read this one later.


  2. Lewandowsky and Cook are near Shakespearean in their tragic mercenary obsession to kowtow to the powerful and corrupt and to betray their callings.
    Think of truth and integrity as Hamlet: moody, hard to read but dedicated. Think of the climate consensus as Claudius, a murderous poser dressed up as King. And who finally sees that truth – Hamlet – has to be dispatched. So Claudius of course calls on the two poser traitors to do the deed and kill off that pesky truth. So he calls in Lewandowsky and Cook…no wait:
    Rosencrantz and Guildenstern.


  3. “5. When climate deniers are presented with information about climate change, their most common response is conspiratorial in nature.”

    Has anyone ever asked conspiracy expert Lewandowsky for his opinion on Mann’s Hockey Stick, and the science he relies on to support his opinion?

    He might run and hide, shouting rude words from a safe distance, without ever giving any answer at all. It seems to be the standard response that Hockey Teamsters have agreed, no matter what the evidence they have to deny.


  4. Very good Jaime. But I’d say climate scepticism is a gut feel based on a working bullshit-meter. The field being so complex – both the science of climate and the science/engineering/economics of energy provision – leads to many fascinating intellectual questions. To which vocal climate sceptics sometimes do promote wrong answers. The many quiet climate sceptics – the vast majority of the tribe – are united with us in the reading their bullshit-meter gives but have no interest in the intellectual discussion.

    Meanwhile I must read Geoff’s post.


  5. I’d like to, if I may, channel Brad Keyes for one moment and remind Lew and crew of some basic English:

    People who conspire are being conspiratorial, so it is they who engage in conspiratorial thinking. People who suspect such conspiracy are conspiracists. So suggesting that climate change ‘deniers’ are engaging in conspiratorial thinking by suspecting a conspiracy is simple illiteracy. The best one could say is that they are guilty of conspiracist thinking.

    I find it difficult to treat seriously someone who fumbles so easily over the English language, let alone someone who is so patently talking out of his arse. Please find below alternative research that does, at least, achieve the required level of literacy:

    https://www.psychologicalscience.org/news/releases/coincidence-or-conspiracy-studies-investigate-conspiracist-thinking.html


  6. John: Yeah, conspiratorial thinking did for me too. The smear that dare not speak its name. Or, more likely, is incapable of doing so. (I did read it Geoff. I am a delayer, I’m a conscious delayer, indeed I’m a merchant of doubt. Just not a very effective one. Can I go now?)


  7. John Ridgeway I think it most unkind of you to criticize a german-strine speaker for misappropriation of English words – it is to be expected. Who’d have thunk you would need to be so conspiratorial (or is it conspricystorial?) in getting your conspiricist views made known.


  8. Alan, Richard

    I was well known for my pedantry at work and I make no apologies for it now. In this case, I could be charitable towards Lew because it is actually a very easy mistake to make (I have done it myself in the past). However, it is not so forgivable when you are making a living out of studying something you can’t even correctly label.

    Speaking of which, Alan, was misspelling my name some sort of subtle message or am I over-thinking this? 🙂


  9. Actually, being serious for a moment, there is something deeper troubling me with regard to Lew’s faux pas. If everyone else in psychology is getting this right but Lew isn’t, what should that tell me about Lew’s reading of the literature on the subject? It seems to suggest he hasn’t done any, otherwise he would surely have recognised his mistake and corrected it before now. You don’t suppose he is working in a self-absorbed silo do you?


  10. John All misspellings are the responsibility of my evil spell-checker. Think not over-much, young sirrah. 😷


  11. John, the thing it reminds me of is the Nigerian Letter or “419” Fraud. They say that such emails often have obvious mistakes in, of spelling or grammar. This selects for those most likely to fall for the scam. I’m not offering proof in Lew’s case but the ‘deeper troubling’ is for me on target. Something stinks to high heaven under the Royal Society’s figleaf of respectability.


  12. Alan,

    No worries friend. I must admit that I still get very offended when the spell-checker insists that I am spelling my name incorrectly. In fact, both spellings are common and my own is (I have been told) due to a mistake made on a birth certificate two generations back. It’s all to do with the English idiom of dropping the ‘e’ after ‘dg’, as in Wedgwood. However, on that point I think it best to avoid both judgement and judgment.

    Richard,

    I think your Nigerian Letter example is spot on.


    I’m afraid it’s no good appealing to English usage on the meaning of conspiratorial. What we are dealing with here is the scholarly meaning of conspiratorial thinking as defined by the scholars who study the thing they call conspiratorial thinking which, being proper scholars and all, is a specialised term only understood by them and the wider conspiratorial thinking scholarly community. Which community appears to comprise Lew, Cook and the bum study chappy.


  14. GEOFF CRUICKSHANK

    …Which community appears to comprise Lew, Cook and the bum study chappy.

    The CRAASH and CREST and COMPACT programmes I looked at here
    https://cliscep.com/2019/03/08/the-great-climate-conspiracy-theory-conspiracy-theory/
    involve hundreds of researchers, or at least did when Leverhulme and the British and European governments were coughing up the money.

    …and the bum-study chappy gets a look-in in the Lew-Cookbook, being cited in support of the assertion that, for conspiracists, nothing happens by accident. Yup. Some of us human beings tend to look for causes for things that happen, whatever David Hume says.

    There’s also Wood and Douglas, who are cited seven times. They were the ones who found that nobody who believed that Lady Di was dead also believed that she was alive, which was written up in their paper demonstrating that some people (conspiracists) believe contradictory things. Lew expanded that in his “Alice in Wonderland” paper to invent the principle of teleological irrational thinking: if Professor Plimer says something in Australia and Anthony Watts says something different in California, then they are behaving irrationally.

    Incidentally, Professor Sirven’s bums (he’s a full professor now) were not real. At least, the medium-sized bums were real, while the big and small ones were photoshopped. Respondents (80% of them female) preferred the real ones to the photoshopped, which contradicts previous research. Replication in the social sciences is hell. Now I must get back to my research.


  15. Geoff Cruickshank,

    Yes, of course scholars have a long record of taking already perfectly well-defined words and misappropriating them for their own jargon, just to confuse us poor plebs. This, however, is not an example of that. An illiterate scholar is an illiterate scholar.

    Geoff Chambers,

    “…being cited in support of the assertion that, for conspiracists, nothing happens by accident.”

    Except that the study I linked to demonstrates that this is just one of those myths that conspiracy theory theorists have taken for granted. One might say that it was a conspiracy that was found wanting just as soon as someone could be bothered to test it. Replication in the social sciences is hell, but only when scientific standards are attempted.


  16. JOHN RIDGWAY
    From the article you mention (available free here)
    https://www.researchgate.net/publication/282051958_Nothing_Happens_by_Accident_or_Does_It_A_Low_Prior_for_Randomness_Does_Not_Explain_Belief_in_Conspiracy_Theories

    All measures of belief in conspiracy theories were correlated with each other, thus replicating the now standard finding that conspiracy-theory beliefs involve a monological belief system (Goertzel, 1994; Lewandowsky, Oberauer, & Gignac, 2013; Swami et al., 2011; Wagner-Egger & Bangerter, 2007; Wood, Douglas & Sutton, 2012).

    So, while refuting the Lew hypothesis that conspiracy ideationists necessarily believe that nothing happens by accident, the study confirms Lew’s thesis that belief in different conspiracy theories doesn’t happen by accident, since they are part of a “monological belief system,” i.e. if you think John Kennedy’s assassination was a bit iffy, you’re likely to think the same of Robert’s.

    The next stage of their research will be to determine whether this monological tendency is linked to paranoia and an inability to reason, or to a general curiosity and interest in detailed evidence.


  17. I googled for ‘monological belief system’ and every link on the first page seemed to be about conspiracy theories. Now standard, as they said. I’m getting all conspiratorial.


  18. Geoff,

    You are quite right. The one thing that all of these people have in common is their monological belief system that all conspiracist ideation forms part of a monological belief system.

    In fact, I am struggling quite hard to get my own belief system to make sense of the concept of a belief system that is anything other than ‘monological’. Isn’t it the defining characteristic of a belief system that there will always be an underpinning meta-belief (or ‘logic’, to use their terminology) that systemises the beliefs? If there isn’t, then it wouldn’t be a system at all but rather a potpourri. If there is more than one underpinning logic then the system would be resolvable into component systems (though not necessarily in a reductionist manner).

    I’m inclined to think that the whole idea of the monological belief system is just a vacuous, tautological musing designed to entertain the minds of people with nothing better to do. Okay, people who are suspicious by nature tend to be suspicious of more than one thing. Whoopty-bloody-do. Give the boy a grant.


  19. RICHARD, JOHN
    Well-spotted on monological belief systems. It means what the authors want it to mean, i.e. a bunch of beliefs whose presence in a person is non-random. It comes from Goetzel, who did the only bit of “normal” sociological research on the subject back in the nineties.

    Among the authors who come up using Richard’s google search is Kurtis Hagen, who is well worth reading. Here he trashes Swami and Wood & Douglas

    Click to access 7-Argumenta-Kurtis-Hagen-Conspiracy-Theorists-and-Monological-Beliefs-Systems.pdf


    Gosh, hundreds of researchers trying to force our thinking processes into a tiny box of their own construction and slam the lid shut. John, of course I agree with your language criticism, my tongue kind of found its way into my cheek. Hasn’t Lew previously said ‘data only for scholars’, or am I mistaken about that? If data, language too perhaps, in their minds at least.


  21. Yes Geoff, I picked up on your irony but I still couldn’t resist the opportunity to labour my point. What’s a good point worth if one can’t labour it?


    Geoff: Many thanks for reading this, so I don’t have to. Lew’s descent into self-flattering indulgent nonsense is painful to witness, due to its sordid personal incineration of rationality which he thinks is a beacon defending same.

    All: I doubt monological belief systems is a real thing. At any rate, spurious conspiracy theories and also what might legitimately be called ‘denialism’ if the term wasn’t so abused, arise from extreme resistance to alien encroaching or dominant culture (often as represented by authority), and hence can’t be universally held because they depend on the deep values of the expressing individual, and the values that individual opposes. Some spurious conspiracy theories will form natural bedfellows (e.g. if those cultures / authorities they oppose are allied), and some will not. Separately to this, there may be a vanishing fringe of individuals not too far from mental instability, who are essentially overwhelmed and can’t make proper sense of their world, so perforce must invent one that promotes their own importance and the falsity of the generic outside. So like birds building nests they could incorporate into that falsity any con theory they found lying around. But this isn’t a mainstream thing; in practice pretty much a medical thing I should think.

    John: “Isn’t it the defining characteristic of a belief system that there will always be an underpinning meta-belief (or ‘logic’, to use their terminology) that systemises the beliefs?”

    Not really. It is ‘underpinning’ here that is the problem. There’s typically a highly elaborate and apparent ‘surface logic’ you might say, which upon inspection is always as full of holes as a swiss cheese. This isn’t an underpinning, it’s an identity recognition and reinforcement shell. The thing that underpins cultural belief is not logical in any way, shape, or form, but emotional. And not only does that emotion bypass logic, the purpose of culture essentially requires that consensus narratives support logical contradictions. It’s easier to see this over long time-periods, because the narratives constantly evolve too, and the changes make inconsistency even easier to spot, *if* one is looking objectively (adherents don’t). While they may not typically be as bad as potpourri, sub-narratives need only to be emotively compatible not logically compatible (so can seem like a surprisingly wide and conflicting range), which is to say they can get away with the thinnest possible veneer of logic and pretence of compatibility that even a (non-adherent) child ought to see through, albeit even for non-believers the emotional leakage can still be strangely disarming, like a veil thrown over something. And for sure even potpourri culture can collect an army of devout believers; just look at the principles of scientology. It picks a bunch of sciency (dianetics) and alien (thetan) pseudo principles, a bunch of traditional religionish pseudo principles (from *different* religions, so a spirit like Christianity but also many lives like reincarnation faiths such as Hinduism / Buddhism), a bunch of psychological pseudo principles (memory ‘engrams’, and ironically emotion versus rationality), various secular idealisms, ‘silent birth’ (good gracious, and why??), and a raft of other stuff plus lots of fictional sauce (after all the inventor was an SF author). It has long survived its inventor’s death, and prospers. Nothing connects all this, it’s just an arbitrary means to an end, the ownership of the emotions and hence loyalty of adherents, plus an identity set and comfort framework for them.

    Essentially all cultures are just illogical fairy stories. Whether ACO2 is good, bad, indifferent, even worse or even better for the climate, this is just the same for catastrophic climate culture. Given the whole lot is completely illogical to start with, it’s perfectly okay to introduce further illogicalities as needed for the culture to survive. While this can sometimes challenge believers if done at speed (and indeed climate culture is more fast-moving than most), the evidence that this can still be a net huge success is all around us.


  23. Andy,

    I feel I allowed myself to be misunderstood. I was not trying to say anything insightful regarding the role of culture or the epistemological basis for belief. I was simply looking at the term ‘monological belief system’ from the perspective of a systems analyst. For every system (belief or otherwise) there will be a boundary that defines what lies within and without the system. Furthermore, those elements within the boundary will not be an arbitrary collection; they will be ontologically or teleologically connected – otherwise one would not refer to it as a system, one would simply refer to it as a miscellany. This is the logic to which I was referring. The logic may be complex but, ultimately, it will be monological in the sense that the system has only one boundary and one can always define that boundary in terms of everything within it being subject to a top level ontology or single overarching purpose. I suppose ‘monological belief system’ was intended to be pejorative, but it struck me as being nothing more than tautological.

    However, after having read the paper posted by Geoff Chambers, I can now see what Goetzel was trying to get at. It seems the ‘monological’ jibe was intended to refer to some form of self-sustaining incestuous logic in which only conspiracists indulge (i.e. one belief serving as evidence for another). But actually, I think Goetzel’s own logic is unsound.

    Firstly, ‘one belief serving as evidence for another’ doesn’t strike me as particularly problematic, and certainly not worthy of epithets like ‘monological’. This point is well covered by the paper Geoff posted.

    Secondly, Goetzel confuses causation with association. If one observes that an individual has belief X, one may know from collected data that this increases the likelihood of observing that they also have the belief Y, i.e. P(Y|X) > P(Y). This is known as probability raising and is often incorrectly interpreted as X causes Y (with no regard made for the possibility of a confounder). There may indeed be X–>Y causation but, as I said above, this wouldn’t actually be problematic – it is how we all reason. However, what Goetzel doesn’t seem to appreciate is that the probability raising may have nothing to do with X–>Y causation, being instead an association resulting from a common causation, e.g. the possession of a suspicious temperament that increases the likelihood of both X and Y. It really isn’t rocket science but it might as well be for Goetzel.
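    To put a number on that point, here is a minimal sketch (purely illustrative, not taken from any survey data) in which a common cause Z raises P(Y|X) well above P(Y) even though belief X has no causal effect on belief Y:

    ```python
    # Purely illustrative: a common cause Z (a suspicious temperament) makes
    # belief Y more likely among holders of belief X, even though X never causes Y.
    import random

    random.seed(1)
    N = 100_000
    x_count = y_count = xy_count = 0

    for _ in range(N):
        z = random.random() < 0.3                  # suspicious temperament (common cause)
        x = random.random() < (0.7 if z else 0.1)  # belief X depends only on Z
        y = random.random() < (0.6 if z else 0.1)  # belief Y depends only on Z, never on X
        x_count += x
        y_count += y
        xy_count += x and y

    print(f"P(Y)   = {y_count / N:.2f}")           # comes out around 0.25
    print(f"P(Y|X) = {xy_count / x_count:.2f}")    # comes out around 0.48
    ```

    With those made-up probabilities, P(Y) is roughly 0.25 and P(Y|X) roughly 0.48: plenty of probability raising, and not a trace of X–>Y causation.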


  24. John,

    If one accepts that by ontologically this is as much or more in the context of a cultural being rather than human beings, despite that the former is insentient and unagential, and likewise by teleologically, the apparent purposes of the characteristic features such as narratives are not what they purport to be and ultimately nothing to do with the conscious purposes of the humans involved, but subconscious group co-operation facilitated via long co-evolution with cultural narratives (for which ‘purpose’ thus carries too much weight of ‘conscious intent’ for many to be comfortable with), and that the ‘system’ boundary is both constantly changing plus porous to multiple other similar systems simultaneously, such that even in theory one can’t separate out where one stops and the other begins, hence boundaries are fuzzy estimates that depend somewhat upon the particular purpose for the estimate, then yes all is good 😊. But I find that these aren’t the typical emphasis of interpretation.

    Thanks for reading / explaining Goetzel, another thing I don’t have to read!


  25. Andy,

    Yes, I accept that I offered a simplistic analysis and that your caveats are well-warranted. But, there again, I think anyone who dares to pair the word ‘belief’ with ‘system’ is toying with analogy – with or without the dubious embellishment of ‘monological’.


  26. JOHN RIDGWAY
    Did you really mean the Goertzel paper? Or the Hagen paper which miraculously appeared in my recent comment when I tried to post a link to it?

    Goertzel 1994 is here:
    https://www.researchgate.net/publication/270271929_Belief_in_Conspiracy_Theories
    It’s a very standard opinion survey, with a factor analysis which extracts correlations (at the .001 level, while Lew is happy with .05) but none the worse for that. There’s some interpretation of significant demographic differences and suggested interpretations, and some musing about monological and dialogical conspiracist thinking, corresponding to open and closed mindedness. Confusingly, this Goertzel (Ted) refers to a forthcoming paper by another Goertzel (Ben) as the source for this idea. I don’t see him talking about systems.

    The way academics like Lew cite their “authorities” is little more than a game of Chinese whispers. I read somewhere that in medical research, 6% of citations in favour of a position were in fact written in opposition to the position being defended.


  27. Geoff,

    I am not referring to any of Goertzel’s work directly. Instead, I am taking my cue from the Hagen paper you posted. Specifically, in section 4, where it states:

    “Ted Goertzel, the originator of the idea that conspiracy theorists are monological thinkers, puts this in more general terms, ‘In a monological belief system, each of the beliefs serves as evidence for each of the other beliefs’ (Goertzel 1994: 740).”

    I have been misspelling Goertzel’s name throughout, however.


  28. @all

    after reading this thread my brain has stopped working.
    is that a good sign for Lew as a teacher in cognitive psychology?

    from the link – https://social-epistemology.com/2020/02/13/what-rationality-a-comment-on-levys-is-conspiracy-theorising-irrational-stephan-lewandowsky-anastasia-kozyreva-and-james-ladyman/

    “Summary of Levy’s Argument
    Levy invokes social factors in support of both arguments. Concerning the first argument, members of a low status group—who are particularly likely to accept conspiracy theories (Uscinski and Parent 2014)—tend to be at the mercy of forces over which they have little control. In consequence, suspicion and hypervigilance might be adaptive—and hence “subjectively rational”—because they permit early detection of adverse actions and events based on minimal evidence. For many people, the suspicion that “management” or the law or government are “out to get them” may occasionally afford a true positive. The inevitable concomitant increase in the number of false positives may incur only a small cost, if any, to the agent.”

    my reading – nothing bad has happened, most rational people ignore the MSM hype & papers by numb nutters.


  29. Hunter: Interesting. Haven’t read Levy or the critique of same, but even if it wasn’t Lew, that position is unlikely to go down well because many psychologists don’t like evolutionary explanations trampling their pitch, despite this is obviously where most explanations must be rooted. From your quote, Levy is on the right track. Both for spurious conspiracy theories and *inapt* innate skepticism in general (the extreme form of which might be called ‘denialism’ if the term wasn’t so abused), both of which we deplore as a society, come from exactly *the same* instinct as non-spurious conspiracy suspicion and *apt* skepticism in general, which we applaud when it exposes alien cultural invasion or authorities way exceeding their power in the name of local culture (essentially decadence). But you can’t have one without the other! It’s a balance, a fail-safe, which will have false positives as the cost of the system.


  30. DFHunter,

    I only skim-read the Lew paper since experience has taught me that life is too short to expend much effort trying to understand his world view. Nevertheless, I think the kicker was in the paper’s closing sentence:

    “On the contrary, the dismissal of all socially engineered collective sources of reliable belief (note that reliable does not mean infallible), such as traditional media, science, the legal system, and democratic governments, constitutes an epistemic vice whatever its—imagined or actual—social benefits may be to the agent.”

    It’s interesting that Lew et al fail to mention religion amongst the ‘socially engineered collective sources’. Had they done so, they might have recognized the uneasy tension that exists between the expressions ‘socially engineered’ and ‘reliable belief’. Then maybe – just maybe – they might finally realize how much they have been barking up the wrong tree by blathering on about conspiracists.

