“It’s a very short jump from believing kale smoothies are a cure for cancer to denying the Holocaust happened.”

Statistical Modeling, Causal Inference, and Social Science 2024-10-18

Campos quotes a comment from a thread on RFK Jr. and his running mate:

It’s a very short jump from believing kale smoothies are a cure for cancer to denying the Holocaust happened.

He points to this link:

The physiologist and blogger Mark Hoofnagle, writing in the Denialism blog in 2007, coined the term “crank magnetism” to describe the propensity of cranks to hold multiple irrational, unsupported, and/or ludicrous beliefs that are often unrelated to one another, referring to William Dembski endorsing both a Holocaust denier and one of Peter Duesberg‘s non-HIV weird theories.

I saw Duesberg give a talk once! It was back when I was in the statistics department at the University of California. One of our faculty was a prickly guy who prided himself on his open-mindedness. OK, he wasn’t open-minded in all directions—he absolutely despised social science—but he loooved all things biology. He invited Duesberg to speak, I guess as a way of demonstrating how open-minded he (the faculty member) was. Duesberg gave a terrible talk—he really came off like some sort of movie villain, talking about “poppers” etc. On the plus side, they got a legit epidemiologist to follow up with a devastating rebuttal. I do remember, though, that Duesberg’s theories were promoted by the left as well as the right. He published a book with right-wing publisher Regnery but he was also promoted in an article in the left-wing Village Voice.

Campos continues with some speculation about how it is that supporters of crank theories seem to develop a habit of believing more and more unreasonable things. He offers nine hypotheses:

(1) Oppositional defiance disorder. Some people reject authority for the same reason they’re still mad at Mommy because she tried to make them clean their rooms. . . . (2) Rejecting expertise as a matter of course is very liberating for anyone who wants to hold beliefs not because they’re true, but because they are satisfying for other reasons. . . . (3) A deep source of satisfaction for many people is the thought that they are Galileo . . . (4) People who happen to be exceptionally talented in one area . . . are prone to believe that their exceptional talent translates into other, completely unrelated areas. . . . (5) A cautionary tale for public intellectuals/academics . . . is to begin by making trenchant counter-cultural insights, and get lots of deserved attention for doing so, only to become subsequently addicted to that praise and attention. . . . (6) There’s a close relationship between all this and the forces at the core of populist right wing authoritarianism . . . (7) The Internet and social media have greatly facilitated all these trends, since YouTube videos have replaced cranking out mimeographs in Mom’s basement, and can get potentially millions of views . . . (8) Very rich people tend to think they’re very rich because they’re smarter than everyone else . . . (9) Some cultural purveyors of eclectic crankery are clearly bullshitters . . . On the other hand, we all tend to become what we pretend to be.

I’m not saying any of these theories are wrong, exactly—each of them applies in some settings, and indeed Campos gives examples—but all of them seem too specific to me.

My story of how this happens is informed by my experience with a friend who went off the deep end with pseudoscience or conspiracy theories. It goes like this:

Holding a strong belief in a fringe theory is a form of commitment. This could be public commitment or it could just be something you hold to yourself, but in any case I’m thinking of the decisive step from “all things are possible” or “let’s have an open mind on this one” to “I believe this theory” and “the powers that be are suppressing it.” Once you’ve passed this threshold, it lowers the barriers for future such commitments. There’s lots of psychology research on this sort of thing, no? Once you’ve broken a rule or gone past an inhibition, it’s a lot easier to keep doing the taboo behavior.

Sometimes you can start crazy, other times the craziness creeps up on you. Kinda like how they told us in junior high that marijuana is a gateway drug. The fringe-dwellers whom I’ve known have started with offbeat takes that are interesting and on the border of plausibility and then stepped further and further into the wilderness.

In reaction to Campos’s post, Palko writes:

I get the sense that conspiracy theorists have a sense of community that runs deeper than their specific beliefs and part of that is a kind of mutual nonaggression pact: I’ll believe (or at least tolerate) your theory if you’ll do the same.

The desire for community and the need to reciprocate are strong. It’s not a good idea to leave them out of your calculations.

Good point!

This reminds me of the piranha principle. Theories such as nudge, power pose, fat arms, himmicanes, etc., are in competition with each other in that not all these can be true at the same time, but the people who push these theories have a sort of informal alliance. Similarly with conspiracists, who can disagree on the details but agree that they resent the gatekeepers.

Or religious fundamentalists, who can decide to hate each other (your doctrine contradicts my doctrine) or can be allies (we both strongly believe in traditional values). At different places and times, we’ve seen both these sorts of behaviors. I remember after the World Trade Center attacks in 2001 that some Christian fundamentalists were taking strong anti-Islam positions while others were taking Al Qaeda’s side, saying that our decadent society got what was coming to it.

I think more about the bad-science thing, though, the idea that people who are making outlandish claims of huge effects based on noisy data all feel they are on the same side. They can’t agree on what makes us irrational—is it whether we sign at the top or the bottom of a form, or the presence of elderly-associated words, or our facial expression, or how we are sitting, or the outcomes of college football games, or shark attacks, or the time of the month, or whatever—but they can agree on the general principle. They can agree that we’re all “predictably irrational,” even while having incompatible notions about what these predictors might be.

The quotation in the title of this post gets at a different point, though: some sort of taste for simple dramatic stories. We talked about this a couple years ago in the context of podcaster Joe Rogan, who demonstrates the Chestertonian principle that extreme skepticism is a form of credulity.

There’s also the idea that being an edgelord can be fun in itself. Publicly espousing edgelord positions is a kind of semi-bluff: you take a stance that you know probably isn’t true and then you can enjoy the pleasantly fizzy feeling of being an outsider . . . but then there’s the outside chance that, despite all evidence, your edgelord position is correct, and then you can win big. Or, failing that, you can keep the ball in the air indefinitely by just refusing to accept you were wrong.