- Scott H Young - https://www.scotthyoung.com/blog -

Are Anti-Vaxxers Rational? More Thoughts on the Power of Copying

My last post [1] sparked some debate when I suggested that trying to think through a rational answer for every practical problem is likely unwise.

Duncan Smith comments:

This may be good advice for people with the discipline to distinguish between problems that can be solved with a completely rational approach and those that are complex enough to require copying without understanding. But people are already inclined to the latter approach, even people in the most rational of professions (https://en.wikipedia.org/wiki/Cargo_cult_programming [2]) … I think unwillingness to accept scientific evidence (the anti-vaccine movement, climate science denial, etc.) is a lot more of a problem than excessive rationality.

On the one hand, Duncan reinforces an important point: rationality is good. As a strategy for success, thinking through things and carefully constructing causal models of the world is a good one.

Rationality can be seen as a kind of meta-strategy for solving practical problems. Faced with a practical issue, you gather evidence, try to imagine how all the pieces fit together and then make predictions about what will happen if you make changes to the system.

My point wasn’t that rationality is a bad meta-strategy, but simply that it isn’t the only good one. In some cases, rationality can fail because the situation is too complex, experiments are hard to run, or you simply don’t have the resources to investigate the phenomenon before making a decision.

Copying from a successful exemplar is a good alternative meta-strategy that tends to succeed in situations where rationality can fail. It certainly has its own pitfalls, but since its strengths and weaknesses differ from those of rationality, a person using a mixed strategy, sometimes copying and sometimes analyzing, could succeed in more situations.

That being said, I take issue with Duncan’s citing of the anti-vaccination movement and climate science denialism as cases of insufficient rationality.

In What Sense Are People With Crazy Beliefs Rational?

I believe vaccines are good. Children should be vaccinated and failure to do so not only harms the children themselves, but also creates a public health risk.

I believe global warming is real. While I think the complexity of the models suggests exact predictions are hard to make, I believe the evidence is strong enough to take serious action now.

Yet I don’t think the people who reject these conclusions do so because of insufficient rationality, at least in the context we’re discussing.

Rationality can be seen in a number of ways. Broadly, it is the process of thinking about the world, trying to create a causal model and making sure beliefs and practices are underpinned by good reasons. Alternatively, it can be construed fairly narrowly, as applying only to people who make Bayesian probability adjustments or agree with certain scientific establishments.

This wider view is closer to what Seligman, Weller, Puett and Simon [3] call sincerity: the idea that beliefs and practices should be followed because they have good, explicit reasons.

The main alternative to sincerity, or rationality, is the meta-strategy of blind copying. This means doing things ritualistically, rarely thinking about the reason for doing them, or, if forced to provide an answer, offering something equally unintellectual, like the Mapuche answer of, “It’s our custom.”

The question, then, is whether anti-vaxxers and climate change deniers hold their beliefs via the meta-strategy of sincerity or whether they’re blindly copying. I think it’s clear that both are examples of highly “rational” beliefs, albeit ones built on very different priors, which lead their holders to interpret evidence differently from the scientific consensus.

Take anti-vaxxers. My guess is that many of them hold a set of correlated prior beliefs about the world. One in which evil pharmaceutical companies are actively poisoning children. Where scientific authorities are corrupt, ignorant or both. Where conspiracies to control the population are entirely plausible.

I’m not confident about the exact belief structure of anti-vaxxers, but having met people who hold similar beliefs, I can say that most of them have thought a lot about the beliefs they hold. Especially if they live in a society where the majority strongly opposes their views, the only way their viewpoints can be sustained is by giving them a lot of thought.

Fundamentalist religion is another example of what I would call rationality gone haywire. People believe in the value of their religion and spiritual beliefs (which is fine). But on noticing inconsistencies in religious texts, instead of setting the contradiction aside and not overanalyzing it, they proceed in the opposite direction, updating all their other beliefs to conform to a literal interpretation of scripture.

Is Rationality That Reaches False Conclusions Still Rational?

Some people might argue I’m being overly broad in describing rationality. How can the processes that lead someone to believe the Earth is only 6,000 years old, or that vaccines cause autism, be considered rational?

However, I think it’s important to separate the procedure of rationality from its outcome. If you start with strong, and probably false, prior beliefs about the world (say, that evil corporations are conspiring to poison kids, or that the words of a book written thousands of years ago are the only source of inviolable truth), then it only takes a minimal amount of confirmation bias and rationalizing to arrive at fairly crazy beliefs.
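To make that mechanism concrete, here is a toy sketch of Bayesian updating (the scenario and every number in it are invented purely for illustration). The point it shows: if someone’s priors include “a conspiracy would publish reassuring studies anyway,” then reassuring studies carry almost no evidential weight for them, and even twenty of them barely move their belief.

```python
# A toy sketch of Bayesian updating with invented numbers. H is the
# hypothesis "vaccines are harmful"; both observers see the same twenty
# reassuring safety studies but assign them different likelihoods.

def update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = prior * p_e_given_h
    return numerator / (numerator + (1 - prior) * p_e_given_not_h)

skeptic = 0.01   # prior: harm is very unlikely
believer = 0.90  # prior: strong conspiratorial worldview

for _ in range(20):
    # The skeptic treats each reassuring study as strong evidence
    # against H: such a study is far likelier if vaccines are safe.
    skeptic = update(skeptic, p_e_given_h=0.3, p_e_given_not_h=0.9)
    # The believer rationalizes: "a conspiracy would publish reassuring
    # studies anyway," so each study is nearly uninformative to them.
    believer = update(believer, p_e_given_h=0.89, p_e_given_not_h=0.9)

print(f"skeptic:  {skeptic:.2e}")   # ~2.9e-12: belief in harm collapses
print(f"believer: {believer:.2f}")  # ~0.88: barely moved from 0.90
```

Both observers run the same Bayes’ rule on the same evidence; only the priors and the likelihoods they assign differ. That’s the sense in which the conclusion can be crazy while the updating procedure remains formally “rational.”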

You could point out that good rationalists don’t do that. They don’t succumb to cognitive biases. They meticulously update their Bayesian priors according to the evidence. They equally weigh the beliefs of their culture against the beliefs of other cultures. But now you’re no longer describing any living human being, including people I would consider to hold quite correct, rational beliefs about the world.

If rationality is going to be done by humans, it’s going to be done imperfectly, so comparing perfect rationality with imperfect copying is not a fair comparison of the two meta-strategies.

It’s also easy to dismiss examples of rationality gone haywire when you feel confident you know the correct answer. That is to say, it’s easy to reject the idea that anti-vaxxers are rational because you happen to know that vaccines are good.

However, that confidence is useless when facing genuine practical problems without an obvious correct answer. Consider a few examples where I’m not confident the meta-strategy of rationality is the correct choice:

  1. You’re trying to start a new business. You have the option of copying the broad marketing strategy used by several successful firms in the industry. Or—you can develop a new strategy based on thinking deeply about how marketing works.
  2. You want to be healthier and you’re not sure what to eat. You have the option of modeling most of your meals on those of a culture of healthy eaters—copying as much as you can of their meals, timing, preparation and ingredients. Or—you can meticulously make sure your food intake matches FDA guidelines, maybe drinking something like Soylent.
  3. You want to have a happy, loving relationship. You have the option of looking at exemplars in your community and trying to emulate their patterns of behavior. Or—you can devise and implement a theory of relationships.

What’s the correct approach in these situations? I’m not sure, to be honest. In some, I’m inclined to support a rational approach. In others, I’m inclined to blindly copy a cultural package of successful behaviors.