Are Anti-Vaxxers Rational? More Thoughts on the Power of Copying

My last post sparked some debate when I suggested that trying to think through a rational answer for every practical problem is likely unwise.

Duncan Smith comments:

This may be good advice for people with the discipline to distinguish between problems that can be solved with a completely rational approach and those that are complex enough to require copying without understanding. But people are already inclined to the latter approach, even people in the most rational of professions (https://en.wikipedia.org/wiki/Cargo_cult_programming) … I think unwillingness to accept scientific evidence (the anti-vaccine movement, climate science denial, etc.) is a lot more of a problem than excessive rationality.

On the one hand, Duncan reinforces an important point: rationality is good. Thinking things through and carefully constructing causal models of the world is a good strategy for success.

Rationality can be seen as a kind of meta-strategy for solving practical problems. Faced with a practical issue, you can gather evidence, try to imagine how all the pieces fit together and then make predictions about what will happen if you make changes to the system.

My point wasn’t that rationality is a bad meta-strategy, simply that it isn’t the only good one. In some situations, rationality can fail because the situation is too complex, experiments are hard to run, or you simply don’t have the resources to investigate the phenomenon before making a decision.

Copying from a successful exemplar is a good alternative meta-strategy that tends to succeed in situations where rationality can fail. It certainly has its own pitfalls, but given that its strengths and weaknesses differ from rationality’s, a person using a mixed strategy (sometimes copying, sometimes analyzing) could succeed in more situations.

That said, I take issue with Duncan’s examples of the anti-vaccination movement and climate science denialism as cases of insufficient rationality.

In What Sense are People With Crazy Beliefs Rational?

I believe vaccines are good. Children should be vaccinated and failure to do so not only harms the children themselves, but also creates a public health risk.

I believe global warming is real. While I think the complexity of the models suggests exact predictions are hard to make, I believe the evidence is strong enough to take serious action now.

Yet I don’t think the people who hold the contrary beliefs develop them because of insufficient rationality, at least in the context we’re discussing.

Rationality can be seen in a number of ways. It can be construed broadly, as the process of thinking about the world, trying to create a causal model and making sure beliefs and practices are underpinned by good reasons. Alternatively, it can be construed narrowly, as applying only to people who make Bayesian probability adjustments or who agree with certain scientific establishments.

This wider view is closer to what Seligman, Weller, Puett and Simon call sincerity: the idea that beliefs and practices should be followed because they have good, explicit reasons.

The main alternative to sincerity, or rationality, is the meta-strategy of blind copying. This means doing things ritualistically, rarely thinking about the reason for doing them, or, if forced to provide an answer, offering something equally unintellectual, like the Mapuche answer: “It’s our custom.”

The question, then, is whether anti-vaxxers and climate change deniers are applying the meta-strategy of sincerity to their beliefs or whether they’re blindly copying. In this case, I think it’s clear that both are examples of highly “rational” beliefs, albeit ones built on very different prior assumptions, which lead to interpreting evidence differently from the scientific consensus.

Take anti-vaxxers. My guess is that many of them hold a set of correlated prior beliefs about the world. One in which evil pharmaceutical companies are actively poisoning children. Where scientific authorities are corrupt, ignorant or both. Where conspiracies to control the population are entirely plausible.

I’m not confident about the exact belief structure of anti-vaxxers, but having met people who hold similar beliefs, I can say that most of them have thought a lot about the beliefs they hold. Especially when such people live in a society where the majority strongly opposes their views, the only way their viewpoints are sustainable is if they have given them a lot of thought.

Fundamentalist religion is another example of what I would call rationality gone haywire. People believe in the value of their religion and spiritual beliefs (which is fine). But on noticing the inconsistencies in religious texts, instead of setting the contradiction aside by not overanalyzing it, they proceed in the opposite direction: updating all their other beliefs to conform to a literal interpretation of scripture.

Is Rationality that Reaches False Conclusions Still Rational?

Some people might argue I’m being overly broad in describing rationality. How can the processes that arrive at believing the Earth is only 6,000 years old, or that vaccines cause autism, be considered rational?

However, I think it’s important to separate the procedure of rationality from its outcome. If you start with strong, and probably false, prior beliefs about the world (say, that evil corporations are conspiring to poison kids, or that the words of a book written thousands of years ago are the only source of inviolable truth), then it takes only a minimal amount of confirmation bias and rationalizing to arrive at fairly crazy beliefs.

You could point out that good rationalists don’t do that. They don’t succumb to cognitive biases. They meticulously update their Bayesian priors according to the evidence. They equally weigh the beliefs of their culture against the beliefs of other cultures. But now you’re no longer describing any living human being, including people I would consider to hold quite correct, rational beliefs about the world.

If rationality is going to be done by humans, it’s going to be done imperfectly, so comparing perfect rationality with imperfect copying is not a fair comparison of the two meta-strategies.

It’s also easy to dismiss rationality-gone-haywire examples when you feel confident you know the correct answer. That is to say, it’s easy to dismiss the claim that anti-vaxxers are rational because you happen to know that vaccines are good.

However, that confidence is useless when facing genuine practical problems without an obvious correct answer. Consider a few examples where I’m not confident the meta-strategy of rationality is the correct choice:

  1. You’re trying to start a new business. You have the option of copying the broad strategy of marketing used by several successful firms in the industry. Or—you can develop a new strategy based on thinking deeply about how marketing works.
  2. You want to be healthier and you’re not sure what to eat. You have the option of modeling most of your meals on those of a culture of healthy eaters—copying as much as you can of their ingredients, preparation, portions and meal times. Or—you can meticulously make sure your food intake matches FDA guidelines, maybe drinking something like Soylent.
  3. You want to have a happy, loving relationship. You have the option of looking at exemplars in your community and trying to emulate their patterns of behavior. Or—you can devise and implement a theory of relationships.

What’s the correct approach in these situations? I’m not sure to be honest. In some, I’m inclined to support a rational approach. In others, I’m inclined to blindly copy a cultural package of successful behaviors.

  • DT

    In my opinion, the two meta-strategies can be related. It is over-simplified to separate rationality (= science) from cultural/religious beliefs. Cultural rituals and prior beliefs about the world are formed on a basis of “vague” rationality; they can be proven wrong or right, but it depends a lot on the situation. Like your example of mixing corn and ash: maybe it’s the result of a long-term experiment, and those people figured out that by doing so they would survive (or, by natural selection, people survived by doing so), and then it became a habit, or ritual.

    Of course, some rituals are just nonsense (to Western civilization), like human sacrifice. But on the other hand, even science has similarities to religion, and not all scientific results are absolute truths. There are debates in the philosophy of science about realism vs. anti-realism, and the scientific method does have flaws (like Hume’s problem of induction). What we know about the world, scientifically, could be just a reflection of us, and in the end, science and religion could all be “wrong” (or “right”) at the same time.

    Getting down to the practical level, to some extent we all use mixed strategies, and no one can be 100% rational. In a supermarket, you don’t choose a set of goods and then do linear programming to minimize the cost, or, vice versa, determine your utility function for each good and then maximize your total welfare. What you do is both: you copy the social standards of what you should eat and wear, and do a little optimization with limited information on price and subjective/perceived quality.

    Another example: you don’t really need a detailed model in game theory to know that lying and cheating are bad, for yourself and for others. That’s what we have been taught, and a lot of people just believe it. And it turns out to be true. It’s a cultural belief and, at the same time, the result of a very long-term experiment.

  • Scott Young

    I think in practice most people use the copying strategy far, far more than rationally conceived answers. It simply is hardwired into our brains.

    But, at the same time, I think it has become a popular idea that rationality beats tradition and that the world would be a better place if we could all think deeply for ourselves about every topic. I simply want to shake that assumption a bit. If people recognized that there is a non-rational path by which some ideas can become good ones, it can at least provide a counterargument against particularly flimsy pieces of reasoning being held up as absolute truths.

  • chapeau

    Hello, this is not about your article.
    My goal isn’t to know everything–not possible. But it’s true that I am frustrated to be able to read some stuff and not other stuff, because I don’t understand why he or she wrote it. So, I try to make sense of the world.
    My goal is to be able to make sense of the world by being able to look at something and ask a question about it. I keep asking. The law vs. the arts, for example. What I believe is that in the arts, it is really a matter of a personal thing (writing something or painting something). The law is written by a group of people. I don’t know if this works like that.
    What about history?

  • Duncan Smith

    I agree that we’re hardwired for copying. So while copying may be a good meta-strategy, it’s one that people are already biased to use. That’s why it’s tricky advice to give. It’s like writing about studies showing that red wine and chocolate have health benefits. That may be true, but most people won’t replicate the study conditions when they apply the results to their eating and partying activities.

    Are anti-vaxxers rational? Some of them are, mainly those who write the books and articles other people read. But it’s more likely that any particular anti-vaxxer is using the copying approach, just repeating what they read from influencers. Of course, pro-vaxxers are doing the same thing! Most of us aren’t virologists, so we have to rely on the advice of experts. Given this bias toward the copying meta-strategy, making an effort to apply the rationality meta-strategy is likely to result in the optimal mix of strategies. Also, we should probably be eating less chocolate.

  • James P

    Hmm… Rationality is never truly objective; it’s always viewed through some lens.

    Take anti-vaxxers. I’ve talked to a few in that club who do not in any way deny science, but they do have an opinion that often gets them lumped in with creationists and the like. It’s a matter of what they believe we should do with the scientific evidence, not a matter of rationality.

    Staying on topic, pro-vaxxers who respond positively to the science behind the effectiveness of vaccines have built into their lens ideas like:
    – we should help the “weak,” because every human life has value. (Is that objectively true?)

    And I agree that there seems to be an almost knee jerk reaction in these types of cases to belittle anyone with a dissenting opinion before really ever asking them why they believe the way they do.

    When it comes to science, and our record of using its discovered knowledge, the slate is filled with examples of rational misuse.

  • Scott Young

    I don’t think the discussion needs to be “more copying” vs “more rationality”. Rather, it’s recognizing that the two strategies have different strengths and weaknesses and so should be applied to different areas.

    I’m also inclined to believe that people who hold stranger beliefs tend to be operating on a somewhat more “rational” or “rationalizing” meta-strategy (think of how well-thought out most conspiracy theories need to be to sound compelling). That’s because the environmental default beliefs aren’t strange and so there is a constant pressure to conform to those beliefs, which must be countered by some kind of ad-hoc rationalization.

    I have a small point of disagreement with you on the issue of “copying” beliefs. Although very few of our beliefs are formally thought out (most are accepted because we believe in certain authority figures), I think this is somewhat different from blind copying. Ritualistic copying is imitating a successful exemplar because it’s successful or conformist, not because it has some good explanation behind it. By this understanding, not taking vaccines because you believe vaccines secretly cause autism is a “rational” belief (one deduced from a causal model), whereas not taking vaccines because Jenny McCarthy is famous and beautiful would be ritualistic copying. By this logic, anti-vax adherence is only “blind copying” in the rare cases where an insular subculture refuses vaccinations because vaccination is abnormal there (maybe some religious sect?).

  • crispy2000

    Enjoyed your article, and like the concept of sincerity.

    Lumping “anti-vaxxers” and “climate-change deniers” in with cargo cultists is a cartoonish oversimplification. There is a spectrum of nuanced opinion in either subject. Many “anti-vaxxers” I know are not opposed to all vaccinations, but rather the heavy vaccination schedule imposed on infants. They acknowledge that some diseases, e.g. smallpox, are deadly enough to justify the risk of adverse reactions. Others such as HPV are seen as less likely to occur or of low enough severity that the risk of adverse reactions outweighs the perceived benefit. It’s not necessary to believe that pharma companies are evil to be skeptical of their claims of safety. Look at the history of Vioxx, in which the efficacy was exaggerated, and the adverse effects minimized by the manufacturer. In summary, it can be perfectly rational to hold some of these derided opinions.

    On a higher level, most people don’t have the time, interest, or ability to study such topics in depth. They may follow the opinions of those they respect, and rationalize their choices in a way that suits them. Dan Ariely’s book, Predictably Irrational, suggests that we may claim rational reasons for our choices while falling into fairly predictable traps. We may think we’re being rational, but wind up making choices with our reptilian level of thought.

  • Len

    As an autistic person, I’d like to say something myself about the anti-vax topic. People are so scared of autism that they treat it like a death sentence, largely the fault of organizations like Autism Speaks (essentially a hate group that has no regard for actual autistic people and is hell-bent on finding a cure that is nearly impossible to find, since autism is a neurodevelopmental disorder; please google “why autism speaks is bad” if you would like to learn more—they’re a seriously horrifying organization). What people don’t see about autistic people is that we are wired to think and process differently. Many of the problems autistic people have are a symptom of living in a world made for regular people (which is not the fault of non-autistic people at all! I wouldn’t expect something so big, considering we’re only a small portion of the population and deeply misunderstood). What an anti-vaxxer is essentially saying by not vaccinating their kids is that they would rather have their child die a slow, painful death from a preventable cause than have an autistic child. They’ll trade smallpox for noise and light sensitivity and learning difficulties.

    There’s an ideology around autistic people (and, I think, many of those with disabilities in general) that we are inferior to the “normal” person. There’s a narrative that if you are autistic you will never be able to do the same things a “normal” person could do. Places like Autism Speaks create the false ideology that autism NEEDS a cure, that it only happens to children (mainly because it’s easier to get fundraising for cute little kids than for adults who need help), and that your child has been stolen from you and replaced with something horrendous (there’s a sense of sympathy when a parent of an autistic child kills that child because “it was too much of a burden”; there’s something so deeply and horrifically wrong with that). Yes, I have difficulties in my daily life. I still deserve to exist.

    The research should be focused on how to help autistic people, not how to eradicate them.
