When is it Good To Be a Hypocrite?

In my recent article where I shared beliefs I’ve changed (or started to doubt) since starting writing this blog, I included one hypothesis about why we so often fail to do the things we should:

Are a lot of “good” habits just signalling attempts? Maybe I’ve just been reading too much Robin Hanson, but he does provide a convincing explanation of why we often don’t do the things we think we should. The reason is that we don’t actually want to do those things, but want to seem like we do, so we make striving to do those things part of our conscious self-presentation but subconsciously sabotage ourselves from actually doing them. Some possible examples: reading more books, not watching television, giving to charity.

The deep, dark idea here is that human beings evolved not only with desires to do things which were helpful in our survival, but also desires to express desires to other people which helped our survival. So a person might have a built-in desire for food to keep from being hungry, but also a built-in desire to appear generous, to attract friends and allies.

The dark part of the idea is that sometimes the desire to project a certain self-image runs counter to our more direct desires. For example, appearing generous is beneficial because it wins you friends, but being generous is costly because you have to actually put up money.

Charity is a rather obvious example of something we all want to create the impression of (that we’re charitable people) but that we may not necessarily feel like doing (actually giving away our time and money).

An even darker idea is that, since lying is hard to do, our natures may have evolved to protect us from the dangers inherent in this self-contradiction. Instead of self-consciously feeling like we want to appear charitable but actually be stingy, we sincerely believe in the values of charity but subconsciously sabotage ourselves from actually donating anything. Instead, we claim failures of discipline, busyness, or literally anything else possible when confronted with the contradiction that we believe we should give to charity but don’t actually give very much.

As a solution to the evolutionary problem (wanting to appear generous while being stingy, when lying is hard), this one actually works rather well. Nobody can truly call you a liar, because you sincerely believe in the values you want to project, but such sincerity doesn’t cost you anything, because you’ll invent excuses to prevent yourself from actually following through.

I’m not convinced that this model of human behavior explains every gap in our idealized self-image and actual behavior, but it does seem to explain a lot of the seeming irrationality in people’s actions.

Hypocrites, By Design

“Learning Spanish and learning to play guitar are the two things on the perpetual back-burner of every American.”

Vat, with whom I did my year-long language learning project (and who is also a guitar player), joked this to me while we were preparing our TEDx Talk. I laughed and said I completely agreed. Tons of people we’ve met claim a deep desire to learn Spanish, but haven’t put in even the most modest of efforts to move closer to that goal.

Language learning is a high-status activity. Being able to speak more than one language shows you’re cultured, intelligent and interesting (at least, in theory). Therefore, many people want to self-consciously express a desire to learn a language.

However, language learning is hard work. Not to mention that the economic returns to learning a language other than English in the United States are considerably lower than the benefit of learning English in most other countries. Therefore, for a lot of people, learning a language is a goal they want to appear to have, but one they don’t have sufficient interest to actually pursue.

I used language learning as an example of such an activity, but I think there are many other potential candidates for hypocrisy-by-design:

  • Reading more books. Books are hard to read (compared to blogs or magazines) and signal intelligence, knowledge and sophistication. So we want to give people the impression that we strive to read books, without actually doing so.
  • Cutting back on vices like television, junk food, drinking or smoking. It’s not socially acceptable to simply state that you value eating junk food more than being thin or healthy. So instead you use excuses like you don’t have time to eat well, or you lack willpower.
  • Making improvements in your career. We want to appear ambitious and productive, so we might come up with idealized dream jobs, but not actually take any concrete actions to make them a reality.

How Important is Hypocrisy-By-Design?

Obviously this cynical explanation for the numerous failures of self-improvement efforts competes with the more typical story: that we fail to improve ourselves because of failures of discipline, motivation or strategy, and that if we could augment or change those elements, the correct changes would happen for us.

I don’t believe that all failures of self-improvement stem from this, or that the typical story is wrong. Instead, I think it’s probably a mixture of both, depending on the case. Sometimes we fail earnestly because the goal is hard. Sometimes we fail because we never really wanted the goal in the first place.

However, I believe that in order to live better, it’s important to know not only why we succeed when we do, but also why we fail. Therefore, when you constantly struggle against a self-improvement goal, it’s worth examining the possibility that your failure is not because of the typical story of low willpower and poor strategy, but possibly because you’re sabotaging yourself from reaching it.

Everyone is Trying to Sell You Something

I remember when I was a kid, I was often warned to beware of people trying to sell you something. The implication being, salespeople are often self-serving with their persuasion, if not downright dishonest, and therefore you’re rarely getting an unbiased account of the information.

I feel this attitude is profoundly unfair.

Not because I feel all salespeople should be trusted, but rather because of the implicit suggestion that, just because money isn’t changing hands, all the non-salespeople of the world are somehow telling you the truth. The warning is unfair not because it judges salespeople too harshly, but because it implicitly judges non-salespeople too lightly.

The reality is every person you meet who gives you information on any issue is trying to sell you on something. Very often what they’re trying to sell you on is viewing the world in a certain way. The bias introduced from that is often less obvious than it is with an actual salesperson, which in some ways, makes it more dangerous.

The bias of ordinary people is deceptive in two ways:

First, it may not be obvious there’s any material gain from the other side in convincing you to believe something. This can lead to a false conclusion that if someone has no incentive to dissemble, they’ll give an unbiased account of the truth.

Second, we lie to ourselves even more than we lie to other people. Therefore, we often allow people to contort logic and evidence in ridiculous ways if it appears that they at least genuinely believe what they’re saying. They’re following George Costanza’s logic: “It’s not a lie, if you believe it.”

The Chinese Robber Fallacy

Suppose you heard a story in the news that one Chinese person was a thief. Would you believe that China is a country full of robbers?

Probably not, you’d say. One anecdote hardly proves something about an entire country, even if the anecdote were described in grisly detail.

But what about if you heard one hundred stories about Chinese robbers? One thousand? What if you had so many stories that it would take you a hundred years of reading non-stop just to get through all of the stories about the thieving Chinese people, would you now believe China is full of thieves?

This is the exact picture Scott Alexander paints here, and the answer is still no. The reason is that, despite the possible prevalence of thieves-in-China stories, there are over a billion people in China. Therefore, even if only 0.1% of Chinese people had ever stolen anything before, you’d still end up with over one million stories!

The point being that people can lie to you, even when everything they say is the truth. If they selectively pick examples from the pool of evidence—even if every example is true in all aspects—the result can be just as bad as if they lied to you directly.

Although the Chinese robber fallacy is a hypothetical, having been to China, I feel there is a large, persistent anti-China bias in Western media. Selective reporting often paints a picture of China as the awful place that freedom forgot, when in reality it’s a huge country with more than its fair share of both the wonderful and the dismal.

Appealing to a World View

Salespeople like to sell people things they want. Although we all have had experiences of buying products that turn out to be junk, the majority of things we buy probably work as advertised. It’s just so much easier to create a product people will like than it is to consistently and repeatedly sell garbage.

The bias of salespeople is to sell you the things you want. That applies to ideas as much as products. Since everyone is trying to sell you something, very often what they are selling is whatever it is you want to hear.

In my opinion, this explains a lot of anti-Chinese bias in reporting in the West. People are xenophobic by nature, and China is a large, powerful and culturally quite different country from the United States, Canada or France. That latent fear of the unknown means we’re more likely to pick up the paper if it tells us about some Chinese disaster that would be unthinkable here, than about a charity drive or an innovative new company.

Because companies like Google, Facebook and Twitter are getting ever-better at giving you more of the type of information you like to consume, this is a problem that is getting worse, not better. Before, we had to worry about cabals of media tycoons, selectively deciding what the world was going to believe. Now, we have to worry more about self-manufactured bubbles, where we only hear news pre-tailored to our existing prejudices.

Sometimes what you want to hear isn’t what makes you feel good. In fact, it’s probably more likely that the opposite is the case—we seek out news which amplifies our fears, rather than lessens them. The almost absurd polarization on issues on social media websites like Reddit, Twitter or Tumblr is a perfect example of people building outrage resonance chambers. Anger is a more viral emotion than happiness, so in an ecosystem filtering thousands of ideas, only the memes perfectly evolved to strike at what scares or enrages us will be seen.

Journalists’ Versus Scientists’ Ethics

Professional organizations frequently enforce codes of conduct that apply to their members, above and beyond what is expected from ordinary members of society.

Creating and enforcing these additional rules is a classic defense against the tragedy of the commons. People tend to trust or distrust a profession as a whole, so selfish professionals can abuse that implicit trust for their own gain. As a result, organizations have legal or social mechanisms to punish members who damage the reputation of the group.

These norms of conduct, however, come about differently in different professions. As a result, when two different communities of professionals overlap in the same work they offer, there can be clashes of these professional norms.

One area this happens frequently is between journalists and scientists. Journalists normally deal with fuzzy issues and are often not trained to do rigorous statistical analysis. As a result, their professional norms focus on getting the quotations right, avoiding plagiarism, protecting confidentiality of sources, etc.

Scientists are generally trained in the best available analytical techniques for a particular field, so they have much more stringent requirements. It’s not enough to vividly describe an anecdote that occurred during your experiment: you need to post your data. Ideally, though sometimes not in practice, your experiment should be replicable and your hypotheses stated in advance to avoid statistical manipulation.

The conflict between these two sets of norms is that journalists can often get away with things scientists can’t. They can cherry-pick research that supports their conclusion. They can paint vivid anecdotes which run contrary to a statistical trend. They can decide the angle for a story in advance, and then find the evidence that fits later.

Scientists’ Versus Still Other Scientists’ Ethics

Scientists aren’t immune to this clash of norms either. Frequently scientists of different disciplines land on the same subject of inquiry, and then there is much infighting about who is really an authority on the topic.

Cognitive science is a field with such overlap. Linguists, psychologists, philosophers and computer scientists have very different tools, methods and ingrained professional standards, which can often conflict. Each side is trained to see why their approach is the most valuable. Philosophers can believe certain questions are outside the power of empiricism. Psychologists can believe a computer simulation provides no insight into how the brain works. Computer scientists argue that linguists adhere too much to formal grammars.

These conflicts occur in the overlap of academic disciplines, where the territory hasn’t clearly been won by one side or the other. In most areas of academic inquiry, either only one discipline does the major investigating, or one discipline has become the de facto authority when disputes arise.

But these temporary border disputes often reveal that, while a discipline may be the highest-status authority on a subject, the internal rules by which it operates are often historical and somewhat arbitrary. Depending on where their communities of norms grew out of, many disciplines could have evolved quite differently.

Each of these communities of values, with its own internal incentives and pressures, creates the power to deceive, even if that power is better controlled in some communities than others.

I’m also aware of the irony in commenting on the strengths and weaknesses of different professional codes of conduct as a blogger. Bloggers are not a coherent group and have far fewer group-enforced norms. But, I suppose, that’s probably also why we’re seen as a far weaker authority than scientists and journalists.

How Do You Fight the Deception?

My first instinct in this regard would be to advise you to be aware of the types of biases present in the person giving you advice. To know that researchers’ standards owe much to the history of their discipline. To know that journalists may not lie about a story, but may selectively pick their evidence to produce the same effect.

But there’s the danger of selective cynicism, whereby learning more about bias makes you more skeptical, but only of the evidence you don’t want to believe, exacerbating rather than fixing the problem.

My second, more humble instinct, therefore, is to seek out ideas and experiences you disagree with, admit when you’re wrong frequently and with pleasure, and not assume that, just because there isn’t a suitcase full of products, the person you’re speaking to isn’t trying to sell you something.