- Scott H Young - https://www.scotthyoung.com/blog -

Everyone is Trying to Sell You Something

I remember that when I was a kid, I was often warned to beware of people trying to sell me something. The implication was that salespeople are often self-serving in their persuasion, if not downright dishonest, and that you’re therefore rarely getting an unbiased account of the information.

I feel this attitude is profoundly unfair.

Not because I feel all salespeople should be trusted, but because of the implicit suggestion that, just because money isn’t changing hands, all the non-salespeople of the world are somehow telling you the truth. The warning is unfair not because it judges salespeople too harshly, but because it implicitly judges non-salespeople too lightly.

The reality is that every person you meet who gives you information on any issue is trying to sell you on something. Very often what they’re trying to sell you on is viewing the world in a certain way. The bias this introduces is often less obvious than with an actual salesperson, which, in some ways, makes it more dangerous.

The bias of ordinary people is deceptive in two ways:

First, it may not be obvious that the other person has anything material to gain by convincing you to believe something. This can lead to the false conclusion that someone with no incentive to dissemble will give an unbiased account of the truth.

Second, we lie to ourselves even more than we lie to other people. Therefore, we often allow people to contort logic and evidence in ridiculous ways as long as it appears that they genuinely believe what they’re saying. They’re following George Costanza’s logic: “It’s not a lie if you believe it.”

The Chinese Robber Fallacy

Suppose you heard a story in the news that one Chinese person was a thief. Would you believe that China is a country full of robbers?

Probably not, you’d say. One anecdote hardly proves something about an entire country, even if the anecdote were described in grisly detail.

But what if you heard one hundred stories about Chinese robbers? One thousand? What if you had so many stories that it would take you a hundred years of non-stop reading just to get through them all? Would you now believe China is full of thieves?

This is the exact picture Scott Alexander paints here [1], and the answer is still no. The reason is that, despite the possible prevalence of thieves-in-China stories, there are over a billion people in China. Therefore, even if only 0.1% of Chinese people had ever stolen anything before, you’d still end up with over one million stories!
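To make the arithmetic explicit, using a round figure of one billion people: 0.1% of 1,000,000,000 is 1,000,000,000 × 0.001 = 1,000,000, so even a vanishingly rare behavior supplies a million true stories to choose from.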

The point is that people can lie to you even when everything they say is the truth. If they selectively pick examples from the pool of evidence, the result can be just as misleading as an outright lie, even if every example is true in every respect.

Although the Chinese robber fallacy is a hypothetical, having been to China [2], I feel there is a large, persistent anti-China bias in Western media. Selective reporting often paints China as the awful place that freedom forgot, when in reality it’s a huge country with more than its fair share of both the wonderful and the dismal.

Appealing to a World View

Salespeople like to sell people things they want. Although we’ve all had the experience of buying products that turn out to be junk, the majority of things we buy probably work as advertised. It’s just so much easier to create a product people will like than it is to consistently and repeatedly sell garbage.

The bias of salespeople is to sell you the things you want. That applies to ideas as much as products. Since everyone is trying to sell you something, very often what they are selling is whatever it is you want to hear.

In my opinion, this explains a lot of the anti-Chinese bias in Western reporting. People are xenophobic by nature, and China is a large, powerful and culturally quite different country from the United States, Canada or France. That latent fear of the unknown means we’re more likely to pick up the paper if it tells us about some Chinese disaster that would be unthinkable here than about a charity drive or an innovative new company.

Because companies like Google, Facebook and Twitter are getting ever better at giving you more of the type of information you like to consume, this is a problem that is getting worse, not better. Before, we had to worry about cabals of media tycoons selectively deciding what the world was going to believe. Now, we have to worry more about self-manufactured bubbles, where we only hear news pre-tailored to our existing prejudices.

Sometimes what you want to hear isn’t what makes you feel good. In fact, the opposite is probably more likely: we seek out news that amplifies our fears rather than lessening them. The almost absurd polarization on social media websites like Reddit, Twitter or Tumblr is a perfect example of people building outrage resonance chambers. Anger is a more viral emotion than happiness, so in an ecosystem filtering thousands of ideas, only the memes perfectly evolved to strike at what scares or enrages us will be seen.

Journalists’ Versus Scientists’ Ethics

Professional organizations frequently enforce codes of conduct on their members, above and beyond what is expected from ordinary members of society.

Creating and enforcing these additional rules is a classic defense against the tragedy of the commons. People tend to trust or distrust a profession as a whole, so a selfish professional could exploit that implicit trust for personal gain. As a result, organizations have legal or social mechanisms to punish members who damage the reputation of the group.

These norms of conduct, however, come about differently in different professions. As a result, when two different communities of professionals overlap in the work they do, their professional norms can clash.

One area where this happens frequently is between journalists and scientists. Journalists normally deal with fuzzy issues and are often not trained to do rigorous statistical analysis. As a result, their professional norms focus on getting the quotations right, avoiding plagiarism, protecting the confidentiality of sources, etc.

Scientists, in contrast, are generally trained in the best available analytical techniques for their field, so they face much more stringent requirements. It’s not enough to vividly describe an anecdote that occurred during your experiment; you need to post your data. Ideally, although sometimes not in practice, your experiment should be replicable and your hypotheses stated in advance to avoid statistical manipulation.

The conflict between these two sets of ethical norms is that journalists can often get away with things scientists can’t. They can cherry-pick research that supports their conclusion. They can paint vivid anecdotes that run contrary to a statistical trend. They can decide the angle for a story in advance, and then find the evidence to fit it later.

Scientists’ Versus Still Other Scientists’ Ethics

Scientists aren’t immune to this clash of norms either. Frequently, scientists from different disciplines will land on the same subject of inquiry, and then there is much infighting about who is really an authority on the topic.

Cognitive science is a field with such overlap. Linguists, psychologists, philosophers and computer scientists have very different tools, methods and ingrained professional standards, which can often conflict. Each side is trained to see why their approach is the most valuable. Philosophers can believe certain questions are outside the power of empiricism [3]. Psychologists can believe a computer simulation provides no insight into how the brain works. Computer scientists argue that linguists adhere too much to formal grammars [4].

These conflicts occur in the overlap between academic disciplines, where the territory hasn’t clearly been won by one side or another. Most areas of academic inquiry either have only one major discipline investigating them, or at least one discipline has become the de facto authority over the others when disputes arise.

But these temporary border disputes often reveal that, even when a discipline is the highest-status authority on a subject, the internal rules by which it operates are largely historical and somewhat arbitrary. Depending on the community its norms grew out of, a discipline could have evolved quite differently.

Each of these communities of values, with its own internal incentives and pressures, creates the power to deceive, even if that power is better controlled in some communities than in others.

I’m also aware of the irony of commenting, as a blogger, on the strengths and weaknesses of different professional codes of conduct. Bloggers are not a coherent group and have far fewer group-enforced norms. But, I suppose, that’s probably also why we’re seen as a far weaker authority than scientists and journalists.

How Do You Fight the Deception?

My first instinct would be to advise you to be aware of the types of bias present in the person giving you advice. To know that researchers’ standards owe heavily to the history of their discipline. To know that journalists may not lie about a story, but they may selectively pick their evidence to produce the same effect.

But there’s the danger of selective cynicism [5], whereby learning more about bias makes you more skeptical, but only of the evidence you don’t want to believe, exacerbating rather than fixing the problem.

My second, more humbled instinct, therefore, is this: seek out new ideas and experiences you disagree with, admit when you’re wrong frequently and with pleasure, and don’t assume that, just because there isn’t a suitcase full of products, the person you’re speaking to isn’t trying to sell you something.