Break Your Map

In the process of getting better at something, there are two kinds of mistakes that hold you back. The first is the mistake of not knowing: not knowing how the market works, which major to choose, what to do.

If I wanted to start a business selling industrial solvents, I would suffer from the first kind of error. I have no idea how the industry works (or much about solvents, for that matter). Ignorance holds me back.

Ignorance, however, isn’t too hard to fix. If I spent several months researching, I could probably have a decent idea of how the industry works. If I spent several years working in it, I’d know even more. The first error has a straightforward remedy—learn more.

The second kind of mistake, far more insidious, is believing things that happen to be wrong. If you’ve convinced yourself that a hill is a valley, it will take a lot of climbing before you realize you were wrong. I worry more about this second mistake.

The Map and the Territory

We spend our lives devising theories to explain the world. These theories form crude maps of the impossibly complex terrain of our lives. We have a map for our careers, a map for our relationships, a map for our beliefs about the meaning of our lives.

Maps are good. Even a map that is occasionally wrong is a lot better than no map. Philosophical skepticism may have its adherents, but it’s utterly impractical. You must have beliefs about the world to make decisions, and even imperfect beliefs are better than none.

But the map is not the territory. The territory is alien, strange and perhaps even incomprehensibly complex. Any map-making process undertaken by an individual over the course of one lifetime is going to be error-ridden.

The rational thing to do is a cost-benefit analysis: if fixing our map costs fewer resources than the benefits a correct map would yield, fix the map. Yet human beings rarely do the rational thing.

Confirmation Bias and Protecting Our Maps

It turns out we don’t follow the rational process for map fixing. Through a set of interesting experiments, psychologists have shown that instead of hunting for the information that would force us to change our maps, we seek out information that confirms what we already “know”.

The experiment was ingenious. Subjects were given a set of three numbers, such as 2, 4, 6, and told that it fit a secret pattern. The task was to guess the secret pattern by suggesting further sets of three numbers, which the experimenter would say either fit or didn’t fit the pattern.

Given only one data point, participants could dream up many possible hypotheses. The numbers could all be even, for example, or the middle number could be the average of the first and last.

The rational method for testing these hypotheses would be to choose counterexamples. If you believed the numbers were all even, try 3, 4, 6 and see whether it fits. If it did, you’d know that your only-evens rule was not the correct one.

This wasn’t how subjects proceeded, however. Instead, they picked examples that confirmed their existing hypothesis. All-evens testers would pick 4, 8, 10 or 2, 6, 12 as candidate patterns, seeking validation for their theory.

The problem with this method was that the actual rule was “any ascending numbers,” so the previous two examples would have fit, but so would 1, 3, 12 or 3, 9, 11. Testing hypotheses by seeking confirmation couldn’t pin down the secret rule.

What relevance does this have outside the laboratory? The relevance is that we look for information to support our theories, not to break them. We try to protect our maps instead of pointing out where they may be flawed. Worse, when we expend energy trying to improve our maps, the methods we default to are unsound.

Looking Around the Edges

The most profitable way to win the secret-rule game of the experiment is not to pick random counterexamples. After seeing 2, 4, 6 validate, picking 1, 17, 4 and seeing it fail doesn’t teach you much. Instead, the best bet is to break the edges of your rule: make one number odd, flip the order, make two the same.

The same strategy is effective in life: test around the edges of your map, so you’ll know where to redraw it. By breaking your map in precise ways, you get more information than you would by seeking confirmation or pulling counterexamples out of a hat.
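
To make the contrast concrete, here is a minimal Python sketch of the secret-rule game. The hidden rule (“any ascending numbers”) and the 2, 4, 6 framing come from the experiment above; the specific candidate guesses are only illustrations I’ve chosen, not data from the original study.

    def fits_secret_rule(triple):
        """The experimenter's hidden rule: the three numbers are strictly ascending."""
        a, b, c = triple
        return a < b < c

    # Confirmation-seeking guesses: all of these fit a subject's "all even" theory...
    confirming_guesses = [(4, 8, 10), (2, 6, 12)]

    # ...while edge-breaking guesses perturb one feature at a time:
    # make a number odd, flip the order, repeat a value.
    edge_breaking_guesses = [(3, 4, 6), (6, 4, 2), (2, 4, 4)]

    for guess in confirming_guesses + edge_breaking_guesses:
        verdict = "fits" if fits_secret_rule(guess) else "does not fit"
        print(guess, verdict, "the secret rule")

The confirming guesses all fit, so they can’t distinguish “all even” from “any ascending numbers.” The edge-breaking guesses expose the difference: 3, 4, 6 fits despite containing an odd number, while 6, 4, 2 and 2, 4, 4 do not.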

I’ll give an example from my business. When I first launched Learning on Steroids, it was unusually successful compared to my previous business efforts, and I wanted to know which principles drove that success so I could apply those insights in the future. Here were some candidate hypotheses:

  • Monthly billing over one-time sale.
  • Conducting an email-based launch.
  • Restricting capacity.
  • Restricting registration time.
  • Having a clearer service component (in earlier editions I placed more emphasis on being able to reach me for feedback).

All of these could have been valid, some combination of them could have been, or none of them might have been the underlying cause of the recent success.

My approach to testing these hypotheses was to vary the variables individually in future launches. Later, I did launches that had one-time courses, no capacity restrictions and a downplayed service component. I couldn’t always test each variable in perfect isolation, but in nearly every launch the permutation of these variables was somewhat different.

In retrospect, my hypothesis is that #2, conducting an email-based launch, is the only consistent winner. Restricting capacity had mixed results, and restricting registration time had a minor but positive effect. Service components were not important, but that could have been an artifact of the price points tested.

My map is far from perfect now, but it is a lot better than it was when I started, which I believe is a large part of the reason my business is generating four times the revenue it did when I formed those initial hypotheses.

Researching Edge Cases

You often don’t need to run an experiment to break your map at its edge cases and update it toward more accurate beliefs. Sometimes a bit of research can reveal edge-case counterexamples that force you to re-evaluate your thinking.

Cal Newport recently shared an example from his own journey toward becoming a tenured academic. Instead of browsing through random examples and trying to confirm his existing hypotheses, he looked for a natural experiment: take a group of PhD graduates from the same graduating class who differed greatly in their eventual success, and look at what they did differently in their early careers.

Studying the two groups, he found the biggest differences were the number of papers published (the successful group had more publications) and the number of citations, a rough indicator of quality. Using those as benchmarks, Cal could easily home in on the precise metrics success required in his field.

Research, as opposed to direct experimentation, is useful when the time frame over which you expect to see results is very long. I could experiment directly on my launch strategy because I could repeat it every 3-6 months. Cal was better off looking for natural experiments because the time frame to observe results was measured in decades.

Comfort in Contradiction

To me, the idea of map-breaking is unsettling and counterintuitive. Our brains aren’t hard-wired to think this way, so it always takes a deliberate effort to apply.

The challenge for me is becoming comfortable with spending a lot of mental energy constructing explanatory theories, and then seeking to tear them down. We’d rather keep building than admit that what we’ve built may rest on a shaky foundation.

One step I’ve found helpful in combating this urge (and one often derided by outsiders) is simply to allow yourself to temporarily hold contradictory beliefs. Believing that your theories are themselves a work in progress can allow you to recognize the validity of part of the map, even if you don’t yet know how to connect it to the other parts.

Ultimately, confirmation bias is in our nature and can’t be completely avoided. With effort, however, I think we can guard against it when we design the larger experiments and research projects that redraw the lines on our map.

  • Jim Stone

    Much better to embrace a contradiction than to prematurely give up on one of the principles or come to a hasty resolution that tangles your map.

  • dag

    I guess we can all hold an opposite view for a time and see whether it works better. I doubt we can all endure it, even if the experiment is successful.

  • Arjan

    It’s like what philosopher Karl Popper advocated: try to prove your hypothesis wrong.

  • Joe

    Hi Scott,
    I’m leaving you a message here just to say thank you.
    I am from China, a place where many years of effort go only toward examinations, and rote memorization is promoted as “hard work”.

    Then one day I read your ebooks, such as “Learn More, Study Less” and “Holistic Learning”. In that moment something seemed to wake up in my mind. Even now I often revisit and refresh some “gold” concepts such as the Feynman Technique, visualization, metaphor, and meditation, and I try to use them in my daily study and life. It is not easy to act on them, but the process has helped and changed me a lot.

    So I have to say thanks a lot. I appreciate your genius and generosity; not every clever person loves to share his thoughts to help others. It’s my luck to have found you and learned from you.

    Thank you again, and
    best regards

  • Li

    I want to learn more about rapid study methods.

  • Donny

    Possibly one of the most well-written blog posts on the nature of subjectivity in relation to life that I’ve ever read. Humans’ experience of reality and the creation of maps have long been a passion of mine and something that led me to study subjectivity in graduate school. Not that such a field helps get a job or anything… >_>

    Eventually I’d like to take my research into a PhD. The notion of “maps” becomes really interesting when you get to the society-wide ones that academia refers to as grand narratives. If we look at post-modern thinking from scholars like Baudrillard, the world is now in a state where we pretty much know that our maps are fake representations of reality, but we cling to their vestiges because, as you pointed out, humans need their maps.

    My particular area of study has been Japan, and the symptoms of post-modernism are appallingly apparent here. I think the area of subjectivity really needs to be examined, as it is central to human advancement.

  • Tony Khuon

    Confirmation bias is a big mental hurdle. One idea I picked up from Clayton Christensen’s latest book is to ask yourself “What has to prove true for this to work?” It forces you to examine your assumptions and seek disconfirming scenarios. To go “off the map” so to speak. Thanks for the post!

  • Elisa

    Very well written and definitely food for thought!
    I will try to implement it in order to find out why some students in my degree are more outstanding than others.
    All the best,
    Elisa
