Simplicity Beats Nuance: Why the Wrong Answer Can Often Beat the Correct One

Nobody wants to be told their views are oversimplified. So, when thinking about issues, we tend to try to include as many factors as possible. The more nuanced our views become, the less likely they can be attacked for missing something important.

But are more nuanced ways of thinking about a problem actually more useful?

There’s at least some evidence that simpler models are better than more nuanced ones. Consider the example from Daniel Kahneman’s Thinking, Fast and Slow involving trained counselors trying to predict student grades after their freshman year:

“The counselors interviewed each student for forty-five minutes. They also had access to high school grades, several aptitude tests, and a four-page personal statement. [A simpler model] used only a fraction of this information: high school grades and one aptitude test. Nevertheless [the model] was more accurate than 11 of the 14 counselors.”

This result was far from unusual. Simple algorithms and models have repeatedly been found to beat nuanced expert judgment across a wide variety of professions, from stock picking to medicine.
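A “simple model” in Kahneman’s sense is often nothing more than a fixed-weight combination of one or two inputs. Here’s a minimal sketch of such a two-factor predictor; the weights and scaling are purely illustrative, not Kahneman’s actual formula:

```python
def predict_freshman_gpa(high_school_gpa, aptitude_percentile):
    """Toy two-factor linear model on a 4.0 GPA scale.

    Weights (0.7 / 0.3) are illustrative assumptions, chosen only to
    show the shape of the rule: a rigid weighted sum of two inputs.
    """
    aptitude_on_gpa_scale = aptitude_percentile / 100 * 4.0
    return 0.7 * high_school_gpa + 0.3 * aptitude_on_gpa_scale
```

The point isn’t the exact weights: it’s that a rigid rule like this, applied consistently to every student, outperformed most counselors’ holistic 45-minute judgments.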

The Problem with Nuance

Reality is complicated, so it seems only natural that models which attempt to capture this complexity will be more useful than simpler ones.

But a more nuanced mental model suffers from two problems that a simpler one doesn’t: implementation and falsifiability.

The problem of implementation is that if your model is more complicated, it can be much harder to say exactly what should be done to fix a problem.

Say you hold the (more nuanced) view that success in your career is a combination of aptitude, connections, credentials, politicking and economic forces. What does that mean you should do next? Unfortunately, with so many possibly relevant factors, it’s very hard to say.

Instead, if you held the (more simplified) view that success in your career depends on doing your job very well, you’d have a much clearer picture of what needs to be done.

This problem of implementation could be disregarded, however, if it weren’t for another problem: falsifiability. Namely, more complex theories are harder to disprove because they explain a lot more. Unfortunately, explaining more is not a virtue. A theory that explains every possible occurrence tells you nothing.

In our simplified career model, we could fairly easily learn that it wasn’t a good model of our particular career or position if we saw that doing good work wasn’t well correlated with success. This might prompt us to look for a different model better suited to our situation. A more nuanced explanation, however, can seem to fit almost any feedback from the environment, so it can be very hard to tell whether the approach you’re using is actually working.

Don’t Look for the “Correct” Approach, But Many Useful “Wrong” Ones

The problem with simple models is that reality isn’t simple, so often they are wrong. But, instead of trying to incorporate more and more factors into our decision-making process, I think it’s better to consider multiple simple perspectives in isolation and switch between them as they become more accurate.

Returning to our career example, you might consider several simple models of career success: one that says success is a function of your network, one that says it’s a function of your ability, another that says recognized credentials determine your standing, each held in isolation.

Then formulate your goals and plans on the basis of these simpler models and see how far they can take you. When you get stuck applying one model, adopt a different one.

Why Separate, Simpler Models Work Better than One, Nuanced Approach

Having separate, simple models works better than trying to have one perfect theory for a few reasons.

First, it avoids locking in on bad beliefs. If you accept, from the outset, that you’re going to be applying a simplified model, you’ll be less attached to it when it no longer applies. Accept that it might be useful in your situation and use it while it works. A perfect theory often struggles to adapt when the situation changes and you may get locked into a bad practice long after it has stopped working.

Second, simple models are easier to act on. When Cal and I were working on our career course, we decided to adopt the simple model that career success depends on ability. For many people it’s a pretty good, if imperfect, model. Once that was established, it became much easier to decide how to work toward that goal than it would have been if we were also trying to coach students on a million other possibly relevant, but confusing, factors.

Third, you can collect more simple models without having to declare one correct and all others wrong. Trying to maintain a nuanced, complete picture constantly forces you to defend your views against all contradictory models, which is often impossible. Having multiple, simplified models instead prompts you to ask, “Where might each of these apply?”

No Right Answers

I think this thinking strategy can be uncomfortable because we are always seeking the “correct” or “true” approach, despite the fact that for domains more complicated than particle physics, it’s unlikely that such a tractable truth exists in any absolute terms.

This leads to a philosophy of not seeking the “right” answer, but inviting multiple useful, but incomplete, answers. And while there may not be a correct answer, there are very often wrong ones: models that are useful in almost no situation. Being open to multiple possibly useful answers isn’t the same as saying every answer is useful.

Accepting simple models is merely accepting that all models are simplifications, and that it’s not possible to fully comprehend and exactly specify the correct answer in most areas of human interest. Since simplification is unavoidable, the goal isn’t to avoid it but to avoid becoming trapped by it.

  • Alberto

    At the same time, in a common conception of mastery, in the long term one breaks through plateaus to reach the next level of understanding, which is more subtle and nuanced. I think the contradiction may be only apparent: mastery gives subtler perception and judgement in a field, intrinsically, but that “understanding” cannot be formalised and is not to be considered a theoretical model. So, there is more subtlety in decision making as we progress (this seems undeniable to me, at least in creative fields), without real unification of deep theoretical models.

  • “All models are wrong, but some are useful.” (George Box)

  • Arthur Guerrero

    My thoughts: Simple models are good for beginners, because they allow them to make quicker progress and ‘get it’ faster. Hitting them with nuanced points could overwhelm them and even discourage them from investing more time in the activity.

    But, eventually, nuanced points will yield good results once there is a solid foundation for the ‘beginner’ to work with. Once you ‘get it,’ a nuanced point can be like taking a shortcut to your destination.

  • Deepti Km

    I’d distinguish between simple models and simple *strategies*, though. It’s not uncommon in engineering to build a sophisticated model to tune a simple strategy. For example, in control engineering, an engineer might build a complicated model of the system they want to design a controller for, and then decide, based on that model, that something simple like a bang-bang controller will produce optimal results. A bang-bang controller is essentially “go left if you’re too far to the right, and go right if you’re too far to the left” (see the Wikipedia article on bang–bang control). Part of the reason for favoring this algorithm is that it is easy to implement.

    To relate that to the career success strategy, your model needs to correctly tell you the direction of the ‘gradient’ of success in some significant variables, so that you can move in the direction of greater career success. You might do that with a simple strategy: focusing on one variable (core competence), since the partial derivative of career success with respect to core competence is positive and reasonably large according to the model (say). This strategy is easier to implement than trying to improve on some combination of competence, politicking and social skills. But *knowing* that multiple factors are at play prevents you from drawing wrong conclusions while switching strategies (say you increase your competence but stop showing up to work on time and stop showering; then you might become less successful and wrongly conclude that success decreases with competence).

    I think there are several different things going on in your article:
    1. Complex models are hard to validate, and may be more inaccurate than simple ones that can be validated (Kahneman’s example).
    2. Complex accurate models are harder to derive strategies from than simpler accurate models.
    3. Complex strategies are hard to implement and may be ineffective (the person who wants to improve on *all* the career success variables).
    4. “Nuance” is often a cover for equivocation. I know of a lot of arguments where people will cling to some exception to a model and insist that no conclusion can be drawn. This is a symptom of not being able to think quantitatively. For example, they might insist that *every* variable plays a role in career success, so one can never choose one to focus on, neglecting the natural next question of which one or two variables have the biggest (or most reliable, or most efficient) impact.

    I agree with this article on the whole, except I would say that simple *strategies* are better than complex ones, and models should be only as complex as needed to create a “reasonably sound” strategy.

    (By the way, @Scott, I don’t know if you’ve ever tried looking at career strategy through the lens of control engineering or dynamical systems, but if not I think you’ll find both of those topics interesting. I especially found that the concept of a phase plot changed the way I looked at strategies in general.)
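The bang-bang controller Deepti describes above can be sketched in a few lines; everything here (function names, the deadband threshold, the toy update loop) is illustrative, not a production control design:

```python
def bang_bang(position, setpoint, deadband=0.5):
    """Bang-bang control: push fully one way or the other, nothing in between.

    The deadband prevents rapid on/off chattering near the setpoint
    (an illustrative refinement; the simplest version omits it).
    """
    error = setpoint - position
    if error > deadband:
        return +1.0   # too far "left" of the setpoint: push right
    elif error < -deadband:
        return -1.0   # too far "right" of the setpoint: push left
    return 0.0        # close enough: do nothing

# Toy simulation: drive a system from position 5.0 toward setpoint 0.0.
position = 5.0
for _ in range(20):
    position += 0.4 * bang_bang(position, setpoint=0.0)
```

The appeal, as the comment notes, is implementability: the rule needs only the sign of the error, even if the model used to justify it was sophisticated.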

  • I’d say you were tempted to mention Trump in this more than once, Scott!

  • Tuğba Denizci

    Thank you so much, Scott. I learned a lot from you.

  • Scott Young

    I largely agree with your points. Some additional thoughts:

    Gradient search is an interesting career strategy, though it also has local maxima problems. Often the only way to move up in a career involves first stepping down (to get more education, credentials, or experience).

    Engineering disciplines, I think, overemphasize the knowability of many domains. As such, they tend to produce a mindset that is sometimes artificially certain, or enamored of complex, logically sound models and explanations. I’ve talked about this bias before.

  • Scott Young

    Hmmm honestly, I hadn’t thought of it!

  • Deepti Km

    Continuing along the vein of “I mostly agree, but,” I think the issue of local maxima is readily resolved if you use (estimated) lifetime career success as your metric. That way, increasing your capacity for future success by getting an education counts as moving “up”.

    While engineering methods can be applied wrongly to situations where they don’t work well (like stock markets), I think that is more a symptom of technical snobbery than an inherent limitation of the methods. I definitely agree that people tend to mess up when engineering-type methods are the only tools in their toolkit (and I’ve been guilty of this too).