A reader emailed me after I wrote about learning calculus in five days:
“I question that you’re just a person of average intelligence who knows how to learn faster. I can’t imagine ever finishing an MIT class in 5 days.”
My response to him was that, of course, I was probably smarter than average. If we measure intelligence, in part, by a person's ability to learn quickly, then learning faster than average would imply being smarter than average. He responded with:
“Would you say someone with average genes could learn a class in 5 days with the right techniques?”
The question doesn’t make any sense, but I feel it represents a common viewpoint, so I wanted to share my thoughts here. What on earth does “average genes” mean?
Do You Need Good Genes to Be Smart?
The real problem is that society has extremely loaded definitions of the words “smart” and “intelligence,” which tend to preclude any meaningful discussion. Worse, these loaded definitions are now mixed up with people’s equally confused ideas about genetics.
I want to clarify some misconceptions:
- You can’t have an IQ.
- When researchers say 50% of intelligence is heritable, they don’t mean 50% of your intelligence is caused by genes.
- Innate talent doesn’t mean immutable talent.
- No, not all people are created equal, but that’s true of everything.
“Maybe You Just Have a Really High IQ?”
This first misconception bothers me the most. An IQ is never something you can possess; it is merely the result of a test you have taken. Think of an algebra test. You would never say, “I have an algebra quotient of 120!” So why do people say “I have an IQ of 120”?
It may seem like I’m splitting hairs, but it’s a big difference. IQ is not intended to measure only innate, immutable ability. It also measures knowledge you’ve accumulated through experience.
IQ is calculated by comparing your score to those of people the same age as you, using a bell curve. So a particularly precocious 10-year-old who scores 150 isn’t necessarily smarter than I am.
In fact, this sorting by age is necessary, because otherwise people’s IQs would grow steadily higher as they aged. This only makes sense: if IQ tests also measure knowledge and cognitive skill, then you would expect to score better as you accumulate more experience.
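To make the age-norming concrete, here is a minimal sketch of the standard scaling: a raw score is converted to a z-score within an age cohort, then mapped onto a bell curve with mean 100 and standard deviation 15. The function name and the cohort’s raw scores are hypothetical, invented for illustration.

```python
from statistics import mean, stdev

def iq_from_raw(raw_score, cohort_raw_scores):
    """Convert a raw test score into an IQ by comparing it to an age
    cohort: z-score scaled to mean 100, standard deviation 15."""
    mu = mean(cohort_raw_scores)
    sigma = stdev(cohort_raw_scores)
    z = (raw_score - mu) / sigma
    return 100 + 15 * z

# Hypothetical raw scores for a cohort of 10-year-olds.
ten_year_olds = [18, 22, 25, 20, 24, 21, 19, 23, 26, 22]

# Scoring at the cohort average maps to exactly 100; the cohort's top
# raw score maps well above 100 -- relative to other 10-year-olds.
print(iq_from_raw(22, ten_year_olds))
print(iq_from_raw(26, ten_year_olds))
```

The key point is that the same raw score would map to a very different IQ against an adult cohort, which is why a child’s 150 and an adult’s 150 aren’t directly comparable.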
The other misconception about IQ is that it measures general intelligence. I firmly believe that no such single scalar quantity exists. Studies of chess grandmasters show that they are not particularly more intelligent at other things than an average person, despite being geniuses at chess. Intelligence, it turns out, is highly contextual.
I don’t think IQ tests are useless, and most psychologists who use them understand their strengths and their limitations. The problem is that a misinformed public has grossly exaggerated both their relevance and the idea that intelligence is something you can pin down to a number.
What Does it Mean When Scientists Say 50% of Intelligence Is Heritable?
Studies generally show that about 50% of intelligence is heritable, with the other 50% coming from the environment. This type of research is usually grounded in studies of identical twins raised apart, where the effects of genes can be separated from those of a shared upbringing.
Now most people take these results and interpret them as saying that 50% of their intelligence is caused by their genes. This is incorrect.
When scientists say 50% of intelligence is heritable, they mean that, across the population being sampled, 50% of the variation from the average in intelligence can be explained by differences in genes. This sounds similar to saying your smarts are half genes, but it’s not.
For example, let’s say we took 100 boys and raised them in barrels, where they received no sunlight or human contact of any kind, until they were 15. Then we gave them an IQ test immediately after sending them into the world.
For this population, 100% of the deviation in IQ would be explained by genes. After all, their environments were identical, so it cannot be a factor. However, this doesn’t mean that the boys’ general mental ability is the result of their genes—clearly their environment was the key factor preventing them from being normal.
Scientists cannot, at this point in time, say what percentage of your ability comes from genes. All they can do is observe variation across populations. That’s very useful information, but just because 50% of the variation in IQ is explained by genes, that doesn’t mean 50% of your individual intelligence is caused by genes.
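The barrel thought experiment can be simulated directly. The sketch below is a toy model with made-up numbers, not any published study: each person’s score is a genetic component plus an environmental component, and heritability is measured as the share of score variance that genetic variance explains.

```python
import random
from statistics import pvariance

random.seed(0)

def population(n, env_spread):
    """Toy model: score = genetic component + environmental component."""
    genes = [random.gauss(0, 10) for _ in range(n)]
    env = [random.gauss(0, env_spread) for _ in range(n)]
    scores = [g + e for g, e in zip(genes, env)]
    return genes, scores

def heritability(genes, scores):
    """Fraction of score variance explained by genetic variance
    (valid here because genes and environment are independent)."""
    return pvariance(genes) / pvariance(scores)

# Normal population: environment varies as much as genes,
# so heritability comes out around 0.5.
g1, s1 = population(10_000, env_spread=10)

# "Barrel" population: everyone gets an identical environment
# (env_spread = 0), so ALL remaining variation is genetic and
# heritability is 1.0 -- even though that shared environment
# devastated everyone's ability equally.
g2, s2 = population(10_000, env_spread=0)

print(round(heritability(g1, s1), 2), heritability(g2, s2))
```

The barrel case shows why “100% heritable in this sample” says nothing about how much the environment shaped the absolute level of ability, only about what explains the differences within the group.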
How Much Do Genetic Advantages Matter Anyway?
This confusion is compounded by another common misconception. First, people hear the word “genes” and believe it means innate talent. Second, they believe that innate talent means immutable ability. Although these things could be true, neither is a logical consequence of the other.
First, what is the difference between genetic predisposition and innate talent? Genes could cause a natural aptitude in math, for example. But a genetic predisposition could also mean that you simply enjoy math more, so you’re more likely to work hard at it.
The fact that researchers can see population-level trends explained by heritability does not mean the mechanism of action is well understood.
The second, even more tenuous, jump people make is assuming that if talent is innate (or caused by genes) that it is necessarily impossible to change. This one is demonstrably false.
A really simple example is hair color. Hair color is mostly genetic. But I can purchase a bottle of hair dye for a few dollars and gain nearly 100% control over the actual color of my hair. In this case, the innate attribute of hair color is entirely modifiable by technique.
The same is true of talent. Yes, some people may have more natural aptitude, but that doesn’t mean that you’re stuck with what you’re born with. Even if an attribute was mostly genetic, that wouldn’t preclude new methods from being able to improve that attribute.
My favorite example of this is the 4-minute mile. Before 1954, no human being had ever run a mile in under four minutes. Then Roger Bannister completed his run in 3:59.4. Since then, thousands of people have run a 4-minute mile.
Running, I feel, has a lot of similarities to intelligence. There is definitely an element of innate advantage; Bannister was an incredible athlete. But the gene pool didn’t suddenly change in 1954. Instead, a new approach to training allowed other people to push past the barrier as well.
The question I got was whether anyone can complete an MIT class in 4.5 days. I don’t think most people could, just as I know I can’t run a 4-minute mile. But that doesn’t mean you can’t learn better, just as Bannister’s mark being out of reach doesn’t mean you can’t train to run faster.
No, People Are Not Created Equal
Yes, some people have innate advantages over other people. Life is unfair. I’m not trying to spin this article into saying that there don’t exist fantastic geniuses whose abilities are largely unearned.
But it frustrates me when I see people making equally absurd chains of reasoning to assume that because some people have a head start, that there’s no point trying to learn better.
My methods may not turn out to be fully correct. I’ve worked with a lot of students, but I don’t forget that many of my ideas haven’t been subjected to peer-reviewed experiments demonstrating causality. But I do feel that, regardless of your opinion on metaphors, the Feynman technique, or flow-based notetaking, anyone can improve how they learn, just as they can improve anything else.
Speaking of learning faster, I just finished my second week (and second class) of my MIT Challenge to learn MIT’s computer science curriculum in 12 months, without taking classes. You can see the video below, where I discuss how you can watch lectures at 2x speed and how to learn math faster, with links to free resources I’ve been using: