How Much Can You Possibly Learn?

How much can the brain store?

We all know how much our computers and phones can store, if only because we occasionally get the pings of messages telling us we’ve taken too many photos or downloaded too many apps or movies and something has to be deleted to store more.

The brain doesn’t seem to be like this. While we do forget things, this seems to be more a matter of decay from disuse than being actively “pushed out” by new knowledge.

On the other hand, the brain is still subject to the same laws of the universe that govern everything else. It can’t possibly store infinite amounts of data; the laws of physics forbid it.

So how much can you actually learn?

Some Upper-Bounds on Memory

A good first attack on this kind of problem would be to look at the potential upper bounds of human memory, based on the laws of physics. These will be wildly too high, mostly because the brain is a living organism and not an idealized information storage medium, but they should give some starting points for thinking about it.

The brain is typically 1,100-1,300 cubic centimeters in volume. The maximum possible information you can cram into a volume that size is given by the Bousso bound, which works out to roughly 10^70 bits of information. However, in order to actually store this amount of information, your brain would have to become a black hole… so let’s try to tighten this bound further.
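To get a rough sense of where numbers like this come from, the holographic form of the bound ties a region’s maximum entropy to its surface area in Planck units. The sketch below assumes a spherical brain of about 1,200 cubic centimeters; depending on the exact version of the bound used, the figure shifts by a couple of orders of magnitude, but any version lands absurdly far above anything biology could use.

```python
import math

# Holographic form of the bound: the maximum entropy of a region scales with
# its surface area in Planck units, S <= A / (4 * l_p^2) nats.
# The spherical-brain shape and the volume are illustrative assumptions.
PLANCK_LENGTH = 1.616e-35          # metres
volume = 1.2e-3                    # ~1,200 cubic centimetres, in cubic metres

radius = (3 * volume / (4 * math.pi)) ** (1 / 3)
area = 4 * math.pi * radius ** 2   # surface area of the bounding sphere

max_bits = area / (4 * PLANCK_LENGTH**2 * math.log(2))
print(f"{max_bits:.1e} bits")      # on the order of 10^68 bits
```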

If we instead look at the information content possible in a brain-sized volume of water at room temperature, we end up with a more modest 10^25 bits of information. That’s one yottabyte of data, or 7-8 orders of magnitude more than the entire Internet.

This, of course, is still far too high, since most of the matter in the brain isn’t encoding data but keeping the brain alive and functioning.

We’re still a long way from understanding the exact information carrying capacity of the neurons and synapses in the brain. As such, an estimate of brain information capacity has to use a simplified approximation of how much information these connections can possibly store.

If we ballpark the brain’s capacity at one bit per synapse, its roughly 100 trillion synapses give us 100 trillion bits, or on the order of ten terabytes of data—similar to the size of a large hard drive. Even if synapses each store considerably more than one bit (through multiple connections or non-binary connection strengths), we might be able to bump that up to a petabyte, but probably not much more.
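The arithmetic behind these ballparks is straightforward; the synapse count and the bits-per-synapse figures below are illustrative assumptions, not measurements.

```python
# Ballpark capacity from synapse counts. Both the synapse count and the
# bits-per-synapse figures are illustrative assumptions, not measurements.
synapses = 100e12                            # ~100 trillion synapses

hard_drive_tb = synapses * 1 / 8 / 1e12      # terabytes, at 1 bit per synapse
petabytes = synapses * 80 / 8 / 1e15         # petabytes, at ~80 bits each

print(f"{hard_drive_tb:.1f} TB")             # 12.5 TB: a large hard drive
print(f"{petabytes:.1f} PB")                 # 1.0 PB if synapses hold many bits
```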

By this reckoning, the brain definitely can’t be storing more than a yottabyte of data, and it’s quite unlikely to be storing more than a petabyte.

Of course, this is still an upper bound. We know from people who suffer brain injuries that the knowledge stored in the brain likely has some measure of redundancy. Nor are those synaptic connections used exclusively for storing memories: many are used for processing or relaying information, and some may even be spurious, doing nothing at all. This suggests the brain’s memory storage may be a fair bit smaller still, though this estimate is far less certain than the hard upper bounds above.

Why Don’t People Ever “Fill Up” Their Brain’s Capacity?

One explanation for why we don’t seem to run out of space to learn new things is that learning may be much slower than our storage capacity requires. That is to say, the rate at which we actually absorb information may be slow enough that we never reach that capacity.

Here’s an analogy: imagine you have a modern hard drive which can store 10 TB of data, but you have to fill it over a dial-up connection which downloads at only 3 kilobytes per second. It would take over a century to fill, even if you were downloading constantly, without rest, and without deleting anything you had previously downloaded.
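The arithmetic, with round numbers:

```python
# Filling a 10 TB drive over a ~3 KB/s dial-up connection, in round numbers.
drive_bytes = 10e12                  # 10 TB
dialup_rate = 3e3                    # bytes per second

seconds = drive_bytes / dialup_rate
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.0f} years")          # roughly a century of nonstop downloading
```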

Given that likely only a fraction of our waking life is spent filling up our mental hard drive, and that we also forget, it may simply be that we never hit the upper bound of our mental capacity because we learn too slowly.

However, a different explanation might be that, unlike a computer, we don’t store memories as discrete files at all. Because we store them differently, running out of “space” has a different impact.

Vector Encoding, Learning and Forgetting

One of the most popular accounts of how the brain stores information comes from connectionism. On this account, the brain uses vector encoding: each memory is distributed over many thousands of individual synapses and neurons, rather than each atomic concept or memory having its own dedicated neuron.

In this view, there is no single “grandmother neuron” which holds the memory of your grandmother; instead, that memory is stored across many different neurons.

Each neuron may be involved in tens of thousands of memories, each contributing a tiny but necessary part to the processing that results in thinking of your grandmother, doing calculus or recognizing the words in this sentence.

What happens when we learn, therefore, is quite unlike storing a file on a computer, where each memory address exclusively and completely stores one piece of data. Instead, we “pile on” data on top of old data. As more and more memories are stored, older memories may get weaker and weaker, as they become harder to activate, since their contribution to the total network of weights is relatively small.
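This “piling on” picture can be sketched with a toy connectionist model. The sketch below uses a classic Hopfield-style network purely as an illustration (the sizes and update rule are arbitrary choices, not claims about real neurons): every pattern is stored in the same shared weight matrix, with no address holding any single memory, yet each memory can still be recovered from a corrupted cue.

```python
import numpy as np

# Toy distributed ("vector") storage: a Hopfield-style network stores every
# pattern in the SAME weight matrix, so each connection carries a tiny piece
# of many memories at once. Patterns are random +/-1 vectors.
rng = np.random.default_rng(0)
n = 200                                    # neurons
patterns = rng.choice([-1, 1], size=(5, n))

# Hebbian learning: pile every memory onto one shared set of weights.
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def recall(cue, steps=5):
    """Iteratively settle the network starting from a noisy cue."""
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
    return s

# Corrupt 20% of one stored pattern, then let the network restore it.
noisy = patterns[0].copy()
flip = rng.choice(n, size=40, replace=False)
noisy[flip] *= -1
restored = recall(noisy)
print(np.mean(restored == patterns[0]))    # close to 1.0: memory recovered
```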

Intuitively, this seems to jibe with our own experience of memory. Unlike a computer, which remembers things all-or-nothing, human memory fades and needs to be refreshed, or it becomes harder and harder to summon up. A memory may even still exist in our brains but be unretrievable, until the exact pattern of triggers calls it up again.

If this is how our memory fills its capacity, it may be that we encounter the limits of our brain’s storage ability all the time.

What Kind of Memories Get Overwritten?

A further idea is that this fading process, where accumulated new memories make older memories harder to retrieve, may depend on the kind of information learned. Information that must be finely discriminated from similar, but different, possibilities might require more storage than memories which are easy to tell apart.

This might explain the interference effect in learning new languages. If you learn Spanish and then Portuguese, for instance, you may struggle to recall the Spanish words that differ from their Portuguese counterparts. Because the languages are similar overall, the points where their vocabularies diverge are especially prone to getting confused.
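The same kind of toy model illustrates why similar items interfere more than distinct ones. As a purely illustrative sketch (a Hopfield-style network again, not a model of actual language learning), storing ten mutually similar patterns degrades recall far more than storing ten unrelated ones:

```python
import numpy as np

# Illustration only: similar stored patterns interfere; distinct ones don't.
rng = np.random.default_rng(1)
n = 200                                  # neurons

def recall_accuracy(patterns):
    """Store all patterns in shared weights, then recall the first one."""
    W = sum(np.outer(p, p) for p in patterns) / n
    np.fill_diagonal(W, 0)
    s = patterns[0].copy()               # cue with the exact stored pattern
    for _ in range(5):
        s = np.sign(W @ s)
    return np.mean(s == patterns[0])

# Ten unrelated random patterns vs. ten patterns sharing ~80% of their bits
# (a common "base" pattern with 10% of bits flipped independently in each).
random_patterns = rng.choice([-1, 1], size=(10, n))
base = rng.choice([-1, 1], size=n)
similar_patterns = np.array(
    [base * rng.choice([1, -1], size=n, p=[0.9, 0.1]) for _ in range(10)]
)

random_acc = recall_accuracy(random_patterns)
similar_acc = recall_accuracy(similar_patterns)
print(random_acc, similar_acc)   # distinct memories recall better than similar
```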

A more extreme example comes from mnemonics. People who train themselves to memorize vast swaths of information with techniques like the memory palace have to be careful not to reuse the same locations to map two different sets of things they want to retain. Without care, the connections get mixed up and old memories may become unretrievable.

By this account, you should be more careful when learning information that is quite similar to something you already know but differs in use (say, closely related languages) than when learning fields that are either completely different (chemistry and art history) or complementary (physics and math).

My own experience learning multiple languages suggests that, if you want to go down that route, you often need to invest time switching between the languages, so the cues that distinguish their vocabularies get reinforced and the languages don’t blur together.

Should You Worry About Running Out of Space?

Probably not. Even if the brain has a more limited capacity than an idealized physical storage medium, and even if we occasionally hit the limits of memory as old ideas and concepts get crowded out, my sense is that running out of memory isn’t a real concern for almost anyone.

One reason might be that memory decay happens naturally, interference or not. If so, trying to “save” space in your mind by avoiding learning unnecessary things may do nothing to slow forgetting compared with learning constantly.

Another reason is that many memories support each other; learning one thing often connects to another. If retrieval, not storage, is the major flaw in our memory hardware, then investing heavily in memory cues more than makes up for whatever extra space they consume.

For extremely memory-intensive subjects or tasks, it may be possible to reach a saturation point, where new memories can only be created at the expense of old ones. I could imagine, for instance, that there’s an upper bound on how many languages one can learn to mastery, since each may require remembering hundreds of thousands of pieces of linguistic information.

However, it may be that those bounds are reached because natural decay, and thus the need to practice previously learned information to keep it active, eventually overwhelms the ability to learn new things. Your memory capacity would then be hit not because you suddenly run out of space one day, but because maintaining everything else you’ve learned requires 100% of your time.

If this is the case, though, the limit is likely far beyond what most of us will ever experience. Polyglots like Alexander Arguelles have proficiency in 50+ languages, albeit through lifelong devotion. If there is a threshold for the maximum number of languages a person could maintain, it might be well over a hundred, given that Arguelles is still a human being who needs to eat, sleep and do things other than learn languages.

We also can’t discount the possibility that the brain itself expands its capacity under pressure to learn more. London taxicab drivers, who must memorize the city’s infamously complex road system, sights and stops, have larger hippocampi (a part of the brain involved in forming long-term memories). What’s more, this seems to be caused by their intensive study, rather than people with larger hippocampi disproportionately becoming taxi drivers.

Albert Einstein’s brain supposedly had larger sections related to spatial reasoning and visualization. That could have been a genetic endowment, but it’s also possible it was the outcome of years of strenuous thought experiments imagining the warps and curvatures of spacetime. If the brain’s learning capacity is itself plastic, that lends extra weight to the idea that you shouldn’t hold back from learning in order to “save” capacity.

As a practical matter, the storage limits of the human brain probably shouldn’t be a concern for everyday learning. However, by better understanding how your memory works, you can maximize what you’re able to learn.

Is Reading Blogs Like This One Keeping You From Improving Your Life?

One idea I’ve been pondering over lately is to what extent reading about self-improvement is a complement versus a substitute for taking self-improvement action.

Complements and substitutes are terms that come from economics. A complement to a product is something you buy more of when you buy the product. Think popcorn and movies. The more movies you go to, the more likely you are to buy popcorn (at least here in North America). Wine and fine dining. Cars and gasoline. These are all complements.

The standard view of self-improvement writing, whether it’s fitness books, cookbooks, business books or popular psychology, is that it should assist with personal development. That is to say, the more books you read, the more likely you are to actually work on improving yourself.

The alternative view, of course, is that reading isn’t a complement but a substitute.

Substitutes, again from economics, are products that compete with each other. When you go to the movie theater, each movie acts as a partial substitute for the others: if you see one, you forego the rest. Similarly, Italian wines and French wines are substitutes, as is gasoline from different gas stations.

Here the theory is that what we really want out of personal development, both from active efforts and passive consumption, is the good feeling that we’re doing something to improve our situation. It’s an anxiety-reducing sense that the challenges we’re facing are somehow being dealt with, even if they aren’t being resolved immediately. Since both reading and doing alleviate this tension, they are substitutes, and consuming one will decrease consumption of the other.

Which Dominates: The Substitute or Complement?

My own feeling is that both of these effects exist, and which dominates won’t be a universal consideration but will depend on a lot of factors.

One factor probably has to do with the material itself. Some types of self-improvement writing work mostly as substitutes: they deliver a large emotional payoff (and thus work well for anxiety reduction) but are weak on substantive follow-through, which makes them poor complements.

Another factor, however, is probably the nature of the self-improvement task itself. Some types of self-improvement work are probably particularly difficult, emotionally unrewarding and complicated. Because the anxiety-reduction from working in those areas is so hard to come by, it might be easier to seek it from consuming self-improvement material passively instead.

While I’d like to think I avoid the worst of the self-help vapidness of the former, I admit that many of the self-improvement problems I focus on here are exactly the kind of nebulous, hard-to-work-on, abstract categories that may encourage substitution.

As an example, compare what I write to a cookbook. The latter has a quite straightforward application, and therefore is more likely to serve as a complement to actual cooking. Of course it’s not a pure effect—some people buy cookbooks to alleviate the anxiety that they don’t cook enough, or need to learn to cook, and then don’t actually use them. But I imagine this is not the majority.

On the other hand, consider improving your ability to learn effectively, combat procrastination or improve your career. These are all hard pursuits that, even when you’re doing them right, have mixed emotional payoffs in the short-term. As such, I can imagine people consuming self-improvement material here as a way to get that emotional payoff more reliably than doing the actual work.

Note: Let me be clear that I don’t think anyone is actually conscious of this distinction, even if they abuse it. What probably happens, psychologically, is that there’s a desire to improve something, or more accurately, a desire to feel like things are going to be improved. When this desire is strong enough, it can trigger motivation to do something about it. Sometimes that manifests as taking action. Sometimes that manifests as buying a book that you tell yourself you’ll use, but never do.

The Problem with Substitution

I had a conversation with a friend of mine who is doing her doctorate in clinical psychology. I mentioned another friend of ours who was struggling with anxiety and to whom I had been trying to offer encouragement.

To my surprise, she told me this was likely harmful in the long run. The best treatment doesn’t reduce patients’ anxiety by consoling them, but by having them confront it, which increases the anxiety in the short run but reduces its intensity the next time it comes up.

The problem here was, again, one of substitution. Consolation lets you escape from your anxiety. It provides short-term relief, but it only reinforces the pattern that created the anxiety in the first place.

This is the problem with the substitution effect in self-improvement. If reading a blog like this, buying books or doing “research” is how you reduce the tension you feel that something needs to be improved, that can be potentially harmful. Without doing something and solving the underlying issue, consuming more information is counterproductive.

How to Decide If You Need a Break

I’ve experienced self-improvement material both as a complement and as a substitute in my own life.

The key difference, it seems to me, is how much effort you’re expending to actively work on the areas you’re reading about. If the answer is low or zero, and has been for some time, while the amount of time you spend consuming information is significant, you may be experiencing a substitution effect.

I’ve had success with cutting down my consumption. That momentarily increases the angst about whatever issue you wanted the material to solve, but that same energy can hopefully be redirected toward taking action.

Similarly, I’ve had times when I was engaged in a lot of action and really benefited from complementary material guiding me through it. Unfortunately, this isn’t an area where I can give an easy prescription to read less or read more; both might be useful. Instead, you need to look more closely at yourself: what are you using the time you spend reading books and blogs for? Is it a substitute for real action, or a complement to it? Only you can decide.
