Scott H Young - https://www.scotthyoung.com/blog

How Much Can You Possibly Learn?

How much can the brain store?

We all know how much our computers and phones can store, if only because we occasionally get pinged with messages telling us we’ve taken too many photos or downloaded too many apps or movies, and something has to be deleted to make room.

The brain doesn’t seem to be like this. While we do forget things, this seems to be more a matter of decay from disuse than being actively “pushed out” by new knowledge.

On the other hand, the brain is still subject to the same laws of the universe that govern everything else. It can’t store an infinite amount of data; a finite volume of matter has only a finite information capacity.

So how much can you actually learn?

Some Upper-Bounds on Memory

A good first attack on this kind of problem would be to look at the potential upper bounds of human memory, based on the laws of physics. These will be wildly too high, mostly because the brain is a living organism and not an idealized information storage medium, but they should give some starting points for thinking about it.

An adult human brain is typically around 1,100-1,300 cubic centimeters. The maximum possible information you can cram into a volume that size is defined by the Bousso bound [1], which works out to roughly 10^70 bits of information. However, to hold that much information, your brain would have to collapse into a black hole… so let’s try to reduce this bound further.
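As a sanity check, here’s that back-of-envelope calculation in a few lines of Python. The spherical shape and the ~1,200 cm³ volume are simplifying assumptions, and the answer shifts with them; the point is only the absurd order of magnitude.

```python
import math

# Holographic-style bound on the information in a brain-sized region.
# Assumptions: a sphere of ~1,200 cm^3 and the standard Planck length.
V = 1.2e-3                                 # volume in m^3
l_p = 1.616e-35                            # Planck length in m

r = (3 * V / (4 * math.pi)) ** (1 / 3)     # radius of the equivalent sphere
A = 4 * math.pi * r**2                     # its surface area

# The bound caps the entropy inside at A / (4 * l_p^2) nats;
# dividing by ln 2 converts nats to bits.
bits = A / (4 * l_p**2 * math.log(2))
print(f"~10^{math.log10(bits):.0f} bits")  # ~10^68 bits
```

Under these particular assumptions it lands nearer 10^68 than 10^70, but at these scales a couple of orders of magnitude changes nothing: it is unimaginably more than any biological system could use.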

If we instead ask how much information could be held in a brain-sized volume of water at room temperature, we end up with a more modest 10^25 bits of information. This is about one yottabyte [2] of data, or 7-8 orders of magnitude more data than the entire Internet.
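You can reproduce this kind of estimate from tabulated thermodynamic data. A minimal sketch, assuming the brain is simply ~1.2 kg of room-temperature water and using water’s standard molar entropy; depending on the exact assumptions it lands within an order of magnitude or two of the figure above, but firmly in yottabyte territory.

```python
import math

# Thermodynamic entropy of ~1.2 kg of water at 25 C, converted to bits.
mass_g = 1200.0
moles = mass_g / 18.0                  # molar mass of water ~18 g/mol
S = moles * 69.95                      # standard molar entropy: ~70 J/(mol*K)
k_B = 1.380649e-23                     # Boltzmann constant, J/K

bits = S / (k_B * math.log(2))         # entropy expressed in bits
print(f"~{bits:.0e} bits")             # ~5e+26 bits
```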

This, of course, is still way too high, since most of the matter in the brain isn’t encoding data but keeping the organ alive and functioning.

We’re still a long way from understanding the exact information-carrying capacity of the neurons and synapses in the brain. As such, any estimate of the brain’s information capacity has to use a simplified approximation of how much information these connections can possibly store.

If we ballpark the amount of data that can be stored in the brain as roughly one bit per synapse, that leaves us with 100 trillion bits, or roughly ten terabytes of data—similar to the size of a large hard drive. Even if we imagine that synapses each store considerably more than a single bit (through multiple connections or non-binary connection strengths), we might be able to bump that up to a petabyte [3], but probably not much more.
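The arithmetic, spelled out (the 10^14 synapse count is the commonly cited ballpark; the bits-per-synapse values are just assumptions for illustration):

```python
SYNAPSES = 1e14                        # ~100 trillion synapses

for bits_per_synapse in (1, 10, 100):
    total_bits = SYNAPSES * bits_per_synapse
    terabytes = total_bits / 8 / 1e12  # bits -> bytes -> terabytes
    print(f"{bits_per_synapse:>3} bit(s) per synapse -> {terabytes:8,.1f} TB")
# 1 bit per synapse gives ~12.5 TB (a large hard drive);
# 100 bits per synapse gives ~1,250 TB (about a petabyte).
```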

By this reckoning, the brain definitely can’t be storing more than a yottabyte of data, and it’s quite unlikely to be storing more than a petabyte.

Of course, this is still an upper bound. We know from people who suffer brain injuries that the knowledge stored in the brain likely has some measure of redundancy. Nor are those synaptic connections used exclusively for storing memories: many are used for processing or relaying information, and some may even be spurious, not doing anything at all. This suggests the memory storage of the brain might be a fair bit smaller still, though this estimate is much less certain than the harder upper bounds above.

Why Don’t People Ever “Fill Up” Their Brain’s Capacity?

One explanation for why we don’t seem to run out of space to learn new things is that learning may be far slower than our storage capacity is large. That is to say, the rate at which we actually take in information may be slow enough that we never reach capacity.

Here’s an analogy: imagine you have a latest-generation hard drive which can store 10 TB of data, but you have to fill it up using a dial-up connection that downloads at only 3 kilobytes per second. It would take over a century to fill it up, even if you were downloading constantly, without rest, and without deleting anything you previously downloaded.
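Checking the arithmetic on that analogy (using the 10 TB and 3 kB/s figures as given):

```python
DRIVE_BYTES = 10e12                    # 10 TB drive
RATE_BPS = 3e3                         # 3 kB/s dial-up

seconds = DRIVE_BYTES / RATE_BPS
years = seconds / (365.25 * 24 * 3600)
print(f"{years:.0f} years of nonstop downloading")   # ~106 years
```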

Given that only a fraction of our waking time is likely spent filling up our mental hard drive, and that we also suffer from forgetting, it may simply be that we never experience the upper bound of our mental capacity because we learn too slowly.

However, a different explanation might be that, unlike a computer, we don’t store memories as discrete files in the first place. Because we store them differently, running out of “space” has a different impact.

Vector Encoding, Learning and Forgetting

One of the most popular accounts of how the brain stores information comes from connectionism [4]. On this view, the brain uses vector encoding: each memory is distributed over many thousands of individual synapses and neurons, rather than a single neuron storing each atomic concept or memory.

There is no “grandmother neuron” which single-handedly holds the memory of your grandmother; instead, that memory is stored across many different neurons.

Each neuron may be involved in tens of thousands of memories, each contributing a tiny, but necessary, part to the processing that results in thinking of your grandmother, doing calculus or recognizing the words in this sentence.

What happens when we learn, therefore, is quite unlike storing a file on a computer, where each memory address exclusively and completely stores one piece of data. Instead, we “pile” new data on top of old data. As more and more memories are stored, older memories may get weaker and weaker and harder to activate, since their contribution to the total network of weights becomes relatively small.
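Here is a toy illustration of that “piling on”, using a small Hopfield network, a classic connectionist model (an idealization for demonstration, not a claim about the brain’s actual wiring). Every memory is superimposed on a single shared weight matrix, and recall of the oldest memory degrades as more memories are added:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200                                    # neurons in the toy network

def recall_quality(num_memories):
    # Random +/-1 patterns, all stored by Hebbian learning in ONE matrix.
    patterns = rng.choice([-1, 1], size=(num_memories, N))
    W = (patterns.T @ patterns) / N
    np.fill_diagonal(W, 0)
    x = patterns[0].astype(float)          # cue with the oldest memory itself
    for _ in range(10):                    # let the network settle
        x = np.sign(W @ x)
        x[x == 0] = 1
    return (x == patterns[0]).mean()       # fraction of the memory recovered

for k in (5, 20, 40, 80):
    print(f"{k:>2} stored memories -> {recall_quality(k):.0%} of the first recalled")
```

With a handful of memories, recall of the first is essentially perfect; pile on a few dozen more and it collapses, even though the weight matrix never “fills up” in the disk-drive sense. Forgetting here is interference, not a full disk.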

Intuitively, this seems to jibe with our own experience of memory. Unlike a computer, which remembers things all-or-nothing, human memory seems to fade, and needs to be refreshed or it becomes harder and harder to summon up. A memory may even still exist in our brains, yet be unretrievable until the exact pattern of triggers summons it up again.

If this is how our memory fills its capacity, it may be that we encounter the limits of our brain’s storage ability all the time.

What Kind of Memories Get Overwritten?

One idea here is that this fading process, in which accumulated new memories make older memories harder to find, may depend on the kind of information learned. Information that must be finely discriminated from similar, but different, possibilities might require more storage than memories which are easy to tell apart.

This might explain the effect of interference when learning new languages. If you learn Spanish and then Portuguese, for instance, you may have difficulty recalling Spanish words precisely where they differ from Portuguese: the two languages overlap so heavily that the points of difference are exactly what get confused.
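The same toy network from above makes this concrete (again, purely an illustration): store one memory among nine unrelated ones, versus among nine near-copies of itself, and see how much of it survives.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200

def recall(patterns, cue):
    W = (patterns.T @ patterns) / N        # Hebbian storage, as before
    np.fill_diagonal(W, 0)
    x = cue.astype(float)
    for _ in range(10):
        x = np.sign(W @ x)
        x[x == 0] = 1
    return (x == cue).mean()

prototype = rng.choice([-1, 1], size=N)
# Ten mutually similar memories: the prototype with ~15% of entries flipped.
cluster = np.where(rng.random((10, N)) < 0.15, -prototype, prototype)
target = cluster[0]
unrelated = rng.choice([-1, 1], size=(9, N))

print(f"among unrelated memories: {recall(np.vstack([target, unrelated]), target):.0%}")
print(f"among similar memories:   {recall(cluster, target):.0%}")
```

The distinctive entries of the target, its “Spanish-only words”, are exactly what get pulled back toward the shared prototype.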

A more extreme example of this comes from mnemonics [5]. People who train themselves to memorize vast swaths of information with techniques like the memory palace have to be careful not to reuse the same locations to map two different sets of things they want to retain. Without care, the connections can get mixed up and old memories may become unretrievable.

By this account, you should be more careful when learning information which is quite similar to something you already know, yet differs in the details of use (say, closely related languages), than when learning fields which are either completely different (chemistry and art history) or complementary (physics and math).

My own experience from learning multiple languages [6] suggests that, if you want to go down that route, you often need to invest time switching between the languages, so the cues for distinguishing vocabulary get reinforced and they don’t mix together.

Should You Worry About Running Out of Space?

Probably not. Even if the brain does have a more limited capacity than an ideal physical storage medium, and even if we occasionally run into the limits of memory as older ideas and concepts get crowded out, my sense is that running out of memory isn’t a real concern for almost anyone.

One reason might be that memory decay happens naturally, interference or not. Trying to “save” space in your mind by avoiding learning unnecessary things won’t prevent forgetting; you’ll forget either way, so abstaining from learning buys you nothing.

Another reason is that many memories are mutually supportive. Learning one thing often connects to another. If retrieval, not storage, is the major weakness in our memory hardware, then investing in extra memory cues more than makes up for the space they consume.

For extremely memory-intensive subjects or tasks, it may be possible to reach a saturation point, where new memories can only be created at the expense of old ones. I could imagine, for instance, that there’s an upper bound on how many languages one can learn to mastery, since each may require remembering hundreds of thousands of pieces of linguistic information.

However, it may be that those bounds are reached because natural decay, and thus the need to practice previously learned information to keep it active, eventually overwhelms the ability to learn new things. In that case you would hit your memory’s capacity not because you suddenly run out of space one day, but because maintaining everything else you’ve learned requires 100% of your time.
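A small simulation sketches this dynamic. All the numbers here are invented for illustration: a fixed daily study budget, a fixed cost to learn each new item, and a small recurring upkeep cost per known item to keep it retrievable.

```python
BUDGET = 60.0        # minutes of study per day
LEARN_COST = 10.0    # minutes to learn one new item
UPKEEP = 1.0 / 7     # minutes per item per day (one 1-minute review a week)

known = 0.0
for day in range(1, 3651):                     # ten years
    maintenance = min(BUDGET, known * UPKEEP)  # reviews come first
    known += (BUDGET - maintenance) / LEARN_COST
    if day in (1, 30, 365, 3650):
        print(f"day {day:>4}: {known:5.0f} items known, "
              f"{maintenance / BUDGET:4.0%} of time on upkeep")
```

Learning never hits a storage wall; it simply asymptotes (at BUDGET / UPKEEP = 420 items under these made-up numbers) because upkeep gradually claims the whole schedule. Real spaced-repetition schedules stretch review intervals over time, which pushes the ceiling much higher, but the shape of the problem is the same.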

If this is the case, though, the bound is likely far greater than anything most of us will ever experience. Polyglots like Alexander Arguelles [7] have gained proficiency in 50+ languages, albeit through lifelong devotion. If there is a threshold for theoretically-maximal linguistic fluency, it might be well over a hundred languages, given that Arguelles is still a human being who needs to eat, sleep and do things other than study languages.

We also can’t discount the possibility that the brain itself expands its capacity under the pressure to learn more. London taxi drivers, who must memorize the city’s infamously complex system of roads, sights and stops, have larger hippocampi (a part of the brain involved in forming long-term memories). What’s more, this seems to be caused by their intensive study, rather than people with larger hippocampi becoming taxi drivers in the first place.

Albert Einstein’s brain supposedly had larger sections related to spatial reasoning and visualization. That could have been a genetic endowment, but it’s also possible it was an outcome of years of strenuous thought experiments imagining the warps and curvatures of spacetime. If the learning capacity of the brain is itself plastic, this lends extra weight to the idea that one shouldn’t “save” brain capacity for other things.

As a practical matter, the storage limits of the human brain are unlikely to be a concern for everyday learning. However, the better you understand how your memory works, the more you’ll be able to learn.