Saturday, 24 January 2015

On Counting Wisps


My most popular blog post is On The Processing Speed Of The Human Brain. This post is a follow-up, asking a related question: What is the storage capacity of the human brain?

This is a difficult question to answer, because we don't really know what memories are, or how exactly the brain is storing them. It seems implausible that it is storing them in some sort of countable form, an organized set of tiles that can be plucked from the mind when needed. If you believe that the objects of the mind are less like tiles and more like the sensible wisps of a seething dynamic process, then a question about the storage capacity of the brain doesn't make obvious sense. What exactly are we to count? All the wisping that has been sensed? All the wisping that could theoretically be sensed? The wisping that will actually be sensed by a particular person's brain? The average Shannon information of the dynamic wisping process across some specific time unit?

Some people have estimated that the storage capacity of the human brain is functionally infinite, since we always find room to store more information if we want to, so no practical limit exists. This is a weak argument: by the same reasoning, your computer's hard disk has functionally infinite space, since you can always overwrite old files to make room for new ones. Perhaps the storage of the brain is constant despite our ability to always fit more in if we really have to, because we fit it in by overwriting what is already there.

A more principled lower estimate can be made by considering the hardware of the brain. A human being has about 100 billion brain cells. Let's assume that a change in any connection strength between two connected neurons is equal to one bit of information, and further assume (a huge over-simplification) that neural connections have just two possible strengths (like a bit in a computer, which is either 1 or 0). Assume, as we did before, that each neuron connects to 1000 other neurons. Then each neuron has ‘write’ access to 1000 bits of information, or about 1 kilobit. So we have 100 billion (the number of neurons) × 1 kilobit of storage capacity. That's 10^14 bits, or about 12.5 terabytes. Let's be generous and round it up to 10^15 bits, or 125 terabytes. Since in fact neural connections are not two-state but multi-state, and since neuron bodies can also change their properties and thereby store information, this may be a very low estimate. On the other hand, since it counts every potential distinction as an encoded distinction (assuming no noise, no redundancy, no unused connections), it could be a huge over-estimate.
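If you like, here is that back-of-envelope arithmetic as a little Python sketch. The neuron count, the connections-per-neuron figure, and the one-bit-per-connection rule are the rough assumptions from above, not measurements:

    # Back-of-envelope brain storage, using the assumptions stated above.
    NEURONS = 100e9           # ~10^11 neurons (rough figure)
    CONNECTIONS = 1000        # connections per neuron (assumed)
    BITS_PER_CONNECTION = 1   # two-state connection = 1 bit (huge simplification)

    total_bits = NEURONS * CONNECTIONS * BITS_PER_CONNECTION
    terabytes = total_bits / 8 / 1e12   # 8 bits per byte, 10^12 bytes per terabyte

    print(f"{total_bits:.0e} bits = {terabytes:.1f} terabytes")
    # -> 1e+14 bits = 12.5 terabytes (rounded up above to 10^15 bits, 125 TB)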

The number of bits in the brain is not equal to the number of items. For example, to store one letter of text (one item) on a computer takes seven bits in the classic ASCII code, and in real computers it usually takes eight or more. To store one picture can take thousands or even millions of bits. The same must apply to the human brain, so if we want to count 'things' (potential wisping!) rather than bits, we need to make some adjustments for the fact that each memory must be composed of many bits.
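To make the bits-versus-items point concrete, here is a toy comparison; the picture's dimensions are just an illustrative assumption:

    # Bits per 'item': one letter versus one modest picture.
    letter_bits = 7                 # one letter in the classic 7-bit ASCII code
    picture_bits = 640 * 480 * 24   # 640x480 pixels at 24 bits/pixel (assumed size)

    print(f"one letter:  {letter_bits} bits")
    print(f"one picture: {picture_bits:,} bits")
    # -> one picture: 7,372,800 bits -- millions of bits for a single 'item'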

The first person to try to estimate the amount of storage in a human brain was Robert Hooke, in 1666. He estimated how fast he could think, multiplied by his lifespan, and decided that he had 2 x 10^9 bits of storage. He had a high estimate of himself: his estimate for an average person was twenty times less, at 10^8 bits! The psychologist Tom Landauer wrote a paper in 1986 ("How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory", Cognitive Science, volume 10, pages 477-493), in which he tried to estimate, from a review of experimental results, how many useful distinctions a person might be able to remember in all. His estimate was one billion relevant distinctions (10^9 bits). At a 2002 Psychonomics conference presentation that I saw, Landauer revisited this question. He used a novel technical method (whose details need not concern us here, which is good because I can't remember them now) to estimate how much word knowledge a person had. His new estimate is in the same ballpark as Hooke's: 4 x 10^8 bits. This seems implausibly low to me: that's only about 50 megabytes. I can't believe that my brain wisps as lightly as that.
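For fun, here is a Hooke-style calculation in the same spirit: multiply a thinking rate by a lifespan. The rate, waking fraction, and lifespan below are my own illustrative assumptions, not Hooke's actual figures:

    # Hooke-style estimate: (distinctions per second) x (waking seconds in a life).
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365
    distinctions_per_second = 1   # assume one memorable distinction per second
    waking_fraction = 2 / 3       # assume ~16 waking hours per day
    lifespan_years = 70           # assumed lifespan

    bits = distinctions_per_second * waking_fraction * lifespan_years * SECONDS_PER_YEAR
    print(f"{bits:.1e} bits")     # -> ~1.5e9 bits, the same order as Hooke's 2 x 10^9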

Eh. I don't really think the question of how much information the brain stores has a universally compelling answer, except within some really wide error margin. Let's say: the brain can store somewhere between 10^9 bits (125 megabytes) and 10^15 bits (125 terabytes). Only [!] six orders of magnitude difference, which is good enough when we are trying to count something as abstract as a random person's potential to sense wisps.


[Image: Altered Figure 27 "The commissural connecting the cerebellum to the olivary bodies" from Thomas Laycock's (1860) Mind and Brain: Or, the Correlations of Consciousness and Organisation; with Their Applications to Philosophy, Zoology, Physiology, Mental Pathology, and the Practice of Medicine, Volume 2 ]
