Virtual habitability
Blue Brain 2008 estimate
from this article posted on 2008-03-03:
"Markram estimates that in order to accurately simulate the trillion synapses in the human brain, you'd need to be able to process about 500 petabytes of data (peta being a million billion, or 10 to the fifteenth power). But if computing speeds continue to develop at their current exponential pace, and energy efficiency improves, Markram believes that he'll be able to model a complete human brain on a single machine in ten years or less."
These numbers don't make sense, especially when combined with the 2005 figures below:
- Markram says "trillion synapses", but singinst.org says "100 trillion"
- Markram says you'd need to process ~500 PB of data -- is that online storage? Per second?
- If it's online storage, then his time estimate seems far too optimistic, as he is looking at AI-level computing in 10 years despite needing over 1000 times as much storage as the estimate below (500 PB vs. 400 TB)
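For scale, the gap between Markram's 500 PB figure and the 400 TB storage estimate below works out like this (a trivial check, using the two figures as given):

```python
# Ratio of Markram's 2008 figure to the ~2002 storage estimate below.
markram_bytes = 500e15   # 500 petabytes
woozle_bytes = 400e12    # 400 terabytes
ratio = markram_bytes / woozle_bytes
print(ratio)             # 1250x as much storage
```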
Ok, here's the actual Blue Brain Project web site.
Woozle's ~2002 estimate
(Originally posted here)
"The human brain contains somewhere around 100 billion neurons and 100 trillion synapses." [ http://www.singinst.org/seedAI/general.html ]
Oversimplistically modeling a synapse as a single-precision floating-point number (4 bytes), this means we'd need 400,000,000,000,000 bytes (400 terabytes) just to store the data contained in a typical human brain.
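As a quick sanity check on that arithmetic (a sketch; the 100-trillion-synapse count and 4-bytes-per-synapse resolution are the assumptions above):

```python
# Back-of-envelope storage estimate: one single-precision float per synapse.
synapses = 100e12          # 100 trillion synapses (singinst.org figure)
bytes_per_synapse = 4      # 4-byte float -- an admitted oversimplification
total_bytes = synapses * bytes_per_synapse
print(total_bytes / 1e12)  # -> 400.0 terabytes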
A good portion of those 100 terasynapses are probably in the autonomic sections of the brain and may be unnecessary (when an Essienaut is not using a robotic extension, at any rate); we may also get lucky and find that 4 bytes of resolution is unnecessary for many synapses. On the other hand, we will probably need additional storage for the Essienaut's work and daily life (think of all the things you "store" in your immediate environment)... so 400 TB seems like a reasonable guess of what we'd need in order to get started, if not to live in digital luxury.
Starting with a present low-budget disc storage capacity of 120 GB (about $100 at today's prices) and doubling every 1.5 years, how long does it take to reach 400 TB? Approximately 12 doublings gets us from 120 GB to 400,000 GB -- 17.5 years. That's NOT LONG.
Ok, let's be pessimistic... we need 400 TB of RAM, not disc; disc space is too slow. Present RAM sizes generally start around 128MB, but it's not hard to find system boards supporting over a gigabyte (and it's not terribly expensive to fill them up). So let's start at 1 GB. 18.6 doublings gets us from 1 to 400,000 -- not quite 30 years.
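Both doubling projections above can be checked directly (a sketch assuming a clean, constant 1.5-year doubling period for disc and RAM alike):

```python
import math

def years_to_reach(start_gb, target_gb, doubling_years=1.5):
    """Doublings (and years) needed to grow from start to target capacity."""
    doublings = math.log2(target_gb / start_gb)
    return doublings, doublings * doubling_years

# Disc: 120 GB -> 400 TB (400,000 GB): ~11.7 doublings, ~17.5 years
print(years_to_reach(120, 400_000))

# RAM: 1 GB -> 400 TB: ~18.6 doublings, ~27.9 years
print(years_to_reach(1, 400_000))
```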
Now, CPU power...
"Neurons are slow; a neuron has a firing rate of only 200 times per second, and the signals travel at a maximum speed of only 150 meters per second." - ibid.
One might think "oh, THAT one's EASY!" but you have to remember that this is EACH of those 100 terasynapses firing 200 times a second (ok, processing signals from neurons firing at about 200 times/second -- same thing, computationally), all working in parallel... or in other words, (100e+12 x 200 = ) 20,000 teraflops. Determining flops for today's microprocessors is tricky because it really depends what sort of processing you're doing, but here's my best estimate. AMD's latest 64-bit processor (available at retail for about $500) can be clocked at up to 2,600 MHz and can run two 64-bit (8-byte) operations per cycle. Boldly extrapolating that this means it could run four 4-byte floating-point ops per cycle, at 2,600,000,000 cycles per second, that comes to 10.4 gflops.
From 10.4 gflops to 20,000 tflops (20,000,000 gflops) is... almost 21 doublings, or 31 years. (And who's to say that we won't come up with some specialized hardware for this stuff before then?)
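The same doubling arithmetic applied to processing power (again a sketch; the 200 events/second per synapse rate and the 10.4 gflops baseline are the assumptions above):

```python
import math

required_flops = 100e12 * 200   # 100 terasynapses x 200 events/sec = 2e16 flops
baseline_flops = 2.6e9 * 4      # 2.6 GHz x 4 single-precision ops/cycle = 10.4 gflops
doublings = math.log2(required_flops / baseline_flops)
print(doublings, doublings * 1.5)  # ~20.9 doublings, ~31 years at 1.5 years/doubling
```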
So it seems clear to me that barring major setbacks, we should have quite adequate computing power (at least to set up a rough digital homestead) by the year 2035 or so.