It took 35,000 processor cores to render Avatar. If we assume that a Six-Core Opteron 2400 (released in 2009, the same year as Avatar) has roughly 10^9 transistors, then we have (35,000/6)*10^9 ≈ 5.83*10^12 transistors.
The primary visual cortex has about 280 million neurons, and a typical neuron has 1,000 to 10,000 synapses. Assuming 10,000 synapses per neuron, that makes 2.8*10^8 * 10^4 = 2.8*10^12 synapses.
By this calculation it takes 5.83*10^12 transistors to render Avatar and 2.8*10^12 synapses to simulate something similar on the fly, which is roughly the same amount.
Since the clock rate of a processor is about 10^9 Hz, while a neuron fires at most about 200 Hz, does this mean that the algorithms our brain uses are very roughly (10^9)/200 = 5*10^6 times more efficient?
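The whole back-of-envelope calculation above can be written out explicitly. Every number here is the post's own assumption (core count, transistor estimate, synapse count, firing rate), not a measured value:

```python
# Back-of-envelope comparison from the question above.
# All figures are assumptions stated in the post, not measurements.

cores = 35_000                 # processor cores used to render Avatar
cores_per_cpu = 6              # Six-Core Opteron 2400
transistors_per_cpu = 1e9      # rough transistor count for a 2009-era CPU

transistors = cores / cores_per_cpu * transistors_per_cpu

neurons_v1 = 2.8e8             # neurons in the primary visual cortex
synapses_per_neuron = 1e4      # upper end of the 1,000-10,000 range

synapses = neurons_v1 * synapses_per_neuron

cpu_hz = 1e9                   # ~1 GHz processor clock
neuron_hz = 200                # assumed maximum neuron firing rate

efficiency_ratio = cpu_hz / neuron_hz

print(f"transistors      ~ {transistors:.2e}")   # ~5.83e12
print(f"synapses         = {synapses:.2e}")      # 2.80e12
print(f"clock-rate ratio = {efficiency_ratio:.0e}")  # 5e6
```

The two hardware counts land within a factor of about two of each other, which is why the comparison in the post treats them as "roughly the same amount".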
Intuitively this doesn't seem right at all. I can think of plenty of things that a human plus an external memory aid (like pencil and paper) can do that a laptop can't, but (aside from dumb hardware stuff like "connect to the internet" and so on) I can't think of anything for which the reverse is true. And I can think of plenty of things they can both do, but that a laptop can do much faster. Or am I misinterpreting you?
I'm not sure I understand your question.
I guess part of my point is that a laptop processor is a very general purpose tool, while the human brain is a collection of specialized modules. Also, the more general a tool is, the less efficient it will be on average for any task.
The human brain might be seen as a generalist, but not in the same way a laptop computer processor is.
Besides, even a laptop processor has certain specializations and advantages over the human brain in narrow domains, such as number crunching and fast arithmetic operations.