From time to time I encounter people who claim that our brains are really slow compared to even an average laptop computer and can't process big numbers.
At the risk of revealing my complete lack of knowledge of neural networks and how the brain works, I want to ask: is this actually true?
It took massive amounts of number crunching to create movies like James Cameron's Avatar. Yet I am able to create more realistic and genuine worlds in front of my mind's eye, on the fly. I can even simulate other agents: for example, I can easily simulate sexual intercourse between myself and another human, including tactile and olfactory information.
I am further able to run real-time egocentric world-simulations to extrapolate and predict the behavior of physical systems and other agents. You can do that too. Having a discussion or playing football are two examples.
Yet any computer can outperform me at simple calculations.
But it seems to me, maybe naively so, that most of my human abilities involve massive amounts of number crunching that no desktop computer could do.
So what's the difference? Can someone point me to some digestible material that I can read up on to dissolve possible confusions I have with respect to my question?
It took 35,000 processor cores running to render Avatar. If we assume that a Six-Core Opteron 2400 (2009, same year as Avatar) has roughly 10^9 transistors, then we have (35,000/6)*10^9 = 5.83*10^12 transistors.
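As a quick sanity check on that arithmetic (using the figures above, which are themselves rough assumptions):

```python
# Back-of-envelope: transistors used to render Avatar.
# Assumes ~35,000 cores total and ~10^9 transistors per
# six-core Opteron 2400 chip (both figures from the text above).
CORES = 35_000
CORES_PER_CHIP = 6
TRANSISTORS_PER_CHIP = 1e9

chips = CORES / CORES_PER_CHIP          # ~5,833 chips
transistors = chips * TRANSISTORS_PER_CHIP
print(f"{transistors:.2e}")             # ~5.83e12 transistors
```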
The primary visual cortex has 280 million neurons, while a typical neuron has 1,000 to 10,000 synapses. That makes 2.8*10^8 * 10^4 synapses, if we assume 10,000 per neuron, or 2.8*10^12.
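The same estimate in code, taking the upper end of the synapse range as stated above:

```python
# Synapse estimate for primary visual cortex (V1).
# Assumes 2.8e8 neurons and the upper bound of 10,000 synapses per neuron.
NEURONS_V1 = 2.8e8
SYNAPSES_PER_NEURON = 1e4   # stated range: 1,000 to 10,000

synapses = NEURONS_V1 * SYNAPSES_PER_NEURON
print(f"{synapses:.1e}")    # 2.8e12 synapses
```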
By this calculation it takes 5.83*10^12 transistors to render Avatar and 2.8*10^12 synapses to simulate something similar on the fly. Which is roughly the same amount.
Since the clock rate of a processor is about 10^9 Hz and that of a neuron is 200 Hz, does this mean that the algorithms that our brain uses are very roughly (10^9)/200 = 5,000,000 times more efficient?
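The implied ratio, computed directly (treating the 1 GHz clock and the 200 Hz firing rate as the crude stand-ins they are):

```python
# Crude "efficiency" ratio: processor clock rate vs. neuron firing rate.
# Both numbers are the rough figures assumed in the question.
CPU_HZ = 1e9       # ~1 GHz processor clock
NEURON_HZ = 200    # ~200 Hz neuron firing rate

ratio = CPU_HZ / NEURON_HZ
print(f"{ratio:,.0f}")   # 5,000,000
```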
There are several issues here.
First, just because ~5x10^12 transistors were used to render Avatar (slower than real time, by the way) does not mean that it minimally requires ~5x10^12 transistors to render Avatar.
For example, I have done some prototyping for fast, high-quality real-time volumetric rendering, and I'm pretty confident that the Avatar scenes (after appropriate database conversion) could be rendered in real time on a single modern GPU using fast voxel cone tracing algorithms. That entails only ~5*10^9 transistors, but we should also mention storage, b...