The switching rate in a processor is faster than the firing rate of neurons.
All else being equal, a computer should be faster than an aggregate of neurons. But all else isn't equal, even when comparing different processors. A modern processor contains far fewer transistors than a human brain contains synapses. Furthermore, the brain is massively parallel and has a specialized architecture. For what it does, it's well optimized, at least compared to how optimized our software and hardware are for similar tasks at this point.
For instance, laptop processors are general-purpose: they can do many different tasks, but they aren't especially fast or good at any one of them. Some specific tasks use custom-made processors which, even with a slower clock rate or fewer transistors, will still vastly outperform a general-purpose processor at the task they were custom-built for.
It took 35,000 processor cores to render Avatar. If we assume that a Six-Core Opteron 2400 (2009, the same year as Avatar) has roughly 10^9 transistors, then we have (35,000/6)*10^9 ≈ 5.83*10^12 transistors.
The primary visual cortex has about 280 million neurons, and a typical neuron has 1,000 to 10,000 synapses. Assuming 10,000 synapses per neuron, that makes 2.8*10^8 * 10^4 = 2.8*10^12 synapses.
By this calculation it takes 5.83*10^12 transistors to render Avatar and 2.8*10^12 synapses to simulate something similar on the fly, which is roughly the same amount.
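The back-of-envelope calculation above can be sketched in a few lines of Python. All the figures are the rough estimates quoted in the text (transistor count, core count, neuron and synapse counts), not measured values:

```python
# Rough comparison: transistors used to render Avatar vs. synapses
# in the primary visual cortex. All numbers are the post's own estimates.

transistors_per_cpu = 10**9      # Six-Core Opteron 2400 (~2009), rough estimate
cores_used_for_avatar = 35_000
cores_per_cpu = 6

avatar_transistors = (cores_used_for_avatar / cores_per_cpu) * transistors_per_cpu

v1_neurons = 280e6               # primary visual cortex, rough estimate
synapses_per_neuron = 10_000     # upper end of the 1,000-10,000 range
v1_synapses = v1_neurons * synapses_per_neuron

print(f"Transistors used to render Avatar: {avatar_transistors:.2e}")  # ~5.83e12
print(f"Synapses in primary visual cortex: {v1_synapses:.2e}")         # ~2.80e12
print(f"Ratio: {avatar_transistors / v1_synapses:.1f}")
```

The two totals land within a factor of about two of each other, which is the "roughly the same amount" claim above.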
From time to time I encounter people who claim that our brains are really slow compared to even an average laptop computer and can't process big numbers.
At the risk of revealing my complete lack of knowledge of neural networks and how the brain works, I want to ask if this is actually true?
It took massive amounts of number crunching to create movies like James Cameron's Avatar. Yet I am able to create more realistic and genuine worlds in front of my mind's eye, on the fly. I can even simulate other agents. For example, I can easily simulate sexual intercourse between me and another human, including tactile and olfactory information.
I am further able to run real-time egocentric world-simulations to extrapolate and predict the behavior of physical systems and other agents. You can do that too. Having a discussion or playing football are two examples.
Yet any computer can outperform me at simple calculations.
But it seems to me, maybe naively so, that most of my human abilities involve massive amounts of number crunching that no desktop computer could do.
So what's the difference? Can someone point me to some digestible material that I can read up on to dissolve possible confusions I have with respect to my question?