This is probably a misconception for several reasons. Firstly, given that we don't fully understand the learning mechanisms in the brain yet, it's unlikely that it's mostly one thing ...
We don't understand the learning mechanisms yet, but we're quite familiar with the data they use as input. "Internally" supervised learning is just another term for semi-supervised learning anyway. Semi-supervised learning is plenty flexible enough to encompass the "multi-objective" features of what occurs in the brain.
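To make that concrete, here's a toy sketch (everything in it - the tiny linear models, the 0.5 weighting - is made up purely for illustration): a semi-supervised objective is just a weighted sum of a supervised term on the labeled examples and an unsupervised term on the unlabeled ones, so adding more "internal" objectives is just adding more terms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a few labeled examples and many unlabeled ones.
x_lab = rng.normal(size=(8, 4))
y_lab = rng.integers(0, 2, size=8)
x_unl = rng.normal(size=(100, 4))

W = rng.normal(scale=0.1, size=(4, 2))  # tiny linear classifier
V = rng.normal(scale=0.1, size=(4, 4))  # tiny linear "autoencoder"

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Objective 1: supervised cross-entropy on the labeled subset.
p = softmax(x_lab @ W)
sup = -np.log(p[np.arange(len(y_lab)), y_lab]).mean()

# Objective 2: unsupervised reconstruction error on unlabeled data.
unsup = ((x_unl @ V - x_unl) ** 2).mean()

# The semi-supervised objective is a weighted sum; "multi-objective"
# learning would simply mean more weighted terms like these.
lam = 0.5
total = sup + lam * unsup
print(f"supervised={sup:.3f}  unsupervised={unsup:.3f}  total={total:.3f}")
```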
The GTX TitanX has a peak perf of 6.1 teraflops, so you'd need only a few hundred to get a petaflop supercomputer (more specifically, around 164).
Raw and "peak performance" FLOPS numbers should be taken with a grain of salt. Anyway, given that a TitanX apparently draws as much as 240W of power at full load, your "petaflop-scale supercomputer" will cost you a few hundred-thousand dollars and draw 42kW to do what the brain does within 20W or so. Not a very sensible use for that amount of computing power - except for the odd publicity stunt, I suppose. Like playing Go.
It's just a circuit, and it obeys the same physical laws.
Of course. Neuroglia are not magic or "woo". They're physical things, much like silicon chips and neurons.
Raw and "peak performance" FLOPS numbers should be taken with a grain of salt.
Yeah, but in this case the best convolution and GEMM codes can reach something like 98% efficiency for the simple standard algorithms on dense input - which is what most ANNs use for just about everything.
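If you want to check that on your own machine, here's a minimal sketch: it times a dense single-precision GEMM through NumPy (which dispatches to a CPU BLAS, so the peak constant is a placeholder you'd set to your CPU's theoretical peak, not a GPU's) and reports achieved throughput as a fraction of that assumed peak.

```python
import time
import numpy as np

# Placeholder: set this to YOUR hardware's theoretical peak. NumPy's
# @ runs on the CPU via BLAS, so a GPU peak like the TitanX's 6.1
# TFLOPS would not be a fair comparison here.
ASSUMED_PEAK_TFLOPS = 1.0

n = 4096
a = np.random.rand(n, n).astype(np.float32)
b = np.random.rand(n, n).astype(np.float32)

a @ b  # warm-up run so BLAS setup isn't included in the timing

t0 = time.perf_counter()
a @ b
dt = time.perf_counter() - t0

flops = 2 * n ** 3  # multiply-adds in a dense n x n GEMM
achieved = flops / dt / 1e12
print(f"achieved ~{achieved:.2f} TFLOPS "
      f"({100 * achieved / ASSUMED_PEAK_TFLOPS:.0f}% of assumed peak)")
```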
given that a TitanX apparently draws as much as 240W of power at full load, your "petaflop-scale supercomputer" will cost you a few hundred thousand dollars and draw roughly 39kW to do what the brain does within 20W or so
Well, in this case of Go and for an increasing numbe...
DeepMind's Go AI, AlphaGo, has beaten the European champion with a score of 5-0. A match against the top-ranked human player, Lee Se-dol, is scheduled for March.