Here's the patent, since I couldn't find any other detailed documentation. It describes two separate implementations:
The slides linked in the OP are about the digital one, and mention the possibility of an analogue implementation only once, as an intuition pump. I don't know which implementation the quoted performance numbers are for.
Here is a presentation from a researcher at MIT on a novel way of designing computer processors. It relies on performing approximate, rather than exact, mathematical operations (like the meat-based processor in our heads!). The claimed benefit is a 10,000-fold improvement in speed, while the errors introduced by the approximations are postulated to be insignificant for many applications.
http://web.media.mit.edu/~bates/Summary_files/BatesTalk.pdf
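The slides don't give code, but the core trade-off is easy to sketch in software: throw away most of the precision of every operand and see how much the answer of an aggregate computation actually degrades. This toy simulation (my own illustration, not anything from the talk or patent) truncates float64 mantissas down to 8 bits and then averages a thousand values:

```python
import struct

def approx(x, keep_bits=8):
    """Simulate cheap approximate arithmetic by keeping only the top
    `keep_bits` of a float64's 52-bit mantissa (truncation toward zero)."""
    (i,) = struct.unpack("<Q", struct.pack("<d", x))
    mask = ~((1 << (52 - keep_bits)) - 1) & 0xFFFFFFFFFFFFFFFF
    return struct.unpack("<d", struct.pack("<Q", i & mask))[0]

# Each value individually carries up to ~0.2% relative error at 8 mantissa
# bits, yet the error of the aggregate result stays of the same small order.
data = [0.1 * k for k in range(1, 1001)]
exact = sum(data) / len(data)
rough = sum(approx(v) for v in data) / len(data)
rel_err = abs(rough - exact) / exact
print(f"exact={exact:.6f}  approx={rough:.6f}  rel_err={rel_err:.4%}")
```

For error-tolerant workloads (signal processing, vision, machine learning) a fraction-of-a-percent error in the result is often invisible, which is the bet the claimed 10,000x speedup rests on.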
Slide #2 of the presentation offers a fascinating insight: we currently work around the limitations of the processing substrate to implement precise computation, and doing so is becoming increasingly difficult:
------------------
THE MOTIVATING PROBLEM:
Computations specified by programmers are implemented as behavior in physical material
• Hardware designer’s job:
efficiently implement Math (what sw wants) using Physics (what silicon offers)
    Math (what sw wants)     Physics (what silicon offers)
    (near) perfect arith     noisy, approximate
    uniform mem delay        delay ~ distance
• Increasingly difficult as decades passed and transistor counts exploded
• Now each instruction (increment, load register, occasionally multiply) invokes >10M transistor operations, even though a single transistor can perform, for instance, an approximate exponentiate or logarithm
--------------------
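That last bullet, about a single transistor approximately computing a logarithm, has a well-known software cousin: reinterpreting a float's bit pattern as an integer yields a cheap approximate log2, since the exponent field sits in the high bits (Mitchell's approximation). This is just an analogy of mine for how "what the substrate gives you for free" can be close enough, not the method in the slides or patent:

```python
import math
import struct

def approx_log2(x):
    """Reinterpret a float32's bits as an integer. The integer equals
    (exponent+127)*2^23 + mantissa, so dividing by 2^23 and subtracting
    the bias 127 gives e + f, which approximates log2(x) = e + log2(1+f).
    Maximum error is about 0.086 (exact at powers of two, where f = 0)."""
    (i,) = struct.unpack("<I", struct.pack("<f", x))
    return i / 2**23 - 127

for x in (1.5, 10.0, 1000.0):
    print(f"log2({x}) ~ {approx_log2(x):.3f}  (exact {math.log2(x):.3f})")
```

One integer reinterpretation replaces a whole transcendental-function routine, at the price of a bounded error, which is the same spirit as letting a transistor's native response curve stand in for an exact exponentiate.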
The parallels and contrasts with our own brain are what interested me the most. Perhaps one day the most powerful computers will be running on "corrupted hardware" of sorts.