Douglas_Knight comments on Bloggingheads: Yudkowsky and Aaronson talk about AI and Many-worlds - Less Wrong

18 Post author: Vladimir_Nesov 16 August 2009 04:06PM


Comment author: Douglas_Knight 18 August 2009 04:36:31AM 2 points [-]

It's a purely theoretical counterfactual about the combination of Moore's law and Grover's algorithm.

Moore's law says that hardware becomes twice as fast every 18 months. Grover's algorithm says that the time a quantum computer takes to solve SAT is the square root of the time a classical computer requires. Since the classical time is the square of the quantum time, doubling the quantum hardware's raw speed quadruples the classical time it can match: in 18 months, Moore's law should make the quantum computer 4 times as fast in classical-equivalent terms.
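A minimal sketch (my own, not from the thread) of the arithmetic being claimed, assuming quantum runtime is exactly the square root of classical runtime:

```python
def classical_equivalent_speedup(hardware_speedup):
    """If a quantum computer solves SAT in time t = sqrt(T), where a
    classical machine needs time T, then making the quantum hardware
    k times faster lets it match a classical machine k**2 times faster:
    t -> t/k  implies  T = t**2 -> T/k**2."""
    return hardware_speedup ** 2

# One Moore's-law doubling (18 months) of raw quantum gate speed:
print(classical_equivalent_speedup(2))  # -> 4
```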

Comment author: timtyler 18 August 2009 07:39:55AM *  1 point [-]

Well, I can see what math was done. The problem is the false assertion. I learned in math classes that if you accept one false thing, you can prove everything, and consequently your understanding of the difference between what's true and what's not dwindles to zero. You can't just believe one false thing.

If we actually "switched to quantum computers" it isn't clear we would get an exponential trajectory at all - due to the proximity of physical limits. If we did get an exponential trajectory, I can see no coherent reason for thinking the doubling time would relate to that of classical computers - because the technology is quite different. Currently, quantum computers grow mostly by adding qubits - not by the shrinking in component size that drives Moore's law in classical computers. That increases their quantum-parallelism, but doesn't affect their speed.

Comment author: pengvado 18 August 2009 08:19:58AM *  1 point [-]

Assume the number of quantum gate-ops per second doubles every 18 months. Assume SAT is O(2^n) on a classical computer and O(2^(n/2)) by Grover's. Then the maximum feasible problem size on a classical computer increases by 1 every 18 months, and on a quantum computer increases by 2. No factors of anything involved.
Alternately, if you measure a fixed problem size, then by assumption speed doubles for both.
So where does 4x come from?
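Pengvado's accounting can be made concrete (a sketch of my own, with an arbitrary starting budget of 2^40 operations, assuming classical cost 2^n and Grover cost 2^(n/2)):

```python
import math

def max_feasible_n(budget_ops, quantum):
    """Largest SAT instance size n solvable within budget_ops operations,
    assuming classical cost 2**n and Grover cost 2**(n/2)."""
    if quantum:
        return int(2 * math.log2(budget_ops))  # 2**(n/2) <= budget
    return int(math.log2(budget_ops))          # 2**n <= budget

budget = 2.0 ** 40
for doublings in range(3):  # three successive 18-month doublings
    b = budget * 2 ** doublings
    print(max_feasible_n(b, quantum=False), max_feasible_n(b, quantum=True))
# classical n grows by 1 per doubling, quantum n by 2 -- no factor of 4
```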

Comment author: Douglas_Knight 18 August 2009 05:59:14PM 1 point [-]

It just comes from treating classical computers as the correct measuring stick. It would be more precise to refer, as you do, to 18 months as the "add one" time rather than the doubling time. But if you do call it the doubling time, then for quantum computers it becomes the 4x time. Of course, it's not uniform--it doesn't apply to problems in P.

Comment author: timtyler 18 August 2009 06:28:15PM 0 points [-]

With classical computers, Moore's law improves serial and parallel performance simultaneously - by making components smaller.

With quantum computers, serial and parallel performance are decoupled - more qubits improve parallel performance, while miniaturisation has no effect on the number of qubits but improves serial processing performance. So, there are two largely independent means of speeding up quantum computing. Which one supposedly doubles twice as fast as classical computers? Neither - AFAICS.

Comment author: Douglas_Knight 18 August 2009 09:25:25PM 1 point [-]

Sorry, my original response should have been "yes, you aren't getting into the spirit of the counterfactual."