"Hypercomputation" is a term coined by two philosophers, Jack Copeland and Dianne Proudfoot, to refer to allegedly computational processes that do things Turing machines are in principle incapable of doing. I'm somewhat dubious of whether any of the proposals for "hypercomputation" are really accurately described as computation, but here, I'm more interested in another question: is there any chance it's possible to build a physical device that answers questions a Turing machine cannot answer?
I've read a number of Copeland and Proudfoot's articles promoting hypercomputation, and they claim this is an open question. I've seen some indications that they're wrong about this, but my knowledge of physics and computability theory isn't sufficient to settle the matter with confidence.
Some of the ways people convince themselves that "hypercomputation" might be physically possible seem like obvious confusions. For example: convince yourself that some physical quantity is allowed to take any real value, notice that some reals are non-computable, and conclude that if only we could measure such a non-computable quantity, we could answer questions no Turing machine can answer. Of course, performing such a measurement is physically implausible even if you could find a non-computable physical quantity in the first place. The same mistake can be sexed up in various ways, for example by talking about "analog computers" and assuming "analog" means the components can take any real-numbered value.
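To make the measurement point concrete: a measurement with a finite error bar only ever pins down finitely many bits of the quantity being measured, so any finite experiment reads off at best a computable approximation, no matter what real number the quantity "really" is. A minimal back-of-the-envelope sketch (the function name and the error model are mine, purely illustrative):

```python
import math

def reliable_bits(epsilon: float) -> int:
    """Roughly how many binary digits of an unknown real in [0, 1) a
    measurement with error bar +/- epsilon can pin down."""
    return max(0, math.floor(-math.log2(2 * epsilon)))

# Each order-of-magnitude improvement in precision buys only ~3.3 bits,
# so any finite experiment extracts a finite (hence computable) prefix.
for eps in (1e-3, 1e-6, 1e-9):
    print(eps, reliable_bits(eps))
```

Extracting the full digit expansion of a non-computable real would require driving epsilon to zero, i.e. infinite-precision measurement, which is exactly the physically implausible step.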
Points similar to the one I've just made exist in the literature on hypercomputation (see here and here, for example). But the critiques of hypercomputation I've found tend to focus on specific proposals. It's less clear whether the literature contains any good general argument that hypercomputation is physically impossible, say, because it would require infinite-precision measurements or something equally unlikely. It seems like such an argument might be possible: I've read that the laws of physics are considered to be computable, but I don't understand well enough what that means to tell whether it entails that hypercomputation is physically impossible.
Can anyone help me out here?
Taking the karma hit to clarify why I, at least, downvoted this comment. When you say the Turing model is outdated, you seem to be assuming that the model was originally intended as a physical model of how actual computers do (or should) work. But that was never its purpose. It was supposed to be a mathematical model that captures the intuitive notion of an effective procedure. All the talk of tapes and tape heads is just meant to aid understanding, and maybe that part is outdated, but the actual definition of a Turing machine itself can be given purely mathematically, without any assumptions about physical instantiation. Saying the Turing model is outdated would only make sense if there were good reason to doubt the Church-Turing thesis, and there isn't.
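To spell out what "purely mathematical" means here: a Turing machine is just a finite transition table over abstract states and symbols, and a few lines of code suffice to state and run one, with no reference to any physical mechanism. A minimal sketch (the simulator and the example machine are my own illustration, not anyone's canonical formulation):

```python
# A Turing machine is a finite transition table delta over abstract
# states and symbols; nothing here refers to physical instantiation.
BLANK = "_"

def run_tm(delta, start, accept, tape, max_steps=10_000):
    """Simulate a machine (delta, start state, accepting state) on an
    input string, modeling the tape as a dict from positions to symbols.
    Returns the final tape contents as a string."""
    tape = dict(enumerate(tape))
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            break
        sym = tape.get(head, BLANK)
        state, write, move = delta[(state, sym)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip(BLANK)

# Example machine: complement a binary string, then halt.
delta = {
    ("flip", "0"): ("flip", "1", "R"),
    ("flip", "1"): ("flip", "0", "R"),
    ("flip", BLANK): ("halt", BLANK, "R"),
}
print(run_tm(delta, "flip", "halt", "10110"))  # -> 01001
```

The tape-and-head vocabulary is just a vivid way of describing this transition table; the mathematical object stands on its own.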
Thank you.
I seriously disagree, incidentally. For example, the model makes some pretty specific assumptions about instantiation: it assumes the sort of computer they had seventy-some-odd years ago. The single pool of memory, the single processor, the one-cell-at-a-time mechanism of traversal: all of these are assumptions with serious effects on the very field the mathematical model was devised to consider, computability.
(And I can point out one good reason to doubt the Church-Turing thesis. A Turing machine is incapable of replicating the nondeterministic b... (read more)
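For what it's worth on the nondeterminism point: a deterministic program can simulate nondeterministic branching by tracking the set of all states the machine could possibly be in, so nondeterminism affects efficiency at most, not what is computable. A minimal sketch using a toy nondeterministic automaton (the machine and its transitions are my own illustration) that accepts binary strings whose third-from-last symbol is '1':

```python
# A nondeterministic transition relation: (state, symbol) -> set of states.
NFA = {
    ("q0", "0"): {"q0"},
    ("q0", "1"): {"q0", "q1"},   # nondeterministic choice
    ("q1", "0"): {"q2"},
    ("q1", "1"): {"q2"},
    ("q2", "0"): {"q3"},
    ("q2", "1"): {"q3"},
}

def accepts(s: str) -> bool:
    """Deterministically simulate the NFA by tracking every state it
    could currently be in (the classic subset construction)."""
    states = {"q0"}
    for ch in s:
        states = {t for q in states for t in NFA.get((q, ch), set())}
    return "q3" in states
```

The deterministic simulation accepts exactly the strings the nondeterministic machine does; for Turing machines the analogous breadth-first search over configurations gives the same result, at an exponential cost in time but with no change in what can be computed.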