This post is a followup to "We are not living in a simulation" and intended to help me (and you) better understand the claims of those who took a computationalist position in that thread. The questions below are aimed at you if you think the following statement both a) makes sense, and b) is true:
"Consciousness is really just computation"
I've made it no secret that I think this statement is hogwash, but I've done my best to make these questions as non-leading as possible: you should be able to answer them without having to dismantle them first. Of course, I could be wrong, and "the question is confused" is always a valid answer. So is "I don't know".
1. As it is used in the sentence "consciousness is really just computation", is computation:
   a) Something that an abstract machine does, as in "No oracle Turing machine can compute a decision to its own halting problem"?
   b) Something that a concrete machine does, as in "My calculator computed 2+2"?
   c) Or is this distinction nonsensical or irrelevant?
2. If you answered "a" or "c" to question 1: is there any particular model, or particular class of models, of computation, such as Turing machines, register machines, lambda calculus, etc., that needs to be used in order to explain what makes us conscious? Or is any Turing-equivalent model equally valid?
3. If you answered "b" or "c" to question 1: unpack what "the machine computed 2+2" means. What is that saying about the physical state of the machine before, during, and after the computation?
4. Are you able to make any sense of the concept of "computing red"? If so, what does this mean?
5. As far as consciousness goes, what matters in a computation: functions, or algorithms? That is, does any computation that gives the same outputs for the same inputs feel the same from the inside (this is the "functions" answer), or do the intermediate steps matter (this is the "algorithms" answer)? (See the sketch after this list for a concrete illustration of the distinction.)
6. Would an axiomatization (as opposed to a complete exposition of the implications of these axioms) of a Theory of Everything that can explain consciousness include definitions of any computational devices, such as "and gate"?
7. Would an axiomatization of a Theory of Everything that can explain consciousness mention qualia?
8. Are all computations in some sense conscious, or only certain kinds?
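To make the functions-versus-algorithms distinction in question 5 concrete, here is a minimal sketch in Python. The example and the function names are mine, purely for illustration: two procedures that compute exactly the same function in the input-output sense, but by very different intermediate steps.

```python
# Two procedures that compute the same function (identical outputs for every
# input) via different intermediate steps -- the "functions vs. algorithms"
# distinction from question 5.

def sum_to_n_iterative(n: int) -> int:
    """Add the integers 1..n one at a time, keeping a running total."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total


def sum_to_n_closed_form(n: int) -> int:
    """Use Gauss's closed-form formula; no loop, no running total."""
    return n * (n + 1) // 2


# Extensionally (as functions), the two are indistinguishable:
assert all(sum_to_n_iterative(n) == sum_to_n_closed_form(n) for n in range(1000))
# Algorithmically, they pass through very different intermediate states.
```

The "functions" answer says these two would feel the same from the inside (to whatever extent either feels like anything at all); the "algorithms" answer says the difference in intermediate states could matter.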
ETA: By the way, I probably won't engage right away with individual commenters on this thread except to answer requests for clarification. In a few days I'll write another post analyzing the points that are brought up.
Here is another attempt to rephrase one of the opinions held within the philosophy camp:
Imagine 3 black boxes, each of them containing a quantum-level emulation of some existing physical system. Two boxes contain the emulations of two different human beings and one box the emulation of an environment.
Assume that if you were to connect all 3 black boxes and observe the behavior of the two humans and their interactions, you would be able to verify that the behavior of the humans, including their utterances, matches that of the originals.
If one were to disconnect one of the black boxes containing the emulation of a human and place it within the original physical environment, which contains the other original human being, the new system would not exhibit the same behavior as either the all-black-box system or the genuinely physical system.
A system made up of both black-box emulations of physical objects and genuinely physical objects does not behave like a system made up of black boxes alone or of physical objects alone.
The representations of the original physical systems being emulated within the black boxes are one level removed from the originals. A composition that mixes those levels will exhibit a different interrelationship.
Once you enable the black box to interact with the higher level in which it resides, the system made up of the black box, the original environment, and the original human being (representation-level / physical-level / physical-level) will approach the behavior exhibited by the all-emulated system and by the all-physical system.
You can equip the black box with sensors and loudspeakers, yet it will not exhibit the same behavior. You can further equip it with an avatar; still, the original human will treat an avatar differently than another original human, and the emulated human will treat it differently than another emulated human. You can give it a robot body. The behavior will still not equal that of a system consisting only of the original physical systems, nor that of a system made up only of emulations.
You may continue to tweak what was once the black box containing an emulation of a human being. But as you approach a system that exhibits the same behavior as the original system, you are slowly reproducing the original human being: you are turning the representation into a reproduction.
...This argument strikes me as, pardon me, tremendously silly. Just off the top of my head, it seems to still hold if you replace the 'quantum level simulation of a person' with an exact duplicate of the original brain in a saline bath, hooked up to a feed of oxygenated blood. Should we therefore conclude that human brains are not conscious?
EDIT: Oh blast, didn't realize this was from months ago.