This post is a follow-up to "We are not living in a simulation" and is intended to help me (and you) better understand the claims of those who took a computationalist position in that thread. The questions below are aimed at you if you think the following statement both a) makes sense and b) is true:
"Consciousness is really just computation"
I've made it no secret that I think this statement is hogwash, but I've done my best to make these questions as non-leading as possible: you should be able to answer them without having to dismantle them first. Of course, I could be wrong, and "the question is confused" is always a valid answer. So is "I don't know".
1. As it is used in the sentence "consciousness is really just computation", is computation:
    a) Something that an abstract machine does, as in "No oracle Turing machine can compute a decision to its own halting problem"?
    b) Something that a concrete machine does, as in "My calculator computed 2+2"?
    c) Or is this distinction nonsensical or irrelevant?
2. If you answered "a" or "c" to question 1: is there any particular model, or class of models, of computation (Turing machines, register machines, lambda calculus, etc.) that needs to be used in order to explain what makes us conscious? Or is any Turing-equivalent model equally valid?
3. If you answered "b" or "c" to question 1: unpack what "the machine computed 2+2" means. What does that say about the physical state of the machine before, during, and after the computation?
4. Are you able to make any sense of the concept of "computing red"? If so, what does this mean?
5. As far as consciousness goes, what matters in a computation: functions, or algorithms? That is, does any computation that gives the same outputs for the same inputs feel the same from the inside (this is the "functions" answer), or do the intermediate steps matter (this is the "algorithms" answer)?
6. Would an axiomatization of a Theory of Everything that can explain consciousness (as opposed to a complete exposition of the implications of those axioms) include definitions of any computational devices, such as "AND gate"?
7. Would an axiomatization of a Theory of Everything that can explain consciousness mention qualia?
8. Are all computations in some sense conscious, or only certain kinds?
ETA: By the way, I probably won't engage with individual commenters on this thread right away, except to answer requests for clarification. In a few days I'll write another post analyzing the points that are brought up.
My $0.02, without reading other answers:
\1. I'm not sure, but I lean towards (b).
Unpacking a bit: As it is used in the sentence "the sum of 1 and 1 to yield 2 is a computation", my intuition is that something like (a) is meant. That said, it seems likely that this intuition comes from me reasoning about a category of computations as a cognitive shortcut, and then sloppily reifying the category. Human brains do that a lot. So I'm inclined to discard that intuition and assert that statements about 1+1=2 are statements about an abstract category in my mind of concretely instantiated computations (including hypothetical and counterfactual instantiations).
I'm perfectly comfortable using "machine" here as a catchall for anything capable of instantiating a computation, though if you mean something more specific by the use of that word then I might balk.
\2. N/A
\3. Oh, hey, look at that! (I wrote the above before reading this question.)
A couple of caveats: It's not saying anything terribly precise about the physical state of the machine, but I suppose we can speak very loosely about it. And there's an important use-mention distinction here; I can program a computer to reliably and meaningfully compute "blue" as the result of "2+2" by overriding the standard meanings of "2" and "+". Less absurdly, the question of whether 2+2=4 or 2 + 2 = 4.00000 actually can come up in real life.
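For what it's worth, here's a toy Python sketch of that use-mention point (the class name and the lookup are my own invention, not anything from the thread): the symbols "2" and "+" are rebound so that evaluating "two + two" reliably yields "blue".

```python
class OddNumeral:
    """A stand-in for "2" whose "+" has been given a nonstandard meaning."""
    def __init__(self, label):
        self.label = label

    def __add__(self, other):
        # "+" no longer means arithmetic addition; it looks up a color word.
        return {"2+2": "blue"}.get(self.label + "+" + other.label, "undefined")

two = OddNumeral("2")
print(two + two)  # prints "blue"
```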
Waving all that stuff aside, though, it's saying that prior to performing that computation the machine had some data structure(s) reliably isomorphic to two instances of a value at a particular (identical) point along a number line, and to an operation that, given a pair of such values, returns a third value with a particular relationship to them. (I don't feel like trying to define addition right now.)
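A very crude sketch of what that might look like, treating machine states as simple dictionaries (the representation is purely mine, for illustration):

```python
# Machine states before and after "computing 2+2": the states are just
# stored patterns, and what makes them count as being "about" numbers is
# a reliable isomorphism between those patterns and points on a number
# line, plus an operation relating pairs of them to a third value.

before = {"operand_a": 0b10, "operand_b": 0b10, "result": None}
after = dict(before, result=before["operand_a"] + before["operand_b"])

assert after["result"] == 4  # the pattern we interpret as "4"
```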
\4. Sure.
It means different things for different kinds of computing devices. For example, for the Pantone-matching software on my scanner, computing (a particular shade of) red means looking up the signals coming from the scanner in a table and returning the corresponding code, which downstream processes can use for other purposes (e.g., figuring out what RGB values to display on my monitor).
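A toy sketch of that kind of lookup (Python; the swatch names and RGB values are invented for illustration, not actual Pantone data or my scanner's real code):

```python
# Hypothetical sketch of "computing red" as a table lookup: a scanner
# reading (an RGB triple) is matched to the nearest known swatch code,
# which downstream processes can then use for other purposes.

SWATCHES = {
    "RED-032":   (237, 41, 57),
    "BLUE-072":  (16, 6, 159),
    "GREEN-355": (0, 146, 69),
}

def compute_color_code(rgb):
    """Return the swatch code whose stored values are closest to the reading."""
    def distance(candidate):
        return sum((a - b) ** 2 for a, b in zip(rgb, candidate))
    return min(SWATCHES, key=lambda code: distance(SWATCHES[code]))

print(compute_color_code((230, 45, 60)))  # -> "RED-032"
```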
For my eye and visual cortex, it means something very roughly similar, though importantly different. The most important aspects of the difference for this discussion have to do with what sorts of downstream processes use the resulting signal for what sorts of operations.
\5. Either the question is confused, or I am.
Unpacking: How it feels to be a particular computation is an artifact of the structure of that computation; changing that structure in a particular way might change how it feels, or might not, depending on specifics. I'm not sure how to map the labels "outputs", "inputs", "inside", and "intermediate" in your question to that answer, though.
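One way to make the functions/algorithms contrast concrete (a toy Python sketch of my own, not anything from the question): the two functions below give the same outputs for the same inputs but pass through very different intermediate states. The "functions" answer says that difference couldn't matter to how the computation feels; the "algorithms" answer says it might.

```python
# Two computations with identical input-output behavior ("functions")
# but different intermediate steps ("algorithms").

def sum_to_n_iterative(n):
    # Passes through every intermediate partial sum along the way.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_to_n_closed_form(n):
    # Jumps straight to the answer via the closed-form formula.
    return n * (n + 1) // 2

assert sum_to_n_iterative(100) == sum_to_n_closed_form(100) == 5050
```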
\6. That hinges a bit on how precisely we define "computational devices"; there are no doubt viable definitions for which I'd answer "yes," although my answer for an AND gate is "no".
Come to that, it also hinges on what axiomatization you're using; I suppose you could construct one that did, if you wanted to. I would just consider it unnecessarily complex.
Having said all that: No, I wouldn't expect a "simplest possible axiomatization" of a "theory of everything" to contain any "computational devices." (Scare quotes used to remind myself that I'm not sure I know what those phrases mean.)
\7. As above; it might, much as it might mention AND gates, but I would look askance at one that did. (Also as above, this hinges a fair bit on what counts as a quale, but my answer for "the perception of red" is "no".)
\8. Mostly, I think this question is a question about words, rather than about their referents.
There are computations to which my intuitive notion of "conscious" simply does not apply. But for the purposes of discussion I could accept a definition of "conscious" that applies in some sense to all computations, if someone were to propose one, though such a definition would be counterintuitive.