(This post grew out of an old conversation with Wei Dai.)
Imagine a person sitting in a room, communicating with the outside world through a terminal. Further imagine that the person knows some secret fact (e.g. that the Moon landings were a hoax), but is absolutely committed to never revealing their knowledge of it in any way.
Can you, by observing the input-output behavior of the system, distinguish it from a person who doesn't know the secret, or knows some other secret instead?
Clearly the only reasonable answer is "no, not in general".
Now imagine a person in the same situation, claiming to possess some mental skill that's hard for you to verify (e.g. visualizing four-dimensional objects in their mind's eye). Can you, by observing the input-output behavior, distinguish it from someone who is lying about having the skill, but has a good grasp of four-dimensional math otherwise?
Again, clearly, the only reasonable answer is "not in general".
Now imagine a sealed box that behaves exactly like a human, dutifully saying things like "I'm conscious", "I experience red" and so on. Moreover, you know from trustworthy sources that the box was built by scanning a human brain, and then optimizing the resulting program to use less CPU and memory (preserving the same input-output behavior). Would you be willing to trust that the box is in fact conscious, and has the same internal experiences as the human brain it was created from?
A philosopher believing in computationalism would emphatically say yes. But considering the examples above, I would say I'm not sure! Not at all!
I think maybe I'm not being clear.
If you want to tell me what a chair is, you can point to a chair and its characteristics and I can look at it. I can then notice that when I look at that chair, and when I look at an object inside my house, they look pretty much the same. So I conclude that the object inside my house seems to be what you would call a chair. (Of course, you'd probably describe a chair in a more complicated way, but it would come down to a lot of instances of that.)
If I try to do that for consciousness, one of the intermediate steps is missing. I can't look at your consciousness, then look at mine, and say "hmm, they seem to be the same sort of thing". Each one is (or is purported to be) only visible to one person.
The fact that I can "notice myself being conscious" doesn't change this. I can't compare consciousnesses. While it's true that I can't directly compare my idea of sitting to your idea of sitting, I can go through the intermediary of asking you to sit, then comparing what I see when you sit to what I see when I sit.
If you notice when things look pretty much the same, then I can explain what I mean by consciousness, without you having to see what my consciousness is like. In fact, we can assume I have no consciousness and you are the only one who has it: we can talk about it anyway.
First, notice that things look pretty similar at all the times when you ar...