hegemonicon comments on Consciousness - Less Wrong

Post author: Mitchell_Porter 08 January 2010 12:18PM


Comments (221)


Comment author: hegemonicon 08 January 2010 06:25:16PM 6 points [-]

My problem is I don't see how you can avoid a "that's how an algorithm feels from the inside" explanation somewhere down the line. Even if you create some theory that purports to account for the (say) mysterious redness of red, isn't there still a gap to bridge between that account and whatever your subjective perception - your feeling - of red is? I'm confused as to what an 'explanation' for the mysterious redness of red would even look like.

Comment author: Tyrrell_McAllister 08 January 2010 06:45:40PM 7 points [-]

If you can't even imagine what an answer would look like, you should doubt that you've successfully asked a question.

That's not supposed to be a conversation-stopper. It's just that the first step in the conversation should be to make the question clear.

Comment author: LauraABJ 09 January 2010 12:47:39AM 0 points [-]

What I think Mitchell is looking for (and he can correct me if I'm wrong) as an explanation of experience is some model that describes the elements necessary for experience and how they interact in some quantitative way. For example, let's pretend that flesh brains are not the only modules capable of experience, and that we can build experiences out of other materials. A theory of experience would help to answer: what materials can be used, what processing speeds are acceptable (i.e., can experience exist in stasis), what CPUs/processors/algorithms must be implemented, and what outputs will convince us that experience is taking place (vs. creating a Chinese letter box). Now, I don't think we will have any way of answering these questions before uploading/AI, but I can conceive of ways of testing many variables in experience once a mind has been uploaded. We could change one variable, ask the subject to describe the change, change it back and ask the subject what his memory of the experience is, etc., etc. We can run simulations that are deliberately missing normal algorithms until we find which pieces of a mind are the bare-bones essentials of experience. To me this is just another question for the neuroscientists and information theorists, once our technology is advanced enough to actually experiment on it. It is only a 'problem' if you believe p-zombies are possible, and that we might create entities that describe experience without having it.

Comment author: Liron 09 January 2010 03:04:24AM 3 points [-]

Traceback:

TooManyLinesError in paragraphs[0]

Comment author: hegemonicon 09 January 2010 01:54:14AM *  1 point [-]

This is a useful heuristic, but if anything it seems to dissolve the initial question of "Where's the qualia?" As DanArmak and RobinZ channeling Dennett point out elsewhere in the thread, questions about qualia don't appear to be answerable.