billswift comments on Be careful with thought experiments - Less Wrong

Post author: lukeprog 18 May 2012 09:54AM


Comment author: billswift 18 May 2012 12:00:01PM 0 points

I don't think there is anything special about consciousness. "Consciousness" is what any intelligence feels from the inside, just as qualia are what sense perceptions feel like from the inside.

Comment author: RichardKennaway 18 May 2012 12:40:20PM 3 points

For qualia, that is precisely the definition of the word, and therefore says nothing to explain their existence. For consciousness, it also comes down to a definition, given a reasonable guess at what is meant by "intelligence" in this context.

What is this "inside"?

Comment author: ciphergoth 18 May 2012 12:28:00PM 3 points

I am inclined to believe that what we call "consciousness" and even "sentience" may turn out to be ideas fully as human-specific as Eliezer's favourite example, "humour".

There's at least a possibility that "suffering" is almost as specific.

Comment author: [deleted] 18 May 2012 02:15:14PM 1 point

There's at least a possibility that "suffering" is almost as specific.

Why? I'd expect that having a particular feeling when you're damaging yourself and not liking that feeling would be extremely widespread. (Unless by "suffering" you mean something other than "nociception", in which case can you elaborate?)

Comment author: ciphergoth 18 May 2012 02:19:58PM 1 point

I mean something morally meaningful. I don't think a chess computer suffers when it loses a game, no matter how sophisticated. I expect that self-driving cars are programmed to try to avoid accidents even when other drivers drive badly, but I don't think they suffer if you crash into them.

Comment author: [deleted] 19 May 2012 08:38:08AM 0 points

Yeah, if by “suffering” you mean “nociception I care about”, it sure is human-specific.

Comment author: ciphergoth 19 May 2012 11:11:33AM 1 point

I'd find this more informative if you explicitly addressed my examples.

Comment author: [deleted] 19 May 2012 03:20:04PM 1 point

Well, I wouldn't usually call the thing a chess computer or a self-driving car is minimizing "suffering" (though I could, if I felt like using more anthropomorphizing language than usual). But I'm confused by this, because I have no problem using that word to refer to a sensation felt by a chimp, a dog, or even an insect, and I'm not sure what it is that an insect has and a chess computer lacks that causes this intuition of mine. Maybe the fact that we share a common ancestor, and our nociceptive capabilities are synapomorphic with each other... but then I think even non-evolutionists would agree a dog can suffer, so it must be something else.