pjeby comments on We are not living in a simulation - Less Wrong

-9 Post author: dfranke 12 April 2011 01:55AM


Comment author: dfranke 12 April 2011 02:46:54PM 1 point [-]

I apologize if this is recapitulating earlier comments -- I haven't read this entire discussion -- and feel free to point me to a different thread if you've covered this elsewhere, but: on your view, could a simulation of me in a computer classify the things that it has (which, on your view, cannot be actual qualia) into categories like "pleasant" and "unpleasant" and "indifferent"? Could it tell me that certain (simulations of) meat tastes like chicken, and if it did, could I understand what it meant by "taste" and understand the gist of "like chicken"?

I'm not certain what you mean by "could a simulation of me do X". I'll read it as "could a simulator of me do X". And my answer is yes: a computer program could make those judgements without actually experiencing any of those qualia, just as it could make judgements about what trajectory the computer hardware would follow if it were in orbit around Jupiter, without actually having to be there.

Comment author: pjeby 12 April 2011 05:13:35PM *  7 points [-]

a computer program could make those judgements (sic) without actually experiencing any of those qualia

Just as an FYI, this is the place where your intuition is blindsiding you. Intuitively, you "know" that a computer isn't experiencing anything... and that's what your entire argument rests on.

However, this "knowing" is just an assumption, and it's assuming the very thing that is the question: does it make sense to speak of a computer experiencing something?

And there is no reason, apart from that intuition/assumption, to treat this as a different question from "does it make sense to speak of a brain experiencing something?"

IOW, substitute "brain" for every use of "computer" or "simulation", and make the same assertions. "The brain is just calculating what feelings and qualia it should have, not really experiencing them. After all, it is just a physical system of chemicals and electrical impulses. Clearly, it is foolish to think that it could thereby experience anything."

By making brains special, you're privileging the qualia hypothesis based on an intuitive assumption.

Comment author: dfranke 12 April 2011 05:22:34PM -2 points [-]

I don't think you read my post very carefully. I didn't claim that qualia are a phenomenon unique to human brains. I claimed that human-like qualia are a phenomenon unique to human brains. Computers might very well experience qualia; so might a lump of coal. But if you think a computer simulation of a human experiences the same qualia as a human, while a lump of coal experiences no qualia or different ones, you need to make that case to me.

Comment author: pjeby 12 April 2011 09:14:28PM 4 points [-]

But if you think a computer simulation of a human experiences the same qualia as a human, while a lump of coal experiences no qualia or different ones, you need to make that case to me.

Actually, I'd say you need to make a case for WTF "qualia" means in the first place. As far as I've ever seen, it seems to be one of those words that people use as a handwavy thing to prove the specialness of humans. When we know what "human qualia" reduce to, specifically, then we'll be able to simulate them.

That's actually a pretty good operational definition of "reduce". ;-) (Not to mention "know".)