khafra comments on Dreams of AIXI - Less Wrong

-1 Post author: jacob_cannell 30 August 2010 10:15PM


Comments (145)


Comment author: Mass_Driver 02 September 2010 04:54:27PM 1 point

A highly detailed model of me may not be me. But it will, at least, be a model which (for purposes of prediction via similarity) thinks itself to be Eliezer Yudkowsky. It will be a model that, when cranked to find my behavior if asked "Who are you and are you conscious?", says "I am Eliezer Yudkowsky and I seem to have subjective experiences" for much the same reason I do.

I buy that. That sort of model could probably exist.

Your "zombie", in the philosophical usage of the term, is putatively a being that is exactly like you in every respect - identical behavior, identical speech, identical brain; every atom and quark in exactly the same position, moving according to the same causal laws of motion - except that your zombie is not conscious.

That sort of zombie can't possibly exist.

If you don't believe an emulated mind can be conscious, do you believe that your mind is noncomputable or that meat has special computational properties?

It's not that I don't believe an emulated mind can be conscious. Perhaps it could. What boggles my mind is the assertion that emulation is sufficient to make a mind conscious -- that there exists a particular bunch of equations and algorithms such that when they are written on a piece of paper they are almost certainly non-conscious, but when they are run through a Turing machine they are almost certainly conscious.

I have no opinion about whether my mind is computable. It seems likely that a reasonably good model of my mind might be computable.

I'm not sure what to make of the proposition that meat has special computational properties. I wouldn't put it that way, especially since I don't like the connotation that brains are fundamentally physically different from rocks. My point isn't that brains are special; my point is that matter-energy is special. Existence, in the physical sense, doesn't seem to me to be a quality that can be specified in an equation or an algorithm. I can solve Maxwell's equations all day long and never create a photon from scratch.

That doesn't necessarily mean that photons have special computational properties; it just means that even fully computable objects don't come into being by virtue of their having been computed. I guess I don't believe in substrate independence?

Comment author: khafra 02 September 2010 05:45:26PM 2 points

I think you've successfully analyzed your beliefs, as far as you've gone--it does seem that "substrate independence" is something you don't believe in. However, "substrate independence" is not an indivisible unit; it's composed of parts which you do seem to believe in.

For instance, you seem to accept that the highly detailed model of EY, whether that just means functionally emulating his neurons and glial cells or actually computing his Hamiltonian, will claim to be him, for much the same reason he does. If we then simulate, at whatever level is appropriate for our simulated EY, a highly detailed model of his house and neighborhood that evolves according to the same rules as the real-life versions, he will think the same things about them that the real-life EY does.

If we go on to simulate the rest of the universe, including all the other people in it, with the same degree of fidelity, then no observation or piece of evidence, other than anthropic considerations, could tell them they're in a simulation.

Bear in mind that nothing magic happens when these equations go from paper to computer: If you had the time and low mathematical error rate and notebook space to sit down and work everything out on paper, the consequences would be the same. It's a slippery concept to work one's intuition around, but xkcd #505 gives as good an intuition pump as I've seen.
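(To make the paper-versus-computer point concrete, here's a toy example of my own, not from the thread: iterating a simple rule, the logistic map, with exact fractions. The trajectory you get by running the program is identical, term for term, to the one you get by grinding out the arithmetic by hand on paper, because the result depends only on the rule and the starting state, not on the medium doing the computing.)

```python
# Toy illustration of substrate independence: the logistic map
# x_{n+1} = r * x_n * (1 - x_n). The same rule applied to the same
# starting state yields the same sequence of states, whether it is
# iterated by a program or worked out by hand on paper.

from fractions import Fraction  # exact arithmetic, like careful pen-and-paper work

def iterate(rule, state, steps):
    """Apply `rule` repeatedly, recording every intermediate state."""
    trajectory = [state]
    for _ in range(steps):
        state = rule(state)
        trajectory.append(state)
    return trajectory

r = Fraction(7, 2)  # r = 3.5
logistic = lambda x: r * x * (1 - x)

# "Computer" run: three iterations starting from x_0 = 1/2.
computed = iterate(logistic, Fraction(1, 2), 3)

# "Paper" run: the same three steps worked out by hand.
by_hand = [Fraction(1, 2), Fraction(7, 8), Fraction(49, 128), Fraction(27097, 32768)]

print(computed == by_hand)  # the medium doesn't change the result
```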

Comment author: jacob_cannell 02 September 2010 08:03:57PM 1 point

but xkcd #505 gives as good an intuition pump as I've seen

What is this, btw?

Comment author: mattnewport 02 September 2010 08:09:00PM 2 points