ata comments on Consciousness of simulations & uploads: a reductio - Less Wrong

Post author: simplicio 21 August 2010 08:02PM




Comment author: ata 21 August 2010 08:24:38PM, 9 points

I once took this reductio in the opposite direction and ended up becoming convinced that consciousness is what it feels like inside a logically consistent description of a mind-state, whether or not it is instantiated anywhere. I'm still confused about some of the implications of this, but somewhat less confused about consciousness itself.

Take a moment to convince yourself that there is nothing substantively different between this scenario and the previous one, except that it contains approximately 10,000 times the maximum safe dosage of in principle.

Once again, Simone will claim she's conscious.

...Yeah, I'm sorry, but I just don't believe her.

I don't claim certain knowledge about the ontology of consciousness, but if I can summon forth a subjective consciousness ex nihilo by making the right series of graphite squiggles (which don't even mean anything outside human minds), then we might as well just give up and admit consciousness is magic.

"If I can summon forth a subjective consciousness ex nihilo by making the right blobs of protein throw around the right patterns of electrical impulses and neurotransmitters (which don't even mean anything outside human minds), then we might as well just give up and admit consciousness is magic."

Remember that it doesn't count as a reductio ad absurdum unless the conclusion is logically impossible (or, for the Bayesian analogue, very improbable according to some actual calculation) rather than merely implausible-sounding. I'd rather take Simone's word for it than believe my intuitions about plausibility.

Comment author: simplicio 21 August 2010 08:37:18PM, 3 points

Doesn't this imply that an infinity of different subjective consciousnesses are being simulated right now, if only we knew how to assign inputs and outputs correctly?

Comment author: orthonormal 24 August 2010 06:24:18AM, 2 points

This relates to the notion of "joke interpretations" under which a rock can be said to be implementing a given algorithm. There's some discussion of it in Good and Real.
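The "joke interpretation" worry can be made concrete: given any sequence of distinct physical states, one can always build a post-hoc lookup table that "interprets" them as the trace of any computation of the same length. A minimal sketch (all names here are hypothetical placeholders, not anyone's actual formalism):

```python
# A target computation: four successive states of some abstract machine.
computation_trace = ["s0", "s1", "s2", "s3"]

# Arbitrary distinct "physical states" of a rock over four ticks
# (placeholder labels -- any distinct states would do).
rock_trace = ["r_a", "r_b", "r_c", "r_d"]

# The "joke interpretation": a lookup table built after the fact,
# mapping each rock state to the computation state at the same tick.
interpretation = dict(zip(rock_trace, computation_trace))

# Under this mapping, the rock's history "implements" the computation.
decoded = [interpretation[r] for r in rock_trace]
assert decoded == computation_trace
```

The mapping does all the work: the computation is encoded in the interpretation table rather than in the rock, which is why responses like Chalmers's require the mapping to respect the computation's internal and counterfactual structure rather than just one actual run.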

Comment author: PaulAlmond 21 August 2010 10:03:22PM, 2 points

I started a series of articles (which got some criticism on LW in the past) dealing with this issue, among others, and this kind of ontology. In short, if an ontology like this applies, it does not mean that all computations are equal: there would be issues of measure associated with the number (I'm simplifying here) of interpretations that can find any particular computation. I expect to post Part 4 of this series, which has been delayed for a long time and which will answer many objections, before long, but the previous articles are as follows:

Minds, Substrate, Measure and Value, Part 1: Substrate Dependence. http://www.paul-almond.com/Substrate1.pdf

Minds, Substrate, Measure and Value, Part 2: Extra Information About Substrate Dependence. http://www.paul-almond.com/Substrate2.pdf

Minds, Substrate, Measure and Value, Part 3: The Problem of Arbitrariness of Interpretation. http://www.paul-almond.com/Substrate3.pdf

This won't resolve everything, but it should show that the kind of ontology you are talking about is not a random free-for-all.

Comment author: jimrandomh 21 August 2010 08:56:31PM, 1 point

Yes, it does. And if the universe is spatially infinite, then that implies an infinity of different subjective consciousnesses, too. Neither of these seems like a problem to me.

Comment author: Dre 21 August 2010 09:43:15PM, 1 point

Not necessarily. See Chalmers's reply to Hilary Putnam, who asserted something similar, especially section 6. Basically, if we require that all of the "internal" structure of the computation be preserved by the isomorphism, and make a reasonable assumption about the nature of consciousness, then all of the matter in the Hubble volume wouldn't be close to large enough to simulate a (human) consciousness.

Comment author: Mitchell_Porter 22 August 2010 04:15:36AM, -1 points

I once took this reductio in the opposite direction and ended up becoming convinced that consciousness is what it feels like inside a logically consistent description of a mind-state, whether or not it is instantiated anywhere.

Do you think the world outside your body is still there when you're asleep? That objects are still there when you close your eyes?