Eliezer_Yudkowsky comments on How sure are you that brain emulations would be conscious? - Less Wrong

15 Post author: ChrisHallquist 26 August 2013 06:21AM




Comment author: Eliezer_Yudkowsky 25 August 2013 08:46:58PM 6 points

The combination of verified pointwise causal isomorphism of repeatable small parts with surface behavioral equivalence on mundane levels of abstraction is sufficient for me to relegate the alternative hypothesis to the world of 'not bothering to think about it any more'.

Key word: "sufficient". I did not say "necessary".

Comment author: badtheatre 03 September 2013 05:03:15AM 2 points

This brings up something that has been on my mind for a long time: what are the necessary and sufficient conditions for two computations to be (homeo?)morphic? This could mean a lot of things, but specifically I'd like to capture the notion of being able to contain a consciousness. So what I'm asking is: what would we have to prove in order to say that program A contains a consciousness --> program B contains a consciousness? "Pointwise" isomorphism, if you mean what I think, seems too strict. On the other hand, allowing any invertible function to count as a _morphism doesn't seem strict enough: for one thing, we can put any reversible computation in 1-1 correspondence with a program that merely stores a copy of the first program's initial state and ticks off the natural numbers. Restricting our functions by, say, resource complexity, also seems to lead to both similar and unrelated issues...
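The counter-program point can be made concrete with a toy sketch (all names here are hypothetical, invented for illustration): take any reversible step function, and a "clock" program that just stores the initial state and increments a counter. Because the step function is a bijection, the map sending the clock's stored state to the real computation's state at time t is invertible at every step, so under the "any invertible function is a morphism" criterion the trivial counter "simulates" the computation, with all the work hidden in the decoding map.

```python
def step(s):
    # A toy reversible computation on 8-bit states:
    # rotate left one bit, then XOR a constant (both bijective).
    s = ((s << 1) | (s >> 7)) & 0xFF
    return s ^ 0b10100101

def unstep(s):
    # Inverse of step: undo the XOR, then rotate right one bit.
    s ^= 0b10100101
    return ((s >> 1) | ((s & 1) << 7)) & 0xFF

def run(s0, t):
    """The 'real' computation: apply step t times."""
    s = s0
    for _ in range(t):
        s = step(s)
    return s

def clock_program(s0, t):
    """The trivial program: store s0 unchanged and tick the counter."""
    return (s0, t)

def decode(clock_state):
    """For each fixed t, s0 -> run(s0, t) is a bijection on states,
    so this map invertibly recovers the real state at every step."""
    s0, t = clock_state
    return run(s0, t)

# The counter program, composed with decode, tracks the real
# computation exactly -- the 'simulation' does no real work itself.
s0 = 0b00110110
for t in range(10):
    assert decode(clock_program(s0, t)) == run(s0, t)
```

The intuition badtheatre gestures at is that disallowing such degenerate correspondences seems to require restricting the decoding map somehow (e.g. by its resource complexity), which raises its own problems.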

Has this been discussed in any other threads?