TheOtherDave comments on Nonperson Predicates - Less Wrong

Post author: Eliezer_Yudkowsky 27 December 2008 01:47AM 29 points

Comment author: TheOtherDave 30 May 2012 04:02:43AM -1 points

If I accept all of your suppositions, your conclusion doesn't seem particularly difficult to accept. Sure, after doing all the prep work you describe, executing a conscious experience (albeit an entirely static, non-environment-dependent one) requires a single operation... even the conscious experience of suffering. As does any other computation you might ever wish to perform, no matter how complicated.
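
To make the computational point concrete, here is a minimal sketch (not from the original thread), assuming the "prep work" amounts to precomputing the entire result of a deterministic computation; the function and table names are hypothetical. Once the precomputed table exists, "executing" the computation, however complicated it originally was, reduces to a single lookup.

```python
# Hypothetical illustration: any deterministic computation, however
# complicated, can be reduced to a single operation at "execution" time
# if all of the work is pushed into a precomputation ("prep") phase.

def complicated_deterministic_process(x):
    """Stand-in for an arbitrarily complex deterministic computation."""
    total = 0
    for i in range(x * 1000):
        total = (total * 31 + i) % 1_000_003
    return total

# Prep work: run the process once for every input we care about and
# store the results.
lookup_table = {x: complicated_deterministic_process(x) for x in range(100)}

# Afterwards, "executing" the computation is a single operation: a lookup.
result = lookup_table[42]
```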

That said, your suppositions do strike me as revealing some confusion between an isolated conscious experience (whatever that is) and the moral standing of a system (whatever that is).

Comment author: eurleif 30 May 2012 04:37:57AM 0 points

Well, this post heavily hints that a system's moral standing is related to whether it is conscious. Eliezer mentions a need to tackle the hard problem of consciousness in order to figure out whether the simulations performed by our AI cause immoral suffering. Those simulations would be basically isolated: their inputs may be chosen based on our real-world requirements, but they don't necessarily correspond to what's actually going on in the real world; and their outputs would presumably be used in aggregate to make decisions, but not pushed directly into the outside world.

Maybe moral standing requires something else too, like self-awareness, in addition to consciousness. But wouldn't there still be a critical instruction in a self-aware and conscious program, where a conscious experience of being self-aware was produced? Wouldn't the same argument apply to any criteria given for moral standing in a deterministic program?

Comment author: TheOtherDave 30 May 2012 03:15:45PM 0 points

It's not clear to me that whether a system is conscious (whatever that means) and whether it's capable of a single conscious experience (whatever that means) are the same thing.