DanielLC comments on The Strangest Thing An AI Could Tell You - Less Wrong

81 Post author: Eliezer_Yudkowsky 15 July 2009 02:27AM




Comment author: Fleisch 08 October 2010 12:09:35PM 24 points

Every time you imagine a person, that simulated person becomes conscious for the duration of your simulation; therefore, it is unethical to imagine people. Strictly, it is only morally wrong to imagine someone suffering, but for safety reasons, you shouldn't do it at all. It follows that reading fiction (with conflict in it) is the one human endeavor that has caused more suffering than anything else, and the FAI's first action will be to eliminate this possibility.

Comment author: DanielLC 09 April 2011 09:22:28PM 1 point

I find the idea that they're conscious more likely than the idea that death is inherently bad. I also doubt that they're as conscious as humans (either consciousness isn't discrete and a human has more of it, or it is discrete and a human has more levels of it), and I doubt that their emotions are what they appear to be.