JayDugger comments on The Strangest Thing An AI Could Tell You - Less Wrong

81 Post author: Eliezer_Yudkowsky 15 July 2009 02:27AM


Comment author: JayDugger 15 July 2009 02:33:39PM 0 points

@Liron, consciousness as an after-the-fact rationalization would surprise you?

And this post seems suspiciously like a set-up for Sterling's short story "The Compassionate, the Digital."

Comment author: Liron 15 July 2009 08:40:40PM 2 points

Yeah, because I'm sure that consciously representing "I want to implement this software feature" is a direct cause of that software feature getting implemented. I would be surprised if you couldn't analyze the feature-implementation process by pointing to consciously represented goals and subgoals.