Alicorn comments on The Strangest Thing An AI Could Tell You - Less Wrong

81 Post author: Eliezer_Yudkowsky 15 July 2009 02:27AM


Comment author: simpleton 15 July 2009 05:11:41AM 11 points

I would believe that human cognition is much, much simpler than it feels from the inside -- that there are no deep algorithms, and it's all just cache lookups plus a handful of feedback loops which even a mere human programmer would call trivial.

I would believe that there's no way to define "sentience" (without resorting to something ridiculously post hoc) which includes humans but excludes most other mammals.

I would believe in solipsism.

I can hardly think of any political, economic, or moral assertion I'd regard as implausible, except that one of the world's extant religions is true (since that would have about as much internal consistency as "2 + 2 = 3").

Comment author: Alicorn 15 July 2009 05:30:08AM 11 points

Solipsism? Isn't there some contradiction inherent in believing in solipsism because someone else tells you that you should?

Comment author: simpleton 15 July 2009 06:07:19AM 6 points

Well, I wouldn't rule out any of:

1) I and the AI are the only real optimization processes in the universe.

2) I-and-the-AI is the only real optimization process in the universe (but the AI half of this duo consistently makes better predictions than "I" do).

3) The concept of personal identity is unsalvageably confused.

Comment author: CannibalSmith 15 July 2009 11:46:03AM 0 points

If you perceive other people [telling you that you should believe in solipsism], it doesn't mean they really exist as anything more than your perception of them.

Of course, if someone is trying to convert other people to solipsism, he doesn't understand what solipsism is.