Alicorn comments on The Strangest Thing An AI Could Tell You - Less Wrong
I would believe that human cognition is much, much simpler than it feels from the inside -- that there are no deep algorithms, and it's all just cache lookups plus a handful of feedback loops which even a mere human programmer would call trivial.
I would believe that there's no way to define "sentience" (without resorting to something ridiculously post hoc) which includes humans but excludes most other mammals.
I would believe in solipsism.
I can hardly think of any political, economic, or moral assertion I'd regard as implausible, except that one of the world's extant religions is true (since that would have about as much internal consistency as "2 + 2 = 3").
Solipsism? Isn't there some contradiction inherent in believing in solipsism because someone else tells you that you should?
Well, I wouldn't rule out any of:
1) I and the AI are the only real optimization processes in the universe.
2) I-and-the-AI is the only real optimization process in the universe (but the AI half of this duo consistently makes better predictions than "I" do).
3) The concept of personal identity is unsalvageably confused.
If you perceive other people [telling you you should believe in solipsism], that doesn't mean they really exist as anything more than your perception of them.
Of course, if someone is trying to convert other people to solipsism, he doesn't know what solipsism is.