
Comments on Irrationality Game III - Less Wrong Discussion

Post author: CellBioGuy, 12 March 2014 01:51PM



Comment author: solipsist, 12 March 2014 06:45:29PM, 28 points

You (the reader) do not exist.

EDIT: That was too punchy and not precise. The reasoning behind the statement:

Most things which think they are me are horribly confused gasps of consciousness. Rational agents should believe the chances are small that their experiences are remotely genuine.

EDIT 2: After thinking about shminux's comment, I have to retract my original statement about you readers not existing. Even if I'm a hopelessly confused Boltzmann brain, the referent "you" might well still exist. At minimum, I have to think about existence more. Sorry!

Comment author: [deleted], 12 March 2014 06:58:05PM, 1 point

.

Comment author: solipsist, 12 March 2014 07:30:34PM, 2 points

To quote the adage, I'm a solipsist, and am surprised everyone else isn't too. I think any intelligent agent should conclude that it is probably something akin to a Boltzmann brain. You could plausibly argue that I am cheating with pronoun references (other people might agree with the solipsistic logic, but centered around them). Is that what you are asking?

EDIT:

"Is there anything about the world that you expect would appear different to you because of this belief?"

Not really. I think some of the problems with AIXI may be cases of AIXI acting rationally where the desired behavior is irrational, but that's the only time I can think of it coming up outside of a philosophy discussion.