JoshuaZ comments on A boltzmann brain question. - Less Wrong Discussion

8 Post author: DuncanS 25 September 2011 10:35PM

Comment author: JoshuaZ 25 September 2011 10:56:56PM * 3 points

This seems to be hitting on an issue that is only marginally related to Boltzmann brains, and that is made more confusing by the genuinely counterintuitive aspects of Boltzmann brains.

Whenever one tries to make an anthropic argument, one has to ask: what is an observer? If one believed in ontologically irreducible observers (something close to the classical notion of a soul in many cultures), this wouldn't be a problem. The problem here arises primarily from the difficulty of understanding what it means for something to be an observer in a universe where no observer seems to be irreducible.

Incidentally, I sometimes think the Boltzmann brain argument is evidence that something is very wrong with our understanding of the eventual fate of the universe. The essential problem is that the idea doesn't add up to normality. I don't know how much this should affect my estimates (for example, whether it should make me slightly doubt current projections that there won't be a Big Crunch). Anthropics can be really confusing.

Comment author: endoself 26 September 2011 12:55:36AM 2 points

UDT does anthropics without reference classes.

Comment author: DuncanS 25 September 2011 11:09:18PM 1 point

I think you're right: if there were a homunculus of some kind somewhere, the problem would apparently go away (well, it would go inside the homunculus, where it would remain as unsolved as ever). What is clear is that the complexity of our thoughts can't exist in a small enough partial brain; it needs the whole thing to be there, just as with the PC. Perhaps the complexity is being hidden in the fiction of continuing to provide the inputs?