Pavitra comments on Could/would an FAI recreate people who are information-theoretically dead by modern standards? - Less Wrong Discussion
Certainly the vast majority of Boltzmann brains don't become gods, in the same way that the vast majority of virtual particles don't form brains. But it only takes one, ever.
However, it occurs to me that I haven't actually done the math, and the improbability of an AGI forming out of the ether may well overwhelm the entire spacetime volume available to the universe between the Big Bang and heat death.
This sounds like a plausible argument for heat death as a hard deadline on the birth of a Boltzmann god. But once one exists, any others that arise in its future light cone are rendered irrelevant.
That seems correct. So the details come down to precisely how many Boltzmann brains one expects to arise, what sort of goals they'll have, and how many resources they'll command. This seems very tough to estimate.