MBlume comments on Real-Life Anthropic Weirdness - Less Wrong

Post author: Eliezer_Yudkowsky 05 April 2009 10:26PM




Comment author: Yvain 05 April 2009 11:43:53PM  17 points

And not only Obama. The closer you are to the center of human history, the more likely you are to be on a holodeck. People simulating others should be more likely to simulate people in historically interesting times, and people simulating themselves for fun and blocking their memory should be more likely to simulate themselves as close to interesting events as possible.

And...if Singularity theory is true, the Singularity will be the most interesting and important event in all human history. Now, all of us are suspiciously close to the Singularity, with a suspiciously large ability to influence its course. Even I, a not-too-involved person who's just donated a few hundred dollars to SIAI and gets to sit here talking to the SIAI leadership each night, am probably within the top millionth of humans who have ever lived in terms of Singularity "proximity".

And Michael Vassar and Eliezer are so close to the theorized center of human history that they should assume they're holodecking with probability ~1.

After all, which is more likely from their perspective - that they're one of the dozen or so people most responsible for creating the Singularity and ensuring Friendly AI, or that they're some posthuman history buff who wanted to know what being the guy who led the Singularity Institute was like?

(The alternate explanation, of course, is that we're all completely on the wrong track, and simply among the larger percentage of humans who think they're extremely important.)

Comment author: MBlume 05 April 2009 11:54:35PM  13 points

Still, I think that in most EU calculations, the weight of "holy crap this is improbable, how am I actually this important?" on the one side, and of "well, if I am this dude, I'd really better not @#$% this up" on the other should more or less scale together. I don't think I'm stepping into Pascalian territory here.
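The scaling claim above can be made concrete with a toy sketch. All the numbers below are made-up assumptions for illustration, not figures from the discussion: the point is only that if the prior improbability of really occupying the pivotal role and the utility at stake in that role shrink and grow in proportion, their product, the expected-utility contribution, stays the same order of magnitude.

```python
def eu_contribution(p_real, stakes):
    """Expected utility of acting carefully, given the chance the role is real."""
    return p_real * stakes

# Made-up numbers: the more improbable the role, the larger the stakes,
# in rough proportion.
ordinary = eu_contribution(p_real=1e-2, stakes=1e2)   # unremarkable person
pivotal  = eu_contribution(p_real=1e-8, stakes=1e8)   # "dozen most responsible"

print(ordinary, pivotal)  # → 1.0 1.0
```

Both contributions come out identical here, which is the intuition behind "the two weights more or less scale together": unlike a Pascal's-wager setup, the huge stakes are not being multiplied against an *unmatched* tiny probability.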