James_Miller comments on Real-Life Anthropic Weirdness - Less Wrong

Post author: Eliezer_Yudkowsky 05 April 2009 10:26PM

Comment author: James_Miller 05 April 2009 11:01:28PM 14 points

So with what probability should Barack Obama believe he is on a holodeck, and how should this belief influence his behavior?

Comment author: MBlume 05 April 2009 11:19:21PM 9 points

I don't think it should influence his behavior very much. Even if he assigns strong probability to being in a holodeck, his expected utility calculations should, I think, be dominated by the case in which he is in fact PotUS, since a president is in a better position to purchase utility.

Comment author: Yvain 05 April 2009 11:43:53PM 17 points

And not only Obama. The closer you are to the center of human history, the more likely you are to be on a holodeck. People simulating others should be more likely to simulate people in historically interesting times, and people simulating themselves for fun and blocking their memory should be more likely to simulate themselves as close to interesting events as possible.

And...if Singularity theory is true, the Singularity will be the most interesting and important event in all human history. Now, all of us are suspiciously close to the Singularity, with a suspiciously large ability to influence its course. Even I, a not-too-involved person who's just donated a few hundred dollars to SIAI and gets to sit here talking to the SIAI leadership each night, am probably within the top millionth of humans who have ever lived in terms of Singularity "proximity".

And Michael Vassar and Eliezer are so close to the theorized center of human history that they should assume they're holodecking with probability ~1.

After all, which is more likely from their perspective - that they're one of the dozen or so people most responsible for creating the Singularity and ensuring Friendly AI, or that they're some posthuman history buff who wanted to know what being the guy who led the Singularity Institute was like?

(The alternate explanation, of course, is that we're all on the completely wrong track, and that we're simply among the much larger group of humans who think they're extremely important.)
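Yvain's counting argument can be sketched with toy numbers. Both figures below are pure assumptions for illustration — the "dozen or so" is the only number taken from the comment; the count of simulating history buffs is invented:

```python
# Toy anthropic count (all numbers hypothetical): compare the number of
# "real" Singularity architects with the assumed number of posthuman
# history buffs simulating the experience of being one.
real_architects = 12          # "the dozen or so people" from the comment
simulating_buffs = 1_000_000  # pure assumption for illustration

# If every such observer feels identical from the inside, the chance a
# given observer is one of the simulated ones is just the head count ratio.
odds_simulated = simulating_buffs / (simulating_buffs + real_architects)
print(round(odds_simulated, 5))  # 0.99999
```

The force of the argument depends entirely on the assumed ratio of simulators to originals, which is exactly what the later comments in this thread dispute.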

Comment author: AnnaSalamon 06 April 2009 03:44:30AM 12 points

And Michael Vassar and Eliezer are so close to the theorized center of human history that they should assume they're holodecking with probability ~1.

The "with probability ~1" part is wrong, AFAICT. I'm confused about how to think about anthropics, and as far as I've noticed, everybody I've talked to is confused as well. Given this confusion, we can perhaps obtain simulation-probabilities by estimating the odds that our best-guess method of calculating anthropic probabilities is reliable, and then estimating the probability that we're in a holodeck conditional on that method being correct. But it would be foolish to assign more than, say, a 90% estimate to "our best-guess method of calculating anthropic probabilities is basically correct", unless someone has a better analysis of such methods than I'd expect.
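This point can be made concrete with a back-of-the-envelope calculation. The only number taken from the comment is the 90% cap on trusting the anthropic method; the other probabilities are hypothetical placeholders:

```python
# Even if our anthropic method, taken at face value, says
# P(holodeck | method correct) ~ 1, uncertainty about the method itself
# caps the overall estimate well below 1.
p_method_correct = 0.9            # the 90% cap suggested in the comment
p_holodeck_given_correct = 0.999  # hypothetical: method says "~1"
p_holodeck_given_wrong = 0.5      # hypothetical ignorance prior otherwise

# Total probability, marginalizing over whether the method is correct.
p_holodeck = (p_method_correct * p_holodeck_given_correct
              + (1 - p_method_correct) * p_holodeck_given_wrong)
print(round(p_holodeck, 3))  # 0.949 -- short of "probability ~1"
```

Whatever the method outputs, the final estimate can never exceed confidence in the method plus the ignorance prior on the remainder.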

Comment author: MBlume 05 April 2009 11:54:35PM 13 points

Still, I think that in most EU calculations, the weight of "holy crap this is improbable, how am I actually this important?" on the one side, and of "well, if I am this dude, I'd really better not @#$% this up" on the other should more or less scale together. I don't think I'm stepping into Pascalian territory here.

Comment author: James_Miller 05 April 2009 11:53:56PM 3 points

The idea of eternal inflation might cut against this. Under eternal inflation, new universes are being created at an exponentially increasing rate, so at any moment there are far more young universes than old ones. So, under this theory, if you are uncertain whether you are at a relatively early (pre-Singularity) or relatively late (post-Singularity) point in your universe's history, you are almost certainly at the relatively early point, because there are so many more universes in that state.
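A toy version of this counting argument, assuming (purely for illustration) that the universe-creation rate doubles each epoch:

```python
# Toy eternal-inflation model (assumption: creation rate doubles each epoch).
# After N epochs, compare universes born in the latest epoch with all
# universes born earlier, combined.
N = 50
births = [2 ** t for t in range(N)]  # universes created in epoch t
recent = births[-1]                  # the newest cohort
older = sum(births[:-1])             # all older cohorts combined
print(round(recent / (recent + older), 6))  # 0.5: the newest cohort alone is half
```

However steep the exponential, the youngest cohorts always dominate the count — which is the sense in which an observer uncertain of their epoch should bet on being early.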

Note: Eliezer and Robin object to this idea for reasons I don't understand.

Comment author: RobinHanson 06 April 2009 12:16:08AM 2 points

James, I don't think inflation implies there are more early than late universes, nor do I object to inflation. I just don't think inflation solves time-asymmetry.

Comment author: AlexU 06 April 2009 01:42:53AM 5 points

Shouldn't the fact that they can probably imagine better versions of themselves reduce this probability? If you're in a holodeck, in addition to putting yourself at the center of the Singularity, why wouldn't you also give yourself the looks of Brad Pitt and the wealth of Bill Gates?

Comment author: JamesAndrix 06 April 2009 04:10:30AM 16 points

We are actually in a 'chip-punk' version of the past in which silicon based computers became available all the way back in the late 20th century. The original Eliezer made friendly AI with vacuum tubes.

Comment author: Yvain 06 April 2009 09:50:49AM 5 points

  1. No if they are in a historical simulation. The real architects of the Singularity weren't billionaires.

  2. No if they are in some kind of holo-game, for the same reason that people playing computer games don't hack them to give their character infinite levels and immunity to bullets. Where would be the fun in that?

Comment author: Hans 06 April 2009 10:22:30AM 2 points

Not really. Think of Nozick's experience machine. If you were to use the machine to simulate yourself in a situation extremely close to the center of the singularity, would you also give yourself the looks of Brad Pitt and the wealth of Bill Gates?

a) Would this not make the experience feel so 'unreal' that your simulated self would have trouble believing it's not a simulation, and therefore not enjoy the simulation at all? In constructing the simulation, you need to define how many positive attributes you can give your simulated self before it realizes that its situation is so improbable that it must be a simulation. I'd use caution and not make my simulated self too 'lucky.'

b) More importantly, you may believe that a) doesn't apply, and that your simulated self would take the blue pill, and willingly choose to continue to live in the simulation. Even then, having great looks and great wealth would probably distract you from creating the singularity. All I'd care about is the singularity, and I'd design the simulation so that I have a comfortable, not too distracting life that would allow me to focus maximally on the singularity, and nothing else.

Comment author: AlexU 06 April 2009 02:01:07PM 5 points

I agree these are possibilities. However, it seems to me that if you're going to use improbable good fortune in some areas as evidence for being in a holodeck, it only makes sense to use misfortune (or at least lack of optimization, or below-averageness) in other areas as evidence against it. It doesn't sit well with me to write off every shortcoming as an intentional contrivance to make the simulation more "real" for you, or to give you additional challenges. Of course, we're only talking a priori probability here; if, say, Eliezer directly catalyzed the Singularity and found himself historically renowned, the odds would have to go way up.

Comment author: RobinHanson 06 April 2009 02:17:46AM 3 points

The alternate explanation is of course far more likely a priori.

Comment author: Eliezer_Yudkowsky 06 April 2009 04:50:25AM 0 points

How likely is it that, say, at least 10 people believe they're Barack Obama, of whom only one is correct?

Comment author: RobinHanson 06 April 2009 12:22:02PM 2 points

Being mistaken about your importance is different from, and much more common than, being mistaken about who/where you are.

Comment author: Eliezer_Yudkowsky 06 April 2009 12:27:30PM 2 points

Unless most conscious observers are ancestor simulations of people in positions of historical importance, in which case most people are correct about the importance of the position and incorrect about who/where they are.

(Vide Doomsday Argument, Simulation Argument, and the "surprise" of finding yourself on Ancient Earth rather than much later in a civilization's development. Of course these are all long-standing controversies in anthropics, I'm just raising their existence.)

Among people who believe themselves to be Barack Obama, most are mistaken about their position rather than the importance of the position.

Comment author: RobinHanson 06 April 2009 03:31:38PM 1 point

Agreed.

Comment author: gjm 06 April 2009 08:23:39AM 1 point

Not all that unlikely. There have certainly been a lot of people who have believed themselves to be Napoleon or Jesus. I'd say 10 Obamas seems a little right now, but I wouldn't be at all surprised by, say, three.

Comment author: gjm 06 April 2009 08:44:03PM 1 point

"seems a little MUCH right now", I meant.

Comment author: Unknowns 02 June 2010 12:41:34PM -1 points

Note that the alternate explanation is MUCH more probable.

Comment author: CronoDAS 06 April 2009 01:50:07PM 0 points