AlexU comments on Real-Life Anthropic Weirdness - Less Wrong

Post author: Eliezer_Yudkowsky 05 April 2009 10:26PM

Comment author: AlexU 06 April 2009 01:42:53AM 5 points

Shouldn't the fact that they can probably imagine better versions of themselves reduce this probability? If you're in a holodeck, in addition to putting yourself at the center of the Singularity, why wouldn't you also give yourself the looks of Brad Pitt and the wealth of Bill Gates?

Comment author: JamesAndrix 06 April 2009 04:10:30AM * 16 points

We are actually in a 'chip-punk' version of the past in which silicon-based computers became available all the way back in the late 20th century. The original Eliezer made Friendly AI with vacuum tubes.

Comment author: Yvain 06 April 2009 09:50:49AM * 5 points
  1. Not if they are in a historical simulation. The real architects of the Singularity weren't billionaires.
  2. Not if they are in some kind of holo-game, for the same reason that people playing computer games don't hack them to make their character infinitely high-level and impervious to bullets. Where would be the fun in that?

Comment author: Hans 06 April 2009 10:22:30AM 2 points

Not really. Think of Nozick's experience machine. If you were to use the machine to simulate yourself in a situation extremely close to the center of the Singularity, would you also give yourself the looks of Brad Pitt and the wealth of Bill Gates?

a) Would this not make the experience feel so 'unreal' that your simulated self would have trouble believing it's not a simulation, and therefore not enjoy the simulation at all? In constructing the simulation, you need to decide how many positive attributes you can give your simulated self before it realizes that its situation is so improbable it must be a simulation. I'd err on the side of caution and not make my simulated self too 'lucky.'

b) More importantly, you may believe that a) doesn't apply, and that your simulated self would take the blue pill and willingly choose to continue living in the simulation. Even then, having great looks and great wealth would probably distract you from creating the Singularity. All I'd care about is the Singularity, so I'd design the simulation to give myself a comfortable, not-too-distracting life that would allow me to focus maximally on the Singularity, and nothing else.

Comment author: AlexU 06 April 2009 02:01:07PM * 5 points

I agree these are possibilities. However, it seems to me that if you're going to use improbable good fortune in some areas as evidence for being in a holodeck, it only makes sense to count misfortune (or at least lack of optimization, or below-averageness) in other areas as evidence against it. It doesn't sit well with me to write off every shortcoming as an intentional contrivance to make the simulation feel more "real," or to give you additional challenges. Of course, we're only talking about a priori probability here; if, say, Eliezer directly catalyzed the Singularity and found himself historically renowned, the odds would have to go way up.
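
To put the same point in toy Bayesian terms, here is a minimal sketch; the update function and every probability in it are invented purely for illustration, not actual estimates. The luck that raises P(holodeck) gets pulled back down by the ordinary parts of your life:

    # Toy Bayes update for the argument above: if improbable good fortune
    # counts as evidence FOR the holodeck hypothesis, then mediocrity or
    # misfortune elsewhere has to count as evidence AGAINST it.
    # All probabilities below are made up for illustration.

    def update(prior, p_obs_if_holodeck, p_obs_if_real):
        """Posterior P(holodeck | observation) by Bayes' rule."""
        joint_holodeck = prior * p_obs_if_holodeck
        joint_real = (1 - prior) * p_obs_if_real
        return joint_holodeck / (joint_holodeck + joint_real)

    p = 0.01  # prior P(holodeck)

    # Observation 1: you sit at the center of the Singularity
    # (rare in reality, common in a self-centered fantasy).
    p = update(p, p_obs_if_holodeck=0.9, p_obs_if_real=0.001)

    # Observation 2: you lack Brad Pitt's looks and Bill Gates's wealth
    # (common in reality, rarer in a fantasy you designed for yourself).
    p = update(p, p_obs_if_holodeck=0.2, p_obs_if_real=0.99)

    print(f"P(holodeck) after both observations: {p:.3f}")
    # Prints roughly 0.647: lower than the ~0.901 after observation 1 alone,
    # because the second observation counts against the holodeck hypothesis.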