Roko comments on Shock Level 5: Big Worlds and Modal Realism - Less Wrong

15 [deleted] 25 May 2010 11:19PM


Comments (140)


Comment author: Yvain 26 May 2010 10:30:18AM *  25 points [-]

Does this theory really alter the probability that your next chocolate bar will turn into a hamster? After all, if there were only one of you, maybe there's a one-in-a-trillion chance that that one copy is in a simulation whose alien overlords will turn a chocolate bar into a hamster. If there are a trillion of you, and one of those trillion is in such a simulation, and your subjective experience has an equal chance of continuing down any branch, then the probability of the bar turning into the hamster is still one in a trillion. Although I've never seen a proof, intuitively you'd expect those two probabilities to be the same, or at least not be able to predict how they differ.
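(The arithmetic in the paragraph above can be sketched as follows. The one-in-a-trillion figure and the equal-weight continuation assumption are taken directly from the comment; `Fraction` is used only to keep the comparison exact.)

```python
from fractions import Fraction

# Single-you model: one observer, with a one-in-a-trillion chance of being
# in a hamster-transmuting simulation.
p_single = Fraction(1, 10**12)

# Big-world model: a trillion copies of you, exactly one of which is in
# such a simulation, with subjective experience continuing down each
# copy with equal weight.
copies = 10**12
simulated_copies = 1
p_big_world = Fraction(simulated_copies, copies)

# The two models assign the same probability to the bizarre event.
assert p_single == p_big_world
print(p_big_world)  # 1/1000000000000
```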

It all adds up to normality...except that this takes a lot of the oomph out of the project to reduce existential risk. Saving all humanity from destruction makes a much better motivator for me than reducing the percentage of branches of humanity that end in destruction by an insignificaaEEEEGH MY KEYBOARD JUST TURNED INTO A BADGER!!11asdaghf

Comment deleted 26 May 2010 11:36:44AM [-]
Comment author: Kutta 26 May 2010 01:20:49PM *  2 points [-]

It sounds a bit chicken-and-egg to me. My subjective probability estimate of the simulators' motivations comes in great part from the frequency and nature of observed bizarre events. Based on what I know about my universe, the vast majority of my simulators don't interfere with my physical laws.

Comment deleted 26 May 2010 02:30:58PM *  [-]
Comment author: Nick_Tarleton 26 May 2010 08:30:15PM *  3 points [-]

I hear things like this a lot, but I'm not sure I've heard a clear reason to think that the people the simulators (of a long-running, naturalistic simulation) are interested in should be more likely to be conscious, or should otherwise gain any sort of epistemological or metaphysical significance.

Comment deleted 26 May 2010 11:55:46PM [-]
Comment author: MichaelVassar 28 May 2010 04:41:09PM 4 points [-]

"interesting" is very much the wrong word though. More like informative regarding the optimization target that one cooperates by pursuing.

Comment author: Yvain 26 May 2010 04:46:04PM 0 points [-]

Isn't the measure of the set of me not in simulations (in a big world) equal to the probability that I'm not in a simulation (if there's only one of me)?