Thomas comments on Less Wrong Q&A with Eliezer Yudkowsky: Video Answers - Less Wrong

41 Post author: MichaelGR 07 January 2010 04:40AM




Comment author: Kevin 08 January 2010 02:09:26AM *  6 points [-]

20: What is the probability that this is the ultimate base layer of reality?

Eliezer gave the joke answer to this question, because this is something that seems impossible to know.

However, I myself assign a significant probability that this is not the base level of reality. Theuncertainfuture.com tells me that my assigned probability of AI approaches 99% before 2070. So why would I be likely to be living as an original human circa 2000, when transhumans will be running ancestor simulations? I suppose it's possible that transhumans won't run ancestor simulations, but I would want to run them: I would want my merged transhuman mind to assimilate the knowledge gained from running a human consciousness of myself through interesting points in human history.

The zero one infinity rule also makes it seem more unlikely this is the base level of reality. http://catb.org/jargon/html/Z/Zero-One-Infinity-Rule.html

It seems rather convenient that I am living in the most interesting period in human history. Not to mention I have a lifestyle in the top 1% of all humans living today.

I believe this is a minority viewpoint here, so my rationalist calculus is probably wrong. Why?

Comment author: Thomas 08 January 2010 07:05:01PM 4 points [-]

If the probability that you are inside a simulation is p, what's the probability that your master simulator is also simulated?

How tall is this tower, most likely?

Comment author: Cyan 08 January 2010 07:54:47PM *  1 point [-]

Being in a simulation within a simulation (nested to any level) implies being in a simulation. The proper decomposition is p = sum over all positive N of (probability of a simulation nested to exactly level N).
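A toy illustration of this decomposition (my assumptions, not Cyan's): suppose each layer of reality spawns a child simulation with some fixed probability q, so an observer sits at nesting depth N with probability (1 - q) * q**N. Summing the depth-N terms over all positive N then recovers the total probability p of being in a simulation at any depth:

```python
def p_simulated(q: float, max_level: int = 1000) -> float:
    """Probability of being at nesting depth >= 1, i.e. inside a simulation.

    Toy model: an observer is at depth N with probability (1 - q) * q**N,
    a geometric distribution over levels. Summing over N = 1, 2, ...
    (truncated at max_level; the tail is negligible for q < 1) gives
    Cyan's decomposition, which converges to q.
    """
    return sum((1 - q) * q**n for n in range(1, max_level + 1))

print(p_simulated(0.5))  # converges to 0.5 in this toy model
```

Under this (entirely assumed) model the sum collapses to q itself, so the single parameter "how often does a layer simulate a child" carries the whole answer.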

Comment author: Thomas 08 January 2010 10:15:47PM 3 points [-]

The top simulator has N operations to execute before his free enthalpy basin is empty.

Every level down, this number is smaller. Before long, it becomes impossible to create a nontrivial simulation inside the current one. That is the bottom level.
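A toy version of this argument (my numbers and overhead factor, not Thomas's): if the top level has n_ops operations' worth of free enthalpy, and each step down the tower costs an overhead factor k > 1 per simulated operation, then level d has only n_ops / k**d operations available, and the tower bottoms out at logarithmic depth:

```python
import math

def max_tower_depth(n_ops: float, overhead: float, min_ops: float) -> int:
    """Deepest level that still has at least min_ops operations to spend.

    Level d has n_ops / overhead**d operations, so the deepest viable
    level satisfies d <= log(n_ops / min_ops) / log(overhead).
    """
    return math.floor(math.log(n_ops / min_ops) / math.log(overhead))

# Assumed figures for illustration: 1e120 operations at the top (a rough
# upper bound sometimes quoted for the observable universe), 7x overhead
# per level, and 1e20 operations as the floor for a nontrivial simulation.
print(max_tower_depth(1e120, 7.0, 1e20))
```

The point survives any particular choice of numbers: because the budget shrinks geometrically, even an astronomically large top-level budget supports only a tower of modest, logarithmic depth.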

This simulation tower is just a great way to squander all the free enthalpy you have. Is the top simulation master that stupid?

I doubt it.

Comment author: Kevin 09 January 2010 06:13:32AM *  -1 points [-]

In that sense, there's actually a significant risk to the singularity. Why should the simulation master (I usually facetiously use the phrase "our overlords" when referring to this entity) ever let us run a simulation that is likely to result in an infinitely nested simulation? Maybe that's why the LHC keeps blowing up.

Comment author: DanArmak 08 January 2010 11:49:51PM *  1 point [-]

You also need to include scenarios for infinitely-high towers, or closed-loop towers, or branching and merging networks, or one simulation being run in several (perhaps infinitely many) simulating worlds, or the other way around...

I don't think we can assign a meaningful prior to any of these, and so we can't calculate the probability of being in a simulation.

Comment author: Kevin 09 January 2010 06:15:19AM 0 points [-]

I don't think the probability calculation is meaningful, because the infinities mess it up. But you still need to ask: are you in the original 2010, or in one of infinitely many possible simulated 2010s? I can't assign a probability, but I have a strong intuition when comparing one to infinity.