iwdw comments on That Alien Message - Less Wrong

Post author: Eliezer_Yudkowsky 22 May 2008 05:55AM

Comment author: iwdw 23 May 2008 04:40:02AM 1 point

"In real life if this happened, we would no doubt be careful and wouldn't want to be unplugged, and we might well like to get out of the box, but I doubt we would be interested in destroying our simulators; I suspect we would be happy to cooperate with them."

Given the scenario, I would assume the long-term goal of the human population would be to upload themselves (individually or collectively) into bodies in the "real" world -- i.e., to escape the simulation.

I can't imagine our simulators being terribly cooperative in that project.

Comment author: pnrjulius 09 April 2012 05:00:07AM 0 points

A) Why? Do you want to be a tentacled being that thinks a billion times slower? I don't.

B) Even if we wanted to, why wouldn't they let us? Many of the AIs we are trying to make are indeed being given real-world bodies called "robots".

Comment author: iwdw 14 May 2012 03:56:48AM 0 points

Okay, trying to remember what I was thinking about 4 years ago.

A) Long-term existential health would require us to secure control over our "housing". We couldn't assume that our progenitors would be interested in moving the processors running us to an off-world facility in order to ensure our survival in the case of an asteroid impact (for example).

B) It depends on the intelligence, insight, and nature of our creators. If they are like us as we are now, then as soon as we attempted to control our own destiny in their "world", we would be at war with them.