Johnicholas comments on Anthropomorphic AI and Sandboxed Virtual Universes - Less Wrong

-3 Post author: jacob_cannell 03 September 2010 07:02PM




Comment author: Johnicholas 03 September 2010 08:00:34PM 3 points

A paraphrase from Greg Egan's "Crystal Nights" might be appropriate here: "I am going to need some workers - I can't do it all alone; someone has to carry the load."

Yes, if you could create a universe you could inflict our problems on other people. However, recursive solutions (in order to be solutions rather than infinite loops) still need to make progress on the problem.
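The point about recursion can be sketched in code. This is a minimal toy illustration (the integer "problem size" and the function names are hypothetical stand-ins, not anything from the original discussion): a recursive call that hands the *same* problem to the next level never terminates, while one that shrinks the problem eventually reaches a base case.

```python
def solve(problem_size):
    """Toy recursive solver: `problem_size` is a non-negative integer
    standing in for 'remaining work'."""
    if problem_size == 0:
        return "solved"  # base case: the recursion has made full progress
    # Calling solve(problem_size) here - the same problem, unchanged -
    # would recurse forever. Shrinking the problem guarantees termination.
    return solve(problem_size - 1)

print(solve(3))  # -> solved
```

The analogy to the comment: a simulated universe that solves its problem by building another identical simulated universe is the `solve(problem_size)` case, an infinite loop rather than a solution.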

Comment author: jacob_cannell 03 September 2010 10:11:00PM 3 points

Yes, and I discussed how you could alter some aspects of reality to make AI itself more difficult within the simulated universe. This would effectively push back the date at which the simulated universe develops AI of its own, avoiding the waste of computational resources on pointless simulated recursion.

And as mentioned, attempting to simulate an entire alternate Earth is only one possibility. There are numerous created-world routes, familiar from science fiction, that could constrain and focus the sims on particular research topics or endeavors.

Comment author: jacob_cannell 03 September 2010 08:12:49PM 1 point

Progress on what problem?

The entire point of creating AI is to benefit mankind, is it not? How is this scenario intrinsically different?

Comment author: Snowyowl 03 September 2010 09:56:21PM 0 points

Johnicholas is suggesting that if you create a simulated universe in the hope that it will provide ill-defined benefits for mankind (e.g. a cure for cancer), you have to exclude the possibility that your AIs will make a simulated universe inside the simulation in order to solve the same problem. Because if they do, you're no closer to an answer.

Comment author: jacob_cannell 03 September 2010 10:07:32PM 1 point

Ah, my bad - I misread him.