steven comments on Contaminated by Optimism - Less Wrong

Post author: Eliezer_Yudkowsky 06 August 2008 12:26AM


Comment author: steven 06 August 2008 08:38:38AM 0 points

Do you devote a significant amount of your time and resources to making paperclips, given the possibility that you're being simulated?

Keeping everyone alive would not take a significant amount of a paperclip maximizer's time and resources. (Though for utilitarians this probably means it doesn't count.) But the key difference is this: human-like goal systems seem likely to gain access to far more simulation resources than a paperclip maximizer or any other specific human-indifferent goal system. (The set of all human-indifferent goal systems taken together is a different matter, but they're not a coherent bloc.)