In response to Failed Utopia #4-2
Comment author: Bogdan_Butnaru2 21 January 2009 01:56:36PM 12 points [-]

That's not the message Eliezer tries to convey, Russell.

If I understood it correctly, it's more like "The singularity is sure to come, and transhumanists should try very hard to guide it well, lest Nature just step on them and everyone else. Oh, by the way, it's harder than it looks. And there's no help."

In response to Failed Utopia #4-2
Comment author: Bogdan_Butnaru2 21 January 2009 01:51:32PM 6 points [-]

I was just thinking: a quite perverse twist in the story would be if the genie actually _could_ have been stopped and/or improved: that is, its programming allowed it to be reprogrammed (and to stop being evil, presumably leading to better results), but due to the (possibly complex) interaction among its 107 rules it didn't actually have any motivation to reveal that fact (or to teach someone the necessary theory) before 90% of people decided to kill it.