Birgitte comments on Not Taking Over the World - Less Wrong

Post author: Eliezer_Yudkowsky 15 December 2008 10:18PM


Comment author: Birgitte 16 December 2008 03:26:32AM 3 points [-]

Eliezer: Let's say that someone walks up to you and grants you unlimited power.

Let's not exaggerate. A singleton AI wielding nanotech is not unlimited power; it is merely a Big Huge Stick with which to apply pressure to the universe. It may be the biggest stick around, but it still operates under the very real limitations of physics - and every inch of potential control comes at the cost of additional invasiveness.

Probably the closest we could come to unlimited power would be pulling everything except the AI into a simulation, and allowing for arbitrary amounts of computation between each tick.

Billy Brown: If you give each individual whatever they want you’ve just destroyed every variety of collectivism or traditionalism on the planet, and those who valued those philosophies will curse you.

That's probably not the worst tradeoff: being cursed only by those who feel their values should take precedence over those of other people.