James_Miller comments on An Xtranormal Intelligence Explosion - Less Wrong

4 Post author: James_Miller 07 November 2010 11:42PM


Comment author: James_Miller 08 November 2010 02:32:41AM 1 point [-]

For almost any objective an AI had, more free energy would help it accomplish that objective better. The AI would likely go after the free energy being lost to entropy from both stars and people. It couldn't afford to put off killing people until after it had dealt with nearby stars, because by then humans would likely have created another AI god.

Comment author: Pavitra 08 November 2010 03:08:14AM 0 points [-]

Assuming that by "AI" you mean something that maximizes a utility function, as opposed to a dumb apocalypse like a grey-goo or energy virus scenario.

Comment author: bogdanb 08 November 2010 07:23:22AM *  3 points [-]

I can see how a “dumb apocalypse like a grey-goo or energy virus” would be Artificial, but why would you call it Intelligent?

On this site, unless otherwise specified, AI usually means “at least as smart as a very smart human”.

Comment author: Pavitra 08 November 2010 01:36:01PM 2 points [-]

Yeah, that makes sense. I was going to suggest "smart enough to kill us", but that's a pretty low bar.