James_Miller comments on AGI Quotes - Less Wrong

Post author: lukeprog 02 November 2011 08:25AM


Comment author: James_Miller 02 November 2011 01:57:29PM 14 points

We probably make thousands of species extinct per year through our pursuit of instrumental goals; why is it so hard to imagine that AGI could do the same to us?

Michael Anissimov

Comment author: Logos01 04 November 2011 02:05:11AM 0 points

... I wonder how "alone" I am in the notion that AGI causing human extinction may not be a net negative: so long as it is a sentient product of human endeavors, it is essentially a "continuation" of humanity.

Comment author: JoshuaZ 04 November 2011 02:10:34AM 4 points

Two problems: An obnoxious optimizing process isn't necessarily sentient. And how much would you really want such a continuation if it, say, tried to put everything in its future lightcone into little smiley faces?

If it helps, ask yourself how you feel about a human empire that expands through its lightcone, preemptively destroying every single alien species before they can do anything, with the motto "In the Prisoners' Dilemma, Humanity Defects!" That sounds pretty bad, doesn't it? Now note that the AGI expansion is probably worse than that.

Comment author: Logos01 04 November 2011 02:13:46AM -2 points

Two problems: An obnoxious optimizing process isn't necessarily sentient.

Hence my caveat.

And how much would you really want such a continuation if it, say, tried to put everything in its future lightcone into little smiley faces?

I find the plausibility of a sentient AGI constrained to such a value to be vanishingly small.

If it helps, ask yourself how you feel about a human empire that expands through its lightcone, preemptively destroying every single alien species before they can do anything, with the motto "In the Prisoners' Dilemma, Humanity Defects!" That sounds pretty bad, doesn't it?

Not especially, no.

Comment author: JoshuaZ 04 November 2011 02:39:48AM 3 points

I find the plausibility of a sentient AGI constrained to such a value to be vanishingly small.

It is one example of what could happen; the smiley faces are just a specific illustration. (Moreover, it is one that is disturbingly close to some actual proposals.) The size of mindspace is probably vast, and the portion of mindspace that does something approximating what we want is probably a small fraction of it.

Not especially, no.

And the empire systematically wipes out human minorities and suppresses new scientific discoveries because they might disrupt stability. As a result, and to help prevent problems, everyone but a tiny elite is denied any form of life-extension technology. Even the elite have their lifespans extended only to about 130, to prevent anyone from accumulating too much power and threatening the ruling oligarchy. Similarly, new ideas for businesses are ruthlessly suppressed. Most people have less mobility in this setting than an American living today. Planets are ruthlessly terraformed and then have colonists forcibly shipped there to help start the new colonies. Most people have the equivalent of reality TV shows and the hope of winning the lottery to entertain themselves. Most of the population is so ignorant that they don't even realize that humans originally came from a single planet.

If this isn't clear: I'm trying to make this about as dystopian as I plausibly can. If I haven't succeeded at that, please imagine what you would think of as a terrible dystopia and apply that instead. If really necessary, imagine some puppy-and-kitten torturing too.