Eliezer_Yudkowsky comments on Why safety is not safe - Less Wrong

48 Post author: rwallace 14 June 2009 05:20AM




Comment author: asciilifeform 14 June 2009 03:11:40PM  3 points

Would you have hidden it?

You cannot hide the truth forever. Nuclear weapons were an inevitable technology. Likewise, whether or not Eurisko was genuine, someone will eventually cobble together an AGI — especially if Eurisko was genuine and the task really is that easy. The fact that you seem persuaded Lenat danced on the edge of creating a hard takeoff gives me more interest than ever in a re-implementation.

Reading "Value is Fragile" almost persuaded me that blindly pursuing AGI is wrong, but shortly afterward, "Why Safety Is Not Safe" reverted me to my usual position: stagnation is as real and immediate a threat as ever there was, vastly dwarfing any hypothetical existential risk from rogue AI.

For instance, bloat and out-of-control accidental complexity have essentially halted all basic progress in computer software. I believe that the lack of quality programming systems will lead (and may already have led) directly to stagnation in other fields, such as computational biology. The near-term future appears to resemble Windows Vista rather than HAL. Engelbart's Intelligence Amplification dream has been lost in the noise. I thus expect civilization to succumb to Natural Stupidity in the near term, unless a drastic reversal of these trends takes place.

Comment author: Eliezer_Yudkowsky 15 June 2009 01:26:52AM  8 points

Would you have hidden it?

I hope so. It was the right decision in hindsight, since the Nazi nuclear weapons program shut down when the Allies, at the cost of some civilian lives, destroyed their heavy-water (deuterium) supply. If they'd known they could have used purified graphite instead... well, they probably still wouldn't have gotten nuclear weapons in this Everett branch, but they might have somewhere else.

Before 2001 I would probably have been on Fermi's side, but that's when I still believed, deep down, that no true harm could come to someone who was only faithfully trying to do science. (I.e., supervised-universe thinking.)