Rain comments on Tallinn-Evans $125,000 Singularity Challenge - Less Wrong

27 Post author: Kaj_Sotala 26 December 2010 11:21AM


Comment author: JoshuaZ 29 December 2010 05:44:06AM 2 points [-]

This doesn't address the most controversial aspect, which is that AI would go foom. If extreme fooming doesn't occur, this isn't nearly as big an issue. Many people have discussed that question, and not all have come away convinced. Robin Hanson had a long debate with Eliezer over it, and Robin was not convinced. Personally, I consider fooming to be unlikely but plausible. But how likely one thinks it is matters a lot.

Comment author: Rain 29 December 2010 06:04:42PM 4 points [-]

This doesn't address the most controversial aspect, which is that [nuclear weapons] would [ignite the atmosphere]. If extreme [atmospheric ignition] doesn't occur this isn't nearly as big an issue.

Even without foom, AI is a major existential risk, in my opinion.