timtyler comments on Q&A with Abram Demski on risks from AI - Less Wrong

22 Post author: XiXiDu 17 January 2012 09:43AM



Comment author: timtyler 17 January 2012 01:21:27PM 2 points

Anyway, AI would be lower on my list than global warming.

That is pretty ridiculous prioritisation - if you ask me.

Comment author: Emile 17 January 2012 01:43:01PM 3 points

It seems like a very reasonable position that

  • global warming is more likely to cause massive deaths than AI, but

  • AI is more likely to exterminate mankind than global warming.

Comment author: timtyler 17 January 2012 02:10:58PM 1 point

The term "existential risks" is in the question being asked. I think it should count as context.

Comment author: Emile 17 January 2012 02:53:24PM 1 point

True - though maybe some consider a "major catastrophe causing the collapse of civilization as we know it" to fall under existential risk, even if it would take much more than that to actually endanger mankind's survival.

I wonder whether Demski would actually assign a high probability to human extinction from global warming, or whether he was simply using a broad interpretation of "existential risk".

Comment author: abramdemski 18 January 2012 07:53:50PM 3 points

Yeah, I have to admit that when I wrote that I meant "lower on my list of concerns for the next century".

Comment author: timtyler 17 January 2012 03:04:00PM -2 points

Global warming is surely fluff - even reglaciation poses a bigger risk.