KatjaGrace comments on Superintelligence 29: Crunch time - Less Wrong

8 Post author: KatjaGrace 31 March 2015 04:24AM


Comment author: KatjaGrace 31 March 2015 04:28:45AM 5 points [-]

Are you concerned about AI risk? Do you do anything about it?

Comment author: Sebastian_Hagen 31 March 2015 09:25:42PM 5 points [-]

It's the most important problem of this time period, and likely of human civilization as a whole. I donate a fraction of my income to MIRI.

Comment author: timeholmes 02 April 2015 10:37:00PM 2 points [-]

I'm very concerned with this risk, which I feel sits at the top of catastrophic risks to humanity. With an approaching asteroid, at least we know what to watch for! As an artist, I've been working mostly on this for the last 3 years (see my TED talk "The Erotic Crisis" on YouTube), trying to think of ways to raise awareness and engage people in dialogue. The more discussion, the better I feel! And I'm very grateful for this forum and all who participate!