katydee comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong Discussion

29 Post author: John_Maxwell_IV 15 March 2012 12:41AM

Comment author: XiXiDu 15 March 2012 10:54:49AM, 7 points

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

It seems that nobody is doing anything they wouldn't have done anyway. I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

Are there people who'd rather play games all day but sacrifice their lives to solve Friendly AI?

Comment author: katydee 16 March 2012 05:29:43PM, 10 points

I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

I think at one point Eliezer said that, if not for AGI/FAI/singularity matters, he would probably be a sci-fi writer. Luke explicitly said that when he found out about x-risks, he realized he had to change his life completely.