Will_Newsome comments on Cult impressions of Less Wrong/Singularity Institute - Less Wrong Discussion

29 Post author: John_Maxwell_IV 15 March 2012 12:41AM

Comment author: XiXiDu 15 March 2012 10:54:49AM *  7 points [-]

There seem to be more and more people who think sacrificing their life to help build FAI is an ethical imperative.

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

It seems like nobody is doing anything they wouldn't have done anyway. I mean, I don't think Eliezer Yudkowsky or Luke Muehlhauser would lead significantly different lives if there were no existential risks. They are just the kind of people who enjoy doing what they do.

Are there people who'd rather play games all day but sacrifice their lives to solve friendly AI?

Comment author: Will_Newsome 19 March 2012 03:22:56AM *  9 points [-]

I have always been extremely curious about this. Do people really sacrifice their lives, or is it largely just empty talk?

I sacrificed some very important relationships, and the life that could have gone along with them, so I could move to California. And the only reason I really care about humans in the first place is because of those relationships, so...

This is the use of metaness: for liberation—not less of love but expanding of love beyond local optima.

— Nick Tarleton's twist on T.S. Eliot