Will_Newsome comments on Your Most Valuable Skill - Less Wrong

28 Post author: Alicorn 27 September 2009 05:01PM


Comment author: Will_Newsome 23 September 2010 01:10:56AM 4 points [-]

Focusing on existential risk, I get to enjoy this less than I used to. Because such a disaster would so affect my and others' ability to achieve future utility, avoiding it is far more valuable than being content with failure. Shucks.

Also, coming from the Bayesian Buddhist perspective, I often think the same. The problem is that even without existential risks, there's still death and disease and destruction. Before I knew of existential risks, I was quite keen on destroying death. A life of quiet and content contemplation sounds nice, but perhaps the key is to lead that life while striving, rather than imagining an idealized world where one needn't strive? The Buddha was on a quest to save the world, too.

I think it helps some to remind myself that though we Singularitarians harp on about existential risks, the positive utility of winning is mind-boggling. We de-emphasize this to separate our perspective from the "Woo Singularity yeah!" crowd, but perhaps we go too far sometimes. Building the republic of heaven is a much happier thought than fighting to keep humanity from killing itself.