FrankAdamek comments on Cookies vs Existential Risk - Less Wrong

Post author: FrankAdamek 30 August 2009 03:56AM


Comments (22)


Comment author: FrankAdamek 31 August 2009 03:06:44PM 0 points

This is an interesting idea that seems worth looking into. Do you have sources, links, etc.? It could certainly be helpful to draw attention to risk mitigation that is done for short-term reasons; it might be easier to get people to work on.

Comment author: rwallace 31 August 2009 06:22:03PM 1 point

I don't have sources to hand, but here's a post I wrote about the negative side: http://lesswrong.com/lw/10n/why_safety_is_not_safe/

On the positive side, consider playing video games: an activity certainly carried out for short-term reasons, yet one of the major sources of funding for the development of higher-performance computers, an important ingredient in just about every kind of technological progress today.

Or consider how much research in medicine (another key long-term technology) is paid for by individual patients in the present day with the very immediate concern that they don't want to suffer and die right now.

Comment author: FrankAdamek 02 September 2009 04:09:19AM 0 points

I don't think lack of hardware progress is a major problem in avoiding existential disaster.

I read your post, but I don't see why a lack of understanding of certain past events should lead us to devalue our current best estimates of ways to reduce danger. I wouldn't be remotely surprised if there are dangers we don't (yet?) understand, but why presume an unknown danger isn't localized in the same areas as known dangers? Keep in mind that reversed stupidity is not intelligence.

Comment author: rwallace 02 September 2009 11:47:44AM 0 points

Because it has empirically turned out not to be. Reversed stupidity is not intelligence, but it is avoidance of stupidity. When we know a particular source gives wrong answers, that doesn't tell us the right answers, but it does tell us what to avoid.