private_messaging comments on Some Thoughts on Singularity Strategies - Less Wrong

Post author: Wei_Dai 13 July 2011 02:41AM




Comment author: private_messaging 23 July 2012 05:58:53AM, -1 points

The discussion you link is purely ideological: pessimistic, narrow-minded cynicism about the human race (on Nesov's side) versus the normal view, without any justification whatsoever for either position.

The magical optimizer allows for space colonization (probably), cures for every disease, solutions to energy problems, and so on. We do not have as much room for intelligent improvement when it comes to destroying ourselves: the components for deadly diseases come pre-made by evolution, nuclear weapons have already been invented, etc. The capacity for destruction is bounded by what we have to lose (and we already have the capacity to lose everything), while the capacity for growth is bounded by the much larger value of what we may gain.

Sure, a magical friendly AI is better than anything else. A flying carpet is likewise better than a car.

When you focus so much on the notion that others are stupid, you forget how hostile the very universe we live in is, and you neglect how important it is to save ourselves from external factors. As long as viruses like the common cold and flu can exist and be widespread, it is only a matter of time until there is a terrible pandemic killing an enormous number of people (and potentially crippling the economy). We haven't even gotten rid of dangerous parasites yet; we aren't really at the top of the food chain, if you count parasites. We are also stuck on a rock hurtling through a space full of rocks, and we can't go anywhere.