CarlShulman comments on Bioconservative and biomoderate singularitarian positions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Nick Bostrom's paper says that in the long run we should expect extinction, stagnation, posthumanity, or oscillation. But he classifies a global state that uses social control technologies (ubiquitous surveillance, lie detectors, advanced education techniques) to maintain a steady technology level, without generally superintelligent machines, as falling into the radical-change category.
What strong bioconservatism needs in order to work is a global order/singleton that can maintain its values, not necessarily superintelligent software entities.
Also, while I like the post, I wonder if it would work better for your own blog than Less Wrong, since it doesn't really draw on or develop rationality ideas very much.
It seems that what is getting voted up at the moment is mainly generic rationality stuff, not future- or planning-oriented stuff (not that I ever expected my stuff to get voted up).
Generic rationality is maybe the only thing we all share, and worrying about the future is perhaps only a minority pursuit on LW now.
I think it is appropriate to have (non-promoted) articles on side topics of interest to large segments of the Less Wrong community.
Good point about the option on promotion.