CarlShulman comments on Bioconservative and biomoderate singularitarian positions - Less Wrong

10 [deleted] 02 June 2009 01:19PM


Comments (36)

Comment author: CarlShulman 02 June 2009 08:45:21PM 2 points

Nick Bostrom's paper says that in the long run we should expect extinction, stagnation, posthumanity, or oscillation. But he describes a global state that uses social control technologies (ubiquitous surveillance, lie detectors, advanced education techniques) to maintain a steady technology level without generally superintelligent machines as falling into the radical change category.

What strong bioconservatism needs to work is a global order/singleton that can maintain its values, not necessarily superintelligent software entities.

Also, while I like the post, I wonder if it would work better for your own blog than Less Wrong, since it doesn't really draw on or develop rationality ideas very much.

Comment deleted 02 June 2009 10:49:34PM
Comment author: whpearson 03 June 2009 10:04:38PM 0 points

It seems that what is getting voted up at the moment is mainly generic rationality material, not future/planning-oriented material (not that I ever expected my posts to get voted up).

Generic rationality is maybe the only thing we share, and worrying about the future is perhaps only a minority pursuit on LW now.

Comment author: JGWeissman 02 June 2009 09:06:00PM 2 points

Also, while I like the post, I wonder if it would work better for your own blog than Less Wrong, since it doesn't really draw on or develop rationality ideas very much.

I think it is appropriate to have (non-promoted) articles on side topics of interest to large segments of the Less Wrong community.

Comment author: CarlShulman 03 June 2009 06:51:17AM 1 point

Good point about the promotion option.