NancyLebovitz comments on Open Thread for January 8 - 16 2014 - Less Wrong

Post author: tut, 08 January 2014 12:14PM (5 points)


Comment author: NancyLebovitz 09 January 2014 12:30:40AM, 9 points

If you're expecting the singularity within a century, does it make sense to put any thought into eugenics except for efforts to make it easy to avoid the worst genetic disorders?

Comment author: Nornagest 09 January 2014 01:05:04AM, 5 points

That seems to depend on a number of assumptions -- your timeline, whether you expect a soft or a hard takeoff, the centrality of raw intelligence vs. cultural effects to research quality, possible nonlinearity of network effects on intellectual output. But I'd bet that the big one is time: if you think (unrealistically, but run with it) that you can improve a test population's intelligence by 50%, that could be very significant if you're expecting a 2100 singularity but likely won't be if you're expecting one before they graduate from college.

Comment author: David_Gerard 09 January 2014 11:12:22AM, 4 points

This could be generalised to putting any thought into anything. Will the singularity be achieved within one childhood? If not, more smart people may be useful to apply to the problem. If you're smart, make more smart people.

Comment author: Manfred 09 January 2014 04:21:36AM, 2 points

Good point. The cutoff is not necessarily the singularity, either -- once we have sufficiently awesome genetic engineering, there's no point to eugenics anyway.