gwern comments on How inevitable was modern human civilization - data - Less Wrong

Post author: taw 20 August 2009 09:42PM




Comment author: Simon_Jester 26 August 2009 09:25:50AM 0 points

A machine-phase civilization might still find (3a) or (3b) an issue, depending on whether nanotech pans out. We think it will, but we don't really know, and many technologies turn out to be far less capable than optimists in their infancy expected. Science fiction authors in the '40s and '50s predicted that atomic power sources would be heavily miniaturized (amusingly, more so than computing devices); that never happened, and it looks like the minimum size of a reasonably safe nuclear reactor really is a large piece of industrial machinery.

If nanotech does what its greatest enthusiasts expect, then the minimum size of the industrial base you need to bootstrap a new technological civilization in a completely undeveloped solar system is small (I don't know, probably in the 10-1000 ton range), in which case the payload for your starship is small enough that you might be able to convince people to help you build and launch it. Extremely capable nanotech also helps on the launch end, by making it easier to organize the industrial resources to build the ship.

But if nanotech doesn't operate at that level, and you actually need to carry machine tools, stockpiles of exotic materials unlikely to be found in asteroid belts, and so on, then things could be expensive enough that at any point in its history a civilization can think of something more interesting to do with the resources an interstellar colony ship would require. Again, if the construction cost of the ship is an order of magnitude greater than the gross planetary product, it won't get built, especially if very few people actually want to ride it.

Also, could you define "singleton" for me, please?

Comment author: gwern 26 August 2009 11:42:06AM -1 points

'Singleton', as I've seen it used, seems to be one possible Singularity: one in which a single AI absorbs everyone and everything into itself as a single colossal entity. We'd probably consider it a Bad Ending.

Comment author: Vladimir_Nesov 26 August 2009 11:54:07AM 2 points

See Nick Bostrom (2005), "What is a Singleton?"

A singleton is a more general concept than an intelligence explosion. The specific case of a benevolent AGI singleton, a.k.a. FAI, is not a bad ending. Think of it as Nature 2.0, a supervised universe, not a dictator.

Comment author: gwern 26 August 2009 12:41:45PM 0 points

I stand corrected! Maybe this should be a wiki article - the term isn't that common, but it's awfully hard to Google.

Comment author: Vladimir_Nesov 26 August 2009 02:03:16PM 3 points