bokov comments on Blind Spot: Malthusian Crunch - Less Wrong

4 Post author: bokov 18 October 2013 01:48PM


Comments (184)


Comment author: bokov 19 October 2013 03:41:25AM * 2 points

I like the idea of space colonization, but it's not clear that it's a practical, let alone robust, way to get our eggs into more baskets.

I read somewhere that to calibrate your intuition about the logistics of getting everyone off Earth, you should estimate how much it would cost and how long it would take to load every human onto a passenger jet and fly them all to the same continent. I wish I could find that essay. Long story short, it would take an enormous amount of resources. So it probably won't be our eggs in particular getting into more baskets, but at least the eggs of some fellow humans.
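The flavor of that thought experiment is easy to reproduce with a back-of-envelope calculation. All numbers below are illustrative assumptions of mine (rough 2013 world population, a generously packed wide-body jet), not figures from the essay:

```python
# Hypothetical back-of-envelope: how many jet flights to move everyone?
# All inputs are illustrative assumptions, not sourced figures.
population = 7_000_000_000   # rough world population circa 2013
seats_per_jet = 400          # a densely configured wide-body airliner

flights_needed = population // seats_per_jet
print(f"{flights_needed:,} one-way flights")  # -> 17,500,000 one-way flights
```

Seventeen-odd million flights just to cross an ocean, with no life support, propellant, or destination infrastructure in the picture — which is the essay's point about lifting everyone out of the gravity well.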

On existential risk overall, my reading on AI has been pushing me toward the view that a global-warming-driven civilizational collapse may actually be our best hope for the future, if only it can happen fast enough to prevent the development of a superintelligence.

I see two outcomes: either there are enough exploitable resources left to rebuild a technological civilization, in which case someone will get back to pursuing superintelligence, or there are not enough, in which case we piss away our last days throwing spears and dying of dysentery. Or maybe we evolve into non-tool-using creatures, as in Galapagos. In any case, the left-hand side of the Drake equation remains at zero. Breaking out of the overshoot/collapse cycle means risking going out with a bang, but the alternative is the certainty of going out with a whimper.
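The Drake-equation point is just a property of products: N = R* · f_p · n_e · f_l · f_i · f_c · L, so if any factor is zero — here, L, the average lifetime of a communicating civilization — N is zero regardless of how generous the others are. A minimal sketch, with entirely made-up values for every factor:

```python
# Drake equation as a product of factors; all numeric inputs below are
# made-up illustrative values, not estimates from the comment thread.
def drake(r_star, f_p, n_e, f_l, f_i, f_c, lifetime):
    """Expected number of detectable civilizations in the galaxy."""
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

# A civilization lifetime of zero zeroes out the whole product,
# no matter how optimistic the other factors are.
print(drake(7, 0.5, 2, 0.33, 0.01, 0.1, 0))  # -> 0.0
```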

Comment author: ThisSpaceAvailable 23 October 2013 06:20:45PM 0 points

As far as x-risk is concerned, we all have the same eggs.