PhilGoetz comments on Exterminating life is rational - Less Wrong

17 Post author: PhilGoetz 06 August 2009 04:17PM




Comment author: PhilGoetz 06 August 2009 09:14:57PM 1 point

It would be more helpful if you explained why each of the many reasons I gave is insensible.

Comment author: homunq 07 August 2009 10:32:01AM 3 points

When arguing about the future, the imaginable is not all there is. You essentially gave several imaginable futures (some in which risks continue to arise, and others in which they do not) and did some handwaving about which class you considered likely to be larger. There are three ways to dispute this:

1. Dispute the handwaving itself (e.g., you treat compression of subjective time as a conclusive argument, as if it were inevitable).

2. Propose classes of future you did not consider (e.g., technology continues to advance, but some immutable law of the universe means there are only a finite number of apocalyptic technologies).

3. Maintain that there are large classes of future which cannot possibly be imagined, because they do not clearly fall into any categories we are likely to define in the present.

If you take the third line of dispute, arguing about probability is just arguing about which uninformative prior to use.

Comment author: PhilGoetz 08 August 2009 03:59:34AM *  0 points

I'm not pretending this is an airtight case. If you previously assumed that existential threats converge to zero as rationality increases, that rationality is always the best policy, or that rationality means expectation maximization, and you now question one of those assumptions, then you've gotten something out of it.

Comment author: torekp 08 August 2009 01:36:50AM 0 points

homunq suggests that there may be immutable laws of the universe that mean there are only a finite number of apocalyptic technologies. Note that even if the probability of such technological limits is small, for Phil's argument to work, either that probability would have to be infinitesimal, or some of the doomsday devices would have to remain threatening even after the various attack/defense strategies reach a very mature level of development. All of the probabilities look finite to me.

Comment author: PhilGoetz 08 August 2009 04:01:23AM 0 points

No; that probability concerns a property of the universe, so it is a one-shot trial. The proposition only has to come out false once, out of one trial.

Comment author: torekp 08 August 2009 01:07:22PM 1 point

So your thesis is not that rationality dooms civilization, but only that as far as we know, it might. I get it now.
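The probability argument in this subthread can be sketched numerically. This is an illustrative aside, not part of the original discussion: the per-technology risk `p`, the number of technologies `N`, and the probability `q` of a technological limit are all made-up numbers chosen only to show the shape of the argument.

```python
# Illustrative sketch: a fixed per-technology extinction risk compounds
# toward certain doom, while a one-shot property of the universe
# (torekp's "technological limits") keeps long-run survival bounded
# away from zero. All numbers below are made-up assumptions.

def survival_prob(per_tech_risk, n_techs):
    """Probability of surviving n independent apocalyptic
    technologies, each posing the same extinction risk."""
    return (1 - per_tech_risk) ** n_techs

# If apocalyptic technologies keep arriving forever, survival decays
# toward zero no matter how small the per-technology risk:
print(survival_prob(0.01, 10))    # ~0.90
print(survival_prob(0.01, 1000))  # ~4.3e-05

# But suppose that, with probability q, the universe permits only N
# such technologies. That hypothesis is a single "trial": it either
# holds or it doesn't, once. Long-run survival is then bounded below
# by q * (1 - p)^N, a fixed positive number.
q, p, N = 0.1, 0.01, 20
print(q * survival_prob(p, N))    # ~0.082
```

This is the sense in which the thesis reduces to "as far as we know, it might": unless the per-technology risk is driven to zero, everything hangs on the one-shot hypothesis.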

Comment author: timtyler 07 August 2009 05:40:15PM -2 points

You talk as if you have presented a credible case - but you really haven't.

Instead there is a fantasy about making black holes explode (references, please!), another fantasy about subjective time compression outstripping expansion, and a story about disasters triggering other disasters - which is true, but falls a long way short of a credible argument that civilisation is likely to be wiped out once it has spread out a bit.

Comment author: PhilGoetz 08 August 2009 03:51:01AM *  0 points

Instead there is a fantasy about making black holes explode (references, please!)

You have me there. We have not yet successfully detonated a black hole.

Small black holes are expected to eventually explode. Large black holes are expected to take longer than the expected life of the universe to evaporate to that point.
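For a rough sense of the timescales being claimed here, the standard Hawking evaporation estimate, t ≈ 5120·π·G²·M³/(ħc⁴), can be evaluated directly. This is my illustrative aside, not part of the thread; the two masses below are chosen only to show the contrast between stellar-mass and small primordial black holes.

```python
# Back-of-the-envelope sketch: Hawking evaporation time scales as M^3,
# which is why small black holes eventually "explode" while large ones
# outlast the universe.
import math

# Physical constants (SI units)
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.055e-34   # reduced Planck constant, J s
C    = 2.998e8     # speed of light, m/s
YEAR = 3.156e7     # seconds per year (approx.)

def evaporation_time_years(mass_kg):
    """Approximate Hawking evaporation time in years:
    t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4)."""
    t_seconds = 5120 * math.pi * G**2 * mass_kg**3 / (HBAR * C**4)
    return t_seconds / YEAR

# A solar-mass black hole takes on the order of 1e67 years to
# evaporate, vastly longer than the ~1.4e10-year age of the universe:
print(f"{evaporation_time_years(1.989e30):.1e}")

# A ~2e11 kg primordial black hole, by contrast, evaporates on a
# timescale comparable to the current age of the universe:
print(f"{evaporation_time_years(2e11):.1e}")
```

The cubic dependence on mass is the whole story: the black hole at the galactic center is some 36 orders of magnitude heavier than a primordial black hole evaporating today, so its lifetime is longer by roughly a factor of 1e108.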

Anyway, I'm not a physicist. It's just a handwavy example that maybe there is some technology with solar-scale or galaxy-scale destructive power. When all the humans lived on one island, they didn't imagine they could one day destroy the Earth.

Comment author: DBseeker 11 August 2009 02:52:48AM 2 points

Anyway, I'm not a physicist. It's just a handwavy example that maybe there is some technology with solar-scale or galaxy-scale destructive power.

Then the example is pointless. A weapon powerful enough to cause galaxy-wide extinction is a very big if. It's unlikely one could exist, simply because of the massive distances between stars.

Also, if you base your argument (or part of it, anyway) on such an event, it is equally fair to consider "if not". And in the case of "if not" (which I imagine to be far more likely), the argument must end there.

Therefore, it is reasonable to assume that yes, we could outrun our own destructive tendencies.

When all the humans lived on one island, they didn't imagine they could one day destroy the Earth.

At that point in our evolution we had no firm grasp of what "world" even meant, let alone a basic understanding of scale. Now, we do. We also have a basic understanding of the universe, and a method for increasing our understanding (the ability to postulate theories, run experiments, and collect evidence). When all humans (more likely an ancestral species) were confined to one geographic location, none of these things even existed as concepts. There are a few more problems with this comparison, but I'll leave them alone for now, as nothing here depends on bringing them out.

Comment author: timtyler 11 August 2009 09:18:22PM -1 points

I wasn't asking for references supporting the idea that we had detonated a black hole. It's an incredible weapon, which seems to have a low probability of existing - based on what we know about physics. The black hole at the center of our galaxy is not going to go away any time soon.

Bizarre future speculations which defy the known laws of physics don't add much to your case.