Vladimir_Nesov comments on Sarah Connor and Existential Risk - Less Wrong

-9 [deleted] 01 May 2011 06:28PM




Comment author: fubarobfusco 01 May 2011 10:50:56PM *  -1 points [-]

Sure, but the CIA also classifies "leading a peaceful, democratic political uprising" as grounds for violence, so they're not a very good guide.

More seriously: Today there are probably dozens or hundreds of processes going on that, if left unchecked, could lead to the destruction of the world and all that you and I value. Some of these are entirely mindless. I'm rather confident that somewhere in the solar system is an orbiting asteroid that will, if not deflected, eventually crash into the Earth and destroy all life as we know it. Everyone who is proceeding with their lives in ignorance of that fact is thereby participating in a process which, if unchecked, leads to the destruction of the world and all that is good. I hope that we agree that this belief does not justify killing people who oppose the funding of anti-asteroid defense.

But if you are seriously ready to kill someone who has her finger poised above the "on" switch of an unfriendly AGI (which is to say, an AGI that you believe is not sufficiently proven to be Friendly), then you are very likely susceptible to a rather trivial dead man's switch. The uFAI creator merely needs to be sufficiently confident in their AI's positive utility that they are willing to set it up to activate if they (the creator) are killed. Then, your readiness to kill is subverted. And ultimately, a person who is clever enough to create uFAI is clever enough to rig any number of nth-order dead man's switches if they really think they are justified in doing so.

Which means, in the limit case, that you're reduced to either (1) going on a massacre of everyone involved in AI, machine learning, or related fields; or (2) resorting to convincing people of your views and concerns rather than threatening them.

Comment author: Vladimir_Nesov 01 May 2011 10:55:49PM *  2 points [-]

I'm rather confident that somewhere in the solar system is an orbiting asteroid that will, if not deflected, eventually crash into the Earth and destroy all life as we know it.

Huh? Downvoted for sloppy reasoning. This most likely won't happen on the timescale where "life as we know it" continues to exist.

Comment author: JoshuaZ 01 May 2011 11:55:52PM 2 points [-]

This most likely won't happen on the timescale where "life as we know it" continues to exist.

The Chicxulub asteroid impact did wipe out almost all non-ocean life. That asteroid was 8-12 km across. It is estimated that an impact of that size happens every few hundred million years. So this claim seems inaccurate. On the other hand, the WISE survey results strongly suggest that no severe asteroid impacts are likely in the next few hundred years.
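The impact-frequency claim can be turned into rough numbers. As a sketch, suppose Chicxulub-scale impacts arrive as a Poisson process with a mean interval of 100 million years (an illustrative assumption, not a figure anyone in the thread commits to):

```python
import math

def impact_probability(years, mean_interval=100e6):
    """P(at least one Chicxulub-scale impact within `years`), modeling
    impacts as a Poisson process with the given mean interval (years)."""
    return 1 - math.exp(-years / mean_interval)

# Over a few hundred years the risk is negligible (roughly 3 in a million)...
print(impact_probability(300))
# ...but over hundreds of millions of years it approaches certainty.
print(impact_probability(500e6))
```

This is exactly the shape of the disagreement below: the probability is tiny on any timescale humans plan over, and near 1 on geological timescales.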

Comment author: wedrifid 02 May 2011 12:18:18AM 0 points [-]

It is estimated that an impact of that size happens every few hundred million years. So this claim seems inaccurate.

Only if you expect life as we know it to last on the order of a few hundred million years. The probability of that happening is too low for me to even put a number to it.

Comment author: fubarobfusco 01 May 2011 11:11:15PM *  1 point [-]

Would you mind posting your reasoning, instead of just posting your conclusions and an insult?

I should clarify that I was intending to set some sort of boundary condition on the possible futures of life on earth, rather than predicting a specific end to it: If life comes to no other end, at the very least, eventually we'll get asteroided if we stay here. This by itself does not justify killing people in a fight for asteroid-prevention; so what would justify killing people?

Comment author: wedrifid 02 May 2011 12:24:52AM 4 points [-]

Would you mind posting your reasoning

Timescale of life as we know it continuing to exist: Short
Timescale of killer asteroids hitting earth: Long

Comment author: JoshuaZ 02 May 2011 01:00:23AM 1 point [-]

Are we running into definitional issues over what we mean by "life as we know it"? That term has some degree of ambiguity that may be creating the problem.

Comment author: wedrifid 02 May 2011 02:39:49AM 0 points [-]

Are we running into definitional issues over what we mean by "life as we know it"? That term has some degree of ambiguity that may be creating the problem.

Quite possibly. Although one feature of 'life as we know it' that will not survive for hundreds of millions of years is living exclusively on Earth. So the disagreement would remain regardless of the definition.