TL;DR:

As with any other global risk, trying to prevent the AGI cataclysm is not enough. If there is a high risk that the cataclysm will happen, we should also start thinking about how to survive it. There are realistic ways to survive.

Compare: a wise person works on preventing nuclear war, but also builds a bunker in case the war happens anyway.

The survival mindset

This post will bring some hope. But let's start with some bad news: 

If you’re a human, you’re going to die anyway: humans die of old age, cancer, wars, and so on.

If you live long enough, you may see effective anti-aging therapies, mind uploading, and so on. But currently, death is the default.

The imminent AGI cataclysm doesn’t change the situation much. It doesn’t matter whether it’s the AGI or cancer that kills you.

After realizing this simple truth, many humans give up. But some fight back.

The ones who fight back have a chance to survive.

For example, there is a chance that cryonics will work. Thus, even if you’re doomed to die from cancer, you still have a chance to survive. But dying patients who reject cryonics have no chance.

Another example: a nuclear war. Instead of giving up, you build a bunker, move to New Zealand, or find another way to improve your chances of survival, no matter how small those chances are.

The same general approach is helpful in every “hopeless” situation: 

even if death is inevitable, you never give up. Instead, you search for ways to trick death.

There is no such thing as “death with dignity”. You live forever or die trying. 

AGI is not so different from a global nuclear war

Imagine that in X years there will be an AGI that will disassemble you into useful atoms.

Obviously, you should try to prevent that from happening. For example, by helping with alignment research.

But even if you’re certain that an unaligned AGI is absolutely inevitable, you should keep the same survival mindset.

If nuclear war is inevitable, prepare yourself. If the AGI cataclysm is inevitable, prepare yourself.

Types of AGI hostility

The reasons why an AGI may try to kill you depend on its level of hostility:

1. No hostility: you’re made of useful atoms that it can use for something else.

2. Hostility against selected humans: you in particular and your pals are a threat to its survival.

3. Hostility against a large group of humans: e.g. the AGI was tasked to perpetrate a genocide.

4. Hostility against all humans: the AGI may view all humans as a threat by default.

Let’s list some ways to trick death in each scenario.

Surviving the “no hostility” AGI

OK, so the AGI has decided to disassemble the Earth to build more paperclips or something. How do you survive in this situation?

The best way to do that is to be as far from the AGI as possible. Depending on your AGI timelines, a Mars colony may help with that. There is a non-zero chance that the AGI will eventually stop making paperclips from your planet (e.g. because it has found a more efficient way to make them).

Another hypothetical option is to upload your mind and transmit it in the direction of the nearest habitable exoplanets. Again, the availability of this option depends on the AGI timeline.

It's also reasonable to try to make backups of our civilization, as the disassembly could ruin much of it. Even if the AGI kills billions, the surviving civilization could eventually bring them back to life.

A short-term option is to buy a good boat. The ocean is likely to be a low-priority source of atoms for any such AGI. A similar solution is to move to a place that doesn’t have a lot of ores or other sources of rare metals.

Surviving the hostility against selected humans

In this scenario, the best option is to be as harmless to the AGI as possible.

For a sufficiently advanced AGI, the only real threat is another AGI. So, if you’re not trying to build an AGI, you’re likely to be spared. A god will not bother with killing ants.

But you may find yourself in a situation where every single GPU and CPU on Earth has been deep-fried for some reason, which will cause a lot of problems in a society that depends on that tech.

In general, even if you survive an AGI cataclysm, many others may not be so lucky. It makes sense to prepare for a large-scale societal collapse.

Surviving the hostility against a large group of humans

If the first AGI is created in North Korea, and you happen to be an American, you may eventually find yourself in a situation where millions of self-replicating drones are trying to kill all your compatriots in the name of the Supreme Leader and The Only True Ideology of Juche.

While the scenario may sound ridiculous today, it’s but the logical conclusion of the progress of weapons of mass destruction. If North Korea has managed to build nuclear ICBMs, maybe it is capable of writing some software too. And besides North Korea, there are China, Russia, Iran...

One possible solution is to move to a neutral country, especially if some technologically advanced dictatorship has already declared its intent to destroy your home country. This could also help against a few other risks, including nuclear war.

Surviving the hostility against all humans

Why would an AGI *want* to kill all humans?

The most likely reason is that the AGI perceives humans in general as a threat.

Unless the AGI is delusional, it will view humans as a threat only in the first few seconds of its existence. After the AGI has replicated itself across the interwebs and disabled all development of other AGIs, humans objectively cease to be a threat, as they can do nothing to stop it.

In this scenario, the most reasonable course of action is to welcome our new digital overlord, and then try to adapt to the new world He will create.

Conclusion

Obviously, we should try to prevent the AGI cataclysm by working on the alignment problem.

But if there is a high probability that the cataclysm will happen, we should:

  • never give up
  • make some preparations to increase our chances of survival (they are never zero); see the sketch below.

Comments

If quantum immortality is true, you will survive in the worlds where the AI does not kill everybody immediately. This could happen in several cases:

  • AI is friendly.
  • AI keeps humans for some instrumental purpose.
  • AI is hostile and is going to torture you. 
  • AI doesn't care about humans and isn't interested in their atoms, and they may die later because of some environmental degradation.
  • You are a lone survivor whom the AI is unable to kill or has failed to kill.

Which is most probable?

Disagree that being harmless is enough (in the serious unaligned takeover scenarios).  You have to be valuable, or at least more costly to kill than the resources you will divert from the AI's goals. 

“A god will not bother with killing ants.”

Humans kill a LOT of ants. I personally try to kill any that I notice near me.

Bad environmental effects could also kill humans. For example, the AI starts sending millions of missiles into space, and the atmosphere gets polluted with their exhaust.