Note: This post is almost completely tongue-in-cheek. Obviously the chances of December 21, 2012 ushering in an apocalypse, defined loosely as an event causing billions of deaths and/or catastrophic global infrastructure damage, are slim to none.

But they aren't actually none.

Let's say there's a 5% chance of a superhuman General AI being developed in the next 10 years and ushering in the singularity. Let's say 4/5 of those scenarios would lead to a Bad End which could reasonably be called an apocalypse (or an "AI-pocalypse", perhaps). And let's say the probability isn't distributed uniformly over time, but is exponentially skewed somehow so that, given the emergence of AI sometime in the next 120 months, the chance of it happening in this very month is 1 in 100,000. Then let's divide that by 30 so we can get our apocalypse rolling on the right day.

5/100 × 4/5 × 1/100,000 × 1/30 ≈ 1.33 × 10⁻⁸, or about 1 in 75,000,000
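For anyone who wants to check the arithmetic, here's a minimal sketch in Python; the variable names are just my labels for the assumptions above, not established figures.

```python
# Fermi estimate for an AI-driven apocalypse on 21 December 2012,
# using the post's own (entirely made-up) numbers.
p_agi_decade = 5 / 100      # superhuman AGI within the next 10 years
p_bad_end = 4 / 5           # fraction of AGI scenarios that end badly
p_this_month = 1 / 100_000  # skewed odds it happens in this very month
p_this_day = 1 / 30         # pick the one day in question

p_doom = p_agi_decade * p_bad_end * p_this_month * p_this_day
print(f"P(doom) ~ {p_doom:.2e}, i.e. about 1 in {1 / p_doom:,.0f}")
# P(doom) ~ 1.33e-08, i.e. about 1 in 75,000,000
```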

Admittedly, those are low odds. This comports with the fact that a great deal of development progress, unknown to us, would already have to be underway for an intelligence explosion to be anywhere on the horizon.

But compared to some of the scenarios debunked by NASA (http://www.space.com/18678-2012-mayan-apocalypse-fears-nasa.html), such as a collision with the rogue planet Nibiru, or Earth being sucked into the supermassive black hole at the center of the Milky Way, some 30,000 light-years away, the AI-doomsday scenario starts to seem relatively plausible.

I think the only other (relatively) plausible contenders would be the release of a pandemic-causing biological weapon, or the start of an international nuclear war. I haven't done any Fermi calculations on those, but I'm sure their probability exceeds that of solar flares scourging the surface of the Earth.

22 comments

If doomsday is inevitable, it is interesting to ask which we would prefer: to die from UFAI or from a biological-weapons catastrophe. I prefer UFAI, because:

It would still be an intelligence that will explore the universe.

Even if it destroys humanity, it will share some of our values.

It could resurrect humanity in simulation, and most likely would do so many times, in order to study the chances of its own existence and the frequency of AI in the universe.

It could kill us quickly and without needless pain.

:(

This would be like the passenger pigeon and the dodo rooting for humanity in a war against space aliens.

I feel the same way. I see FAI as an attempt to cheat evolution. But I would still root for the uAI from our planet to win over the other uAIs, in the same sense that I root for my daughter's volleyball team and refer to their opponents du jour as "the bad guys."

Machine intelligence would still be evolution. Evolution, as usually defined, is change in the frequency of heritable information over time. It would be a genetic takeover, but the planet has probably seen those before.

I dunno, if we all die from superAIDS, intelligent life will evolve on Earth again, and share more of our values than a catastrophic AI.

Is it probable for intelligent life to evolve?

If we assume primates and other intelligent social mammals continue to exist, then yes; the transition from their level to human level is minor compared to the steps needed to get that far in the first place.

and share more of our values than a catastrophic AI

How do you know?

It could kill us quickly and without needless pain.

Why would it care to avoid inflicting pain? If it finds that extreme mental anguish and/or physical distress makes humans flail and screech in curious ways, it would have no reason not to repeat the observations over and over.

There are many FAI failure modes that don't involve gratuitous torture. I think "could" is justified here, especially in comparison to a bioweapon catastrophe.

Right, there are plenty of failure modes, some less unpleasant than others, some are probably horrific beyond our worst nightmares. I suspect that any particular set of scenarios that we find comforting would have measure zero in the space of possible outcomes. If so, preferring death by AI over death by a bioweapon is but a failure of imagination.

It doesn't take much comfort to beat a bioweapon that actually succeeds in killing everyone.

Simply using our atoms to make paperclips, and being quick about it, wins.

When it comes to big disasters, volcanoes are a pretty frequent culprit. The Yellowstone supervolcano doesn't seem likely to erupt any time soon, but it could really wreck most of North America at some point in the next few hundred thousand years.

I have a tongue and a cheek also.

I think the chances of the world ending on 21 December 2012 are LESS than of it ending on any other day, because having it end on that day would be so stupid. And I am willing to put my money where my mouth is: I will bet anybody who is interested, at 1,000,000:1 odds, that the world does not end on 21 December 2012. Record your bet with me in these comments. If the world ends on 21 December, you win 1,000,000 times what you bet. If the world doesn't end, then on 22 December please pay off your bet with me by PayPal to mwengler@gmail.com.

Jokes aside, it might be more probable. If the population is primed to panic, minor events could escalate far more than they otherwise would.

How about I take the opposite bet: it is at least as likely that the world will end on December 22nd as on December 21st. If the world fails to end on either day, or ends on both days, you owe me one percent of your winnings from this bet; if the world ends on the 22nd and doesn't end on the 21st, you owe me the total proceeds of your winnings. If the world ends on the 21st and not on the 22nd, I will match your winnings from this bet.

Deal?

What if you give me 1,000,000 X now, and on the 22nd, if the world hasn't ended, I give you 1,000,001 X (adjusted for inflation) back?

I think that the most likely doomsday scenario would be somebody/group/thing looking to take advantage of the notability of the day itself to launch some sort of attack. Many people would be more likely to panic, and others would initially be suspicious of reports of disasters. The system would be less able to deal effectively with threats. It might represent the best chance for an attacker to start WW3.

p ≈ 0.00001: Israel and Palestine enter into open war, nuclear and/or biological weapons are used by the belligerents, and third parties escalate with each other; perhaps Iran retaliates against Israel and then a NATO country retaliates against Iran. I don't think such an event would stop human life, but I think it might stop civilization as we know it.

As there seems to be no evidence pointing to deadly diseases, AI, or anything else on that particular day, I'm of the opinion that the closest thing to an apocalypse will be rioting caused by fear. The Earth won't explode or anything (most likely), but I'll still be staying indoors to avoid the potential insanity. It'll be just like Friday the 13th, only bigger.

I don't think there will be significant amounts of insanity. There might be a few suicides, but no public hazard.

My 1 minus epsilon prediction is that there will be gloating skeptics on 12/22.

What's your 50% prediction about the number of gloating skeptics?