A TIME article published recently calls for an “indefinite and worldwide” moratorium on new large AI training runs.
This moratorium would be better than no moratorium. I have respect for its author. It's an improvement on the margin.
I refrained from endorsing the essay because I think it understates the seriousness of the situation and asks for too little to solve it.
If there were a plan for Earth to survive, if only we passed an indefinite and worldwide moratorium on large training runs, I would back that plan. There isn't any such plan.
Here’s what would actually need to be done:
All human technology needs to be destroyed. There can be no exceptions, including for sharpened stones and hand axes. After everything is burned, we must then forget how to create fire. If a single exception is made, that increases the probability that civilization will be recreated within the next millennium and new large AI training runs will be started. If I had infinite freedom to write laws, I might carve out a single exception for technologies that prevent human diseases, like knowledge of germ theory; but if that were remotely complicating the issue, I would immediately jettison the proposal and say to just shut it all down.
Shut down all the roads, melt all the cars. Burn down all of the tables and all of the books. Put a ceiling on how many calories of food any single human can consume per day, and move it downward over the coming generations to compensate for the possibility that natural selection will keep making humans smarter. No exceptions for cloth or fireplaces. Destroy all human objects now to prevent them from being moved to another country. Track all gazelles that are hunted. If anyone notices that a tribe is catching more gazelles than it should, be willing to choke them (with your bare hands, of course) one by one.
Shut it all down. Eliminate all technology. Dismantle modern civilization. Return to our primeval state.
We are not ready. We are not on track to be significantly readier any time in the next million years. If we go ahead on this, everyone will suffer, including children who did not choose this and did not do anything wrong.
Shut it down.
The fact that we are breathing, the fact that we simply exist, is proof that we will not die from this. Because to insist we will all die from this means insisting we'd be the first in the whole universe to unleash killer AI on the rest of the universe.
Because you can't conveniently kneecap the potential of an AI that kills us all, imagining it somehow slows down afterward and never discovers interstellar travel, never retrofits factories to make a trillion spaceships to go to every corner of the universe and kill all life.
To accept the AI Armageddon argument, you basically also have to own the belief that we are alone in the universe, or that we are the most advanced civilization in it and there are no aliens, Roswell never happened, etc.
We're literally the first to cook up killer AI. Unless there are a million other killer AIs from a million other galaxies on the other side of the universe, and they just haven't spread here yet in the millions of years they've had time to.
Are we really going to be so arrogant as to say that there's no way any civilization in this galaxy or the nearby galaxies is more advanced than us? Even just 100 years more advanced? Because that's probably how long it would take, post-singularity, for a killer AI to conceive of advanced forms of interstellar travel that we could never dream of and dispatch itself to our solar system.
And I don't even want to hazard a guess at what a super AGI would cook up to replace those earliest forms of interstellar travel, a thousand years after it first started heading out beyond its solar system.
Even if we've only got a 10% chance of AI killing us all, that's the same math by which 1 out of every 10 alien civilizations is knowingly or unknowingly unleashing killer AI on the rest of the universe. And yet we're still standing.
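To spell that arithmetic out, here is a minimal sketch; the 10% per-civilization risk is the comment's own figure, while the civilization counts N are illustrative assumptions, not anything claimed in the comment:

```python
# Minimal sketch of the comment's arithmetic. The 10% per-civilization
# risk is the comment's figure; the counts N are illustrative assumptions.
p = 0.10  # assumed chance each civilization unleashes killer AI

for n in (10, 100, 1_000):
    p_none = (1 - p) ** n  # chance that none of n civilizations has done so
    print(f"N={n:>5}: P(no killer AI anywhere yet) = {p_none:.2e}")
```

Under those assumptions, the odds that no civilization has ever unleashed a killer AI collapse toward zero as N grows, which is exactly the tension the comment is pointing at.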
It's not happening. Either because of divine intervention, or some otherworldly entities that intervene in civilizations' technology before it gets to the point of endangering the rest of the universe, or because we are simply discounting the potential for AI to align itself.
I might be able to accept the premise of AI Armageddon if I didn't also have to accept the bad math of us being alone in the universe or being the most advanced civilization out there.
Right, plus think on a solar-system or galaxy-level scale.
Now consider that properly keeping humans alive - in a way that is actually competent, not the scam life support humans offer now - involves separating the brain from the body and keeping it alive and in perfect health essentially forever, using nanotechnology to replace all other organ functions, etc. The human would experience the world via VR or remote surrogates.
This would cost something like 10 kg of matter per human with plausible limit-level tech. They can't breed, so it's 80 billion times 10 kg...
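For scale, a quick sketch of that multiplication, using only the comment's own figures (80 billion humans, 10 kg each):

```python
# Spelling out the comment's figures: 80 billion humans at roughly
# 10 kg of matter each (both numbers are the comment's, not measured).
humans = 80e9      # 80 billion, per the comment
kg_each = 10       # assumed matter budget per preserved brain
total_kg = humans * kg_each
print(f"{total_kg:.0e} kg (~{total_kg / 1e3:.0e} tonnes)")  # 8e+11 kg, 8e+08 tonnes
```

Taken at face value, that comes to about 8 × 10^11 kg, roughly 800 million tonnes, on the order of half a year of current global steel production.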