Nobel laureate Marie Curie died of aplastic anemia, the victim of radiation from the many fascinating glowing substances she had learned to isolate.
How could she have known? And the answer, as far as I can tell, is that she couldn't. The only way she could have avoided death was by being too scared of anything new to go near it. Would banning physics experiments have saved Curie from herself?
But far more than one cancer patient has since been saved by radiation therapy. And the real cost of banning physics is not just losing that one experiment - it's losing physics. No more Industrial Revolution.
Some of us fall, and the human species carries on, and advances; our modern world is built on the backs, and sometimes the bodies, of people who took risks. My father is fond of saying that if the automobile were invented nowadays, the saddle industry would arrange to have it outlawed.
But what if the laws of physics had been different from what they are? What if Curie, by isolating and purifying the glowy stuff, had caused something akin to a fission chain reaction gone critical... which, the laws of physics being different, had ignited the atmosphere or produced a strangelet?
At the recent Global Catastrophic Risks conference, someone proposed a policy prescription which, I argued, amounted to a ban on all physics experiments involving the production of novel physical situations - as opposed to measuring existing phenomena. You can weigh a rock, but you can't purify radium, and you can't even expose the rock to X-rays unless you can show that exactly similar X-rays hit rocks all the time. So the Large Hadron Collider, which produces collisions as energetic as cosmic rays, but not exactly the same as cosmic rays, would be off the menu.
After all, whenever you do something new, even if you calculate that everything is safe, there is surely some probability of being mistaken in the calculation - right?
The one who proposed the policy disagreed that it cashed out to a blanket ban on physics experiments, and that discussion is still in progress, so I won't say more here about their policy argument.
But if you consider the policy of "Ban Physics", and leave aside its total political infeasibility, I think the strongest way to frame the issue - from the pro-ban viewpoint - would be as follows:
Suppose that Tegmark's Level IV Multiverse is real - that all possible mathematical objects, including all possible physical universes with all possible laws of physics, exist. (Perhaps anthropically weighted by their simplicity.)
Somewhere in Tegmark's Level IV Multiverse, then, there have undoubtedly been cases where intelligence arises in a universe with physics unlike this one - e.g., instead of on a planet, life arises on a gigantic triangular plate hanging suspended in the void - and that intelligence accidentally destroys its world, perhaps its universe, in the course of a physics experiment.
Maybe they experiment with alchemy, bring together some combination of substances that were never brought together before, and catalyze a change in their atmosphere. Or maybe they manage to break their triangular plate, whose pieces fall and break other triangular plates.
So, across the whole of the Tegmark Level IV multiverse - containing all possible physical universes with all laws of physics, weighted by the laws' simplicity:
What fraction of the sentient species that try to follow the policy "Ban all physics experiments involving situations with a remote possibility of being novel, until you can augment your own intelligence enough to do error-free cognition",
And what fraction of the sentient species that go ahead and do physics experiments,
Survive in the long term, on average?
In the case of the human species, trying to ban chemistry would hardly have been effective - but supposing that a species actually could make a collective decision like that, it's at least not clear-cut which fraction would be larger across the whole multiverse. (We, in our universe, have already learned that you can't easily destroy the world with alchemy.)
Or an even tougher question: On average, across the multiverse, do you think you would advise an intelligent species to stop performing novel physics experiments during the interval after it figures out how to build transistors and before it builds AI?