Another possible resolution of the Fermi paradox, based on the many-worlds interpretation of quantum mechanics:
Let us assume that advanced civilizations find overwhelming evidence for the many-worlds hypothesis as the true, infallible theory of physics. Additionally, assume there is a quantum-mechanical process with a huge payoff at a very small probability: the equivalent of a cosmic lottery, where the chance of obliteration is close to 1, the chance of winning is close to zero, but the payoff is enormous. It is like entering a room where you win a billion dollars with probability 1/1,000,000 and die a sudden, painless death with probability 999,999/1,000,000. Still, if the many-worlds hypothesis is true, you are guaranteed to experience the winning: in every branch where you remain conscious, you have won.
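The arithmetic of the lottery analogy can be made concrete. A minimal sketch, using the illustrative numbers from the text, contrasts the classical expected value with the survival-conditioned ("quantum suicide") valuation:

```python
# Illustrative numbers from the lottery analogy above.
p_win = 1 / 1_000_000        # branch weight in which you survive and win
payoff = 1_000_000_000       # the huge payoff in those branches

# Classical expected value: the mean over all branches,
# counting the dead branches as zero.
expected_value = p_win * payoff          # classical mean payoff, about 1000

# Survival-conditioned view: condition on the branches in which
# there is still an observer left to experience anything.
value_given_survival = payoff            # experienced with subjective certainty

print(expected_value, value_given_survival)
```

The entire argument turns on which of these two quantities a civilization treats as decision-relevant.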
Now imagine that at some point in its existence, every very advanced civilization faces the decision to make the leap of faith in the many-worlds interpretation: start the machine that obliterates them in almost every branch of the Everett multiverse, while letting them live on in a few branches with a vastly increased amount of resources (energy, computronium, whatever). Since they know that their only subjective experience will be of receiving the payoff at a seemingly negligible risk, they will choose to funnel themselves down into a few much narrower Everett branches.
For any outside civilization, however, this would mean that they simply vanish from its branch of the universe with very high probability. Since every advanced civilization would face the same extremely seductive way of gaining cheap resources, the probability that two of them share the same branch becomes infinitesimally small.
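A toy calculation makes the co-occurrence claim explicit. Assuming (purely for illustration) that each civilization persists in only a fraction `f` of branches, and treating the lotteries as independent, the branch weight containing several surviving civilizations shrinks geometrically:

```python
# Assumed fraction of Everett branches in which one civilization
# survives its lottery. The value is purely illustrative.
f = 1e-6

# Independence assumption: the weight of branches in which two
# civilizations both persist is f squared.
both_survive = f ** 2        # about 1e-12

# With n civilizations all taking the gamble, branches containing
# all of them become vanishingly rare.
n = 10
all_survive = f ** n         # about 1e-60

print(both_survive, all_survive)
```

Under these assumptions, almost no branch contains two post-lottery civilizations, which is the proposed resolution of the paradox.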
I admit that your analysis is quite convincing, but I will play the devil's advocate just for fun:
1) We see a lot of cataclysmic events in our universe whose sources are at least uncertain. It is definitely possible that some of them originate from super-advanced civilizations going up in flames (whether through accidents or deliberate effort).
2) Maybe the minority that does not approve of funneling down the narrow branch is even less inclined to witness the spectacular death of the elite and live on in a resource-exhausted section of the universe, and therefore decides to play along.
3) Even if a small, risk-averse minority of the civilization is left behind, once it regains a certain size, a large part of it will again decide to go down the narrow path, so it won't grow significantly over time.
4) If the minority becomes extremely conservative and risk-averse (due to selection over some iterations of point 3), that necessarily means it has also lost its ambition to colonize the galaxy. It will stagnate across a few star systems and try to hide from other civilizations to avoid any possible conflict, so we would have difficulty detecting it.
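The dynamic in point 3 can be sketched as a simple growth-and-culling simulation; every number here is an illustrative assumption, not a claim about real civilizations:

```python
# Point 3 as a toy model: the left-behind minority grows, but whenever
# it regains a threshold size, a large fraction again takes the gamble
# and (from this branch's viewpoint) vanishes.
growth_rate = 1.05       # assumed per-epoch growth of the remaining population
threshold = 1000.0       # assumed size at which the gamble becomes attractive again
cull_fraction = 0.99     # assumed share that departs down the narrow branch

pop = 100.0
history = []
for epoch in range(500):
    pop *= growth_rate
    if pop >= threshold:
        pop *= 1.0 - cull_fraction
    history.append(pop)

# The population oscillates below the threshold instead of
# growing without bound.
print(max(history) < threshold)  # True
```

The point of the sketch is only that repeated culling caps the population, so the stay-behind minority never becomes galactically visible.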
Good points. However: (1) Most of the cataclysms we see are either fairly explicable (supernovae) or seem to occur only at remote points in spacetime, early in the evolution of the universe, when the emergence of intelligent life would have been very unlikely. Quasars and gamma-ray bursts cannot plausibly be industrial accidents, in my opinion, and supernovae need not be industrial accidents.
(2) Possible, but I can still imagine large civilizations of people whose utility function is weighted such that "99.9999% death plus 0.0001% superman" is inferior to "continued mortal existence."
(3) Again possible, but there will be a selection effect over time. Eventually, the remaining people (who, you will notice, live in a universe where people who try to ascend to godhood always die) will no longer think ascending to godhood is a good idea. Maybe the ancients were right and there really is a small chance that the ascent process works and doesn't kill you, but you have never seen it work, and you have seen your civilization nearly exterminated by the power-hungry fools who tried it the last ten times.
At what point do you decide that it's more likely that the ancients did the math wrong and the procedure just flat out does not work?
(4) The minority might have no problem with risks that do not have a track record of killing everybody. However, you have a point: a rational civilization that expects the galaxy to be heavily populated might be well advised to hide.