So your entire argument boils down to another person who thinks transcension is universally convergent and this is the solution to the Fermi paradox?
No... As I said above, even if transcension is possible, that doesn't preclude some expansion. You'd only get zero expansion if transcension is really easy/fast. On the convergence issue, we should expect that the main developmental outcomes are strongly convergent. Transcension is instrumentally convergent - it helps with any realistic goals.
I don't see what your reversible computing detour adds to the discussion, if you can't show that making only a few cold brains sans any sort of cosmic engineering is universally convergent.
The reversible computing stuff is important for modeling the structure of advanced civs. Even in transcension models you need enormous computation - and everything you could do with new universe creation is entirely compute limited. Understanding the limits of computing is important for predicting what end-tech computation looks like in both transcend and expand models. (For example, if end-tech optimal compute were energy limited, that would predict dyson spheres to harvest solar energy.)
The temperatures implied by 10,000x energy density on earth preclude all life or any interesting computation.
I never said anything about using biology or leaving the Earth intact. I said quite the opposite.
Advanced computation doesn't happen at those temperatures, for the same basic reasons that advanced communication doesn't work at extremely low signal-to-noise ratios. I was trying to illustrate the connection between energy flow and temperature.
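The flux-temperature connection can be made concrete with a two-line Stefan-Boltzmann estimate (a sketch; the 240 W/m^2 figure for Earth's mean absorbed solar flux is my assumption, not a number from the thread):

```python
# Minimal sketch: blackbody radiative equilibrium via the
# Stefan-Boltzmann law, emitted power per area = SIGMA * T^4.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def equilibrium_temp(flux_w_m2):
    """Temperature at which emitted radiation balances absorbed flux."""
    return (flux_w_m2 / SIGMA) ** 0.25

earth_flux = 240.0                              # W/m^2, ~Earth's mean absorbed flux
t_earth = equilibrium_temp(earth_flux)          # ~255 K
t_hot = equilibrium_temp(10_000 * earth_flux)   # ~2550 K

# T scales as flux^(1/4), so 10,000x the energy density means 10x the
# equilibrium temperature - far hotter than any plausible computing substrate.
print(round(t_earth), round(t_hot))
```

The fourth-root scaling is the whole point: pushing vastly more energy through a body buys surprisingly little computation while making it catastrophically hot.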
You need to show your work here. Why is it unlikely? Why don't they disassemble solar systems to build ever more cold brains? I keep asking this, and you keep avoiding it.
First let us consider the optimal compute configuration of a solar system without any large-scale re-positioning, and then we'll remove that constraint.
For any solid body (planet, moon, asteroid, etc.), there is some optimal compute design given its structural composition, internal temperature, and incoming irradiance from the sun. Advanced compute tech doesn't require any significant energy - so being closer to the sun is not an advantage at all. You need to expend more energy on cooling (for example, it takes about 15 kilowatts to cool a single current chip from earth temp down to low temps, although there have been some recent breakthroughs in passive metamaterial shielding that could change that picture). So you just use/waste that extra energy on the best cooling you can.
So, now consider moving the matter around. What would be the point of building a dyson sphere? You don't need more energy. You need more metal mass, lower temperatures and smaller size. A dyson sphere doesn't help with any of that.
Basically we can rule out config changes for the metal/rocky mass (useful for compute) that: 1) increase temperature, or 2) increase size.
The gradient of improvement is all in the opposite direction: decreasing temperature and size (with tradeoffs of course).
So it may be worthwhile investing some energy in collecting small useful stuff (asteroids) into larger, denser computational bodies. It may even be worthwhile moving stuff farther from the star, but the specifics really depend on a complex set of unknowns.
One of the big unknowns of course being the timescale, which depends on the transcend issue.
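Among the knowable parts, the temperature payoff of moving computational mass outward can be sketched with a simple radiative-equilibrium estimate (assumptions: blackbody emission, zero albedo, standard solar luminosity):

```python
import math

L_SUN = 3.828e26   # solar luminosity, W
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
AU = 1.496e11      # astronomical unit, m

def t_eq(d_au):
    """Blackbody equilibrium temperature at d_au AU from the sun.

    Absorbed power L/(4*pi*d^2) * pi*r^2 balances emitted power
    4*pi*r^2 * SIGMA * T^4, giving T = (L / (16*pi*SIGMA*d^2))**0.25,
    so temperature falls as 1/sqrt(distance).
    """
    d = d_au * AU
    return (L_SUN / (16 * math.pi * SIGMA * d ** 2)) ** 0.25

# Roughly: 278 K at 1 AU, 124 K at 5 AU, 51 K at 30 AU, 28 K at 100 AU.
for d_au in (1, 5, 30, 100):
    print(d_au, round(t_eq(d_au)))
```

The 1/sqrt(d) falloff is why "farther out" is the natural gradient for cold computation, though the energy cost of actually moving the mass is one of the unknowns.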
Now for the star itself: it has most of the mass, but that mass is not really accessible, and most of it is in low value elements - we want more metals. It could be that the best use of that matter is to simply continue cooking it in the stellar furnace to produce more metals - as there is no other way, as far as I know.
But doing anything with the star would probably take a very long time, so it's only relevant in non-transcendent models.
In terms of predicted observations, in most of these models there are few if any large structures, but individual planetary bodies will probably be altered from their natural distributions. Some possible observables: lower than expected temperatures, unusual chemical distributions, and possibly higher than expected quantities/volumes of ejected bodies.
Some caveats: I don't really have much of an idea of the energy costs of new universe creation, which is important for the transcend case. That probably is not a reversible op, and so it may be a motivation for harvesting solar energy.
There's also KIC 8462852 of course. If we assume that it is a dyson swarm like object, we can estimate a rough model for civs in the galaxy. KIC 8462852 has been dimming for at least a century. It could represent the endphase of a tech civ, approaching its final transcend state. Say that takes around 1,000 years (vaguely estimating from the 100 years of data we have).
This dimming star is one out of perhaps 10 million nearby stars we have observed in this way. Say 1 in 10 systems will ever develop life, and the timescale spread or deviation is about a billion years - then we should expect to observe about 1 in 10 million stars in the endphase dimming state, given that the phase lasts only 1,000 years. This would of course predict a large number of endstate stars, but given that we just barely detected KIC 8462852 because it was dimming, we probably can't yet detect stars that already dimmed and then stabilized long ago.
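The arithmetic above can be laid out explicitly (all inputs are the rough guesses from the paragraph, not measurements):

```python
# Back-of-envelope version of the estimate above.
stars_surveyed = 1e7     # nearby stars observed well enough to notice dimming
p_life = 0.1             # guessed fraction of systems that ever develop life
timescale_spread = 1e9   # years of spread/deviation in civ emergence times
endphase_years = 1e3     # guessed duration of the visible endphase dimming

# Probability that a randomly chosen surveyed star is a life-bearing
# system in its endphase *right now*:
p_dimming_now = p_life * endphase_years / timescale_spread   # 1e-7

expected_dimming_stars = stars_surveyed * p_dimming_now
print(expected_dimming_stars)   # ~1: consistent with one KIC 8462852
```

With these inputs the model predicts roughly one currently-dimming star in the surveyed sample, which is at least consistent with the single anomaly observed.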
Advanced computation doesn't happen at those temperatures
Could it make sense to use an enormous amount of energy to achieve an enormous amount of cooling? Possibly using laser cooling or some similar technique?
Our sun appears to be a typical star: unremarkable in age, composition, galactic orbit, or even in its possession of many planets. Billions of other stars in the Milky Way have similar general parameters and orbits that place them in the galactic habitable zone. Extrapolations from recent exoplanet surveys reveal that most stars have planets, removing yet another potential unique dimension for a great filter in the past.
According to Google, there are 20 billion earth like planets in the Galaxy.
A paradox indicates a flaw in our reasoning or our knowledge, which upon resolution, may cause some large update in our beliefs.
Ideally we could resolve this through massive multiscale monte carlo computer simulations to approximate Solomonoff Induction on our current observational data. If we survive and create superintelligence, we will probably do just that.
In the meantime, we are limited to constrained simulations, fermi estimates, and other shortcuts to approximate the ideal bayesian inference.
The Past
While there is still obvious uncertainty concerning the likelihood of the series of transitions along the path from the formation of an earth-like planet around a sol-like star up to an early tech civilization, the general direction of the recent evidence flow favours a strong Mediocrity Principle.
Here are a few highlight developments from the last few decades relating to an early filter:
The Future(s)
When modelling the future development of civilization, we must recognize that the future is a vast cloud of uncertainty compared to the past. The best approach is to focus on the most key general features of future postbiological civilizations, categorize the full space of models, and then update on our observations to determine what ranges of the parameter space are excluded and which regions remain open.
An abridged taxonomy of future civilization trajectories:
Collapse/Extinction:
Civilization is wiped out by an existential catastrophe that sterilizes the planet thoroughly enough to kill most large multicellular organisms, essentially resetting the evolutionary clock by a billion years. Given the potential dangers of nanotech/AI/nuclear weapons - and then aliens - I believe this possibility is significant: i.e. in the 1% to 50% range.
Biological/Mixed Civilization:
This is the old-skool sci-fi scenario. Humans or our biological descendants expand into space. AI is developed but limited to human-level intelligence, like C-3PO. No or limited uploading.
This leads eventually to slow colonization, terraforming, perhaps eventually dyson spheres etc.
This scenario is almost not worth mentioning: prior < 1%. Unfortunately SETI in its current form is still predicated on a world model that assigns a high prior to these futures.
PostBiological Warm-tech AI Civilization:
This is Kurzweil/Moravec's sci-fi scenario. Humans become postbiological, merging with AI through uploading. We become a computational civilization that then spreads out at some fraction of the speed of light to turn the galaxy into computronium. This particular scenario is based on the assumption that energy is a key constraint, and that civilizations are essentially stellavores which harvest the energy of stars.
One of the very few reasonable assumptions we can make about any superintelligent postbiological civilization is that higher intelligence involves increased computational efficiency. Advanced civs will upgrade into physical configurations that maximize computation capabilities given the local resources.
Thus to understand the physical form of future civs, we need to understand the physical limits of computation.
One key constraint is the Landauer Limit, which states that the erasure (or cloning) of one bit of information requires a minimum of kTln2 joules. At room temperature (293 K), this corresponds to a minimum of 0.017 eV to erase one bit. Minimum is however the keyword here, as according to the principle, the probability of the erasure succeeding is only 50% at the limit. Reliable erasure requires some multiple of the minimal expenditure - a reasonable estimate being on the order of 100kT, or roughly 1 eV, for bit erasures at today's levels of reliability.
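These numbers are easy to verify directly (a sketch using standard CODATA constants):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
EV = 1.602177e-19    # joules per electron-volt

def landauer_joules(temp_k):
    """Landauer limit: minimum energy to erase one bit at temp_k Kelvin."""
    return K_B * temp_k * math.log(2)

# At 293 K the limit is ~0.0175 eV per bit erasure; reliable erasure at
# ~100x the limit costs on the order of 1 eV. The limit itself falls
# linearly with temperature - one motivation for cold computing.
for temp_k in (293, 77, 3):   # room temp, liquid nitrogen, deep space
    print(temp_k, landauer_joules(temp_k) / EV)
```

The linear temperature dependence is worth noting here: a bit erasure at 3 K is roughly 100x cheaper than at room temperature, which foreshadows the cold-tech discussion below.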
Now, the second key consideration is that Landauer's Limit does not include the cost of interconnect, which already dominates the energy cost of modern computing. Just moving bits around dissipates energy.
Moore's Law is approaching its asymptotic end in a decade or so due to these hard physical energy constraints and the related miniaturization limits.
I assign a prior to the warm-tech scenario that is about the same as my estimate of the probability that the more advanced cold-tech (reversible quantum computing, described next) is impossible: < 10%.
From Warm-tech to Cold-tech
There is a way forward to vastly increased energy efficiency, but it requires reversible computing (to increase the ratio of computations per bit erasures), and full superconducting to reduce the interconnect loss down to near zero.
The path to enormously more powerful computational systems necessarily involves transitioning to very low temperatures, and the lower the better, for several key reasons:
Assuming large-scale quantum computing is possible, the ultimate computer is thus a reversible, massively entangled quantum device operating near absolute zero. Unfortunately, such a device would be delicate to a degree that is hard to imagine - even a single misplaced high energy particle could cause enormous damage.
Stellar Escape Trajectories
The Great Game
If two civs both discover each other's locations at around the same time, then MAD (mutually assured destruction) dynamics take over and cooperation has stronger benefits. The vast distances involved suggest that one-sided discoveries are more likely.
Spheres of Influence
Conditioning on our Observational Data
Observational Selection Effects
All advanced civs will have strong instrumental reasons to employ deep simulations to understand and model developmental trajectories for the galaxy as a whole and for civilizations in particular. A very likely consequence is the production of large numbers of simulated conscious observers, a la the Simulation Argument. Universes with the more advanced low temperature reversible/quantum computing civilizations will tend to produce many more simulated observer moments and are thus intrinsically more likely than one would otherwise expect - perhaps massively so.
Rogue Planets
Although the error range is still large, it appears that free floating planets outnumber planets bound to stars, and perhaps by a rather large margin.
Assuming the galaxy is colonized: It could be that rogue planets form naturally outside of stars and then are colonized. It could be they form around stars and then are ejected naturally (and colonized). Artificial ejection - even if true - may be a rare event. Or not. But at least a few of these options could potentially be differentiated with future observations - for example if we find an interesting discrepancy in the rogue planet distribution predicted by simulations (which obviously do not yet include aliens!) and actual observations.
Also: if rogue planets outnumber stars by a large margin, then it follows that rogue planet flybys are more common in proportion.
Conclusion
SETI to date allows us to exclude some regions of the parameter space for alien civs, but the regions excluded correspond to low prior probability models anyway, based on the postbiological perspective on the future of life. The most interesting regions of the parameter space probably involve advanced stealthy aliens in the form of small compact cold objects floating in the interstellar medium.
The upcoming WFIRST telescope should shed more light on dark matter and enhance our microlensing detection abilities significantly. Sadly, its planned launch date isn't until 2024. Space development is slow.