I expect everyone here has an opinion on the Drake Equation. (Comment if I'm wrong.) And that's because it is an easy story to remember and spread. Never mind its glaring inadequacy or the symbols it uses: it gives you a number of alien civilizations and somehow that sticks. I'd like to see if a science meme with similar properties could be created to carry a transhumanist payload. So. Could you convince a random person of the following three points if you wanted to?
- We're getting increasingly confident estimates of the number and distribution of planets in our galaxy.
- The other factors in the Drake Equation have been discussed extensively - they remain guesses until we actually detect something, but at least they aren't likely to shift much before then.
- So we should be able to estimate, very roughly and while mumbling about priors, an expected distance to the next planetary body with primitive life, with sentient life, or with self-improving life (i.e. something like AIs that can exponentially grow that biosphere's cognitive capacity) - see the sketch after this list.
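A minimal back-of-the-envelope sketch of that third point, in Python. Every number below is an illustrative placeholder (the galaxy geometry, the star count, and especially the life fractions), and it treats qualifying planets as spread uniformly through the galactic disk, which they aren't - it's only meant to show how a distance in light-years falls out of Drake-style guesses.

```python
import math

# --- All numbers below are illustrative placeholders, not measured values. ---

# Rough galactic-disk geometry (light-years), assumed for the volume estimate.
GALAXY_RADIUS_LY = 50_000
GALAXY_THICKNESS_LY = 1_000

N_STARS = 2e11            # order-of-magnitude star count for the Milky Way
F_PLANETS = 0.5           # fraction of stars with planetary systems (guess)
N_HABITABLE = 0.2         # habitable planets per system (guess)

# Guessed fractions for each stage of interest; these carry all the uncertainty.
F_PRIMITIVE_LIFE = 1e-2
F_SENTIENT_LIFE = 1e-4
F_SELF_IMPROVING = 1e-6

def expected_distance_ly(fraction: float) -> float:
    """Typical distance to the nearest qualifying planet, assuming they are
    spread uniformly through the galactic disk (a crude approximation)."""
    count = N_STARS * F_PLANETS * N_HABITABLE * fraction
    volume = math.pi * GALAXY_RADIUS_LY**2 * GALAXY_THICKNESS_LY
    density = count / volume                      # planets per cubic light-year
    # Radius of a sphere expected to contain one qualifying planet on average.
    return (3 / (4 * math.pi * density)) ** (1 / 3)

for label, frac in [("primitive life", F_PRIMITIVE_LIFE),
                    ("sentient life", F_SENTIENT_LIFE),
                    ("self-improving life", F_SELF_IMPROVING)]:
    print(f"nearest planet with {label}: ~{expected_distance_ly(frac):,.0f} ly")
```

With these particular guesses you get distances on the order of tens to hundreds of light-years; swap in your own fractions and the structure of the argument stays the same.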
I think you could. And if you do, and if you can give a distance in light-years, regardless of how much you emphasize the low confidence, aliens will suddenly seem more real to that random person. And so will, if not full transhumanism, at least some vague notion that intelligence must grow much like life does. I think that could reach a lot of people.
(If anybody complains that the expectation of some Singularity-like development is ideological: no, it is a reasonable guess based on the current evidence, much like Drake's expectation of every technological civilization's eventual self-destruction was reasonable in his Cold War era.)
The brain I'm typing this from knows too little math or astronomy to do this locally, so I'm throwing out the idea. Anyone care to play with this?
Upvote. The Drake Equation and SETI seem at least as relevant as, say, Pascal's Mugging. GIGO, sure, but the standard dismissal in statistics is to say there's not enough data and more research is needed. Isn't this where Bayes is supposed to win over frequentism, when it comes to imperfect or incomplete information?
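For what it's worth, a minimal sketch of that last point: instead of a single point estimate, put a (placeholder) log-uniform prior on the unknown fraction and propagate it through the same crude galaxy geometry, so the output is a distribution over distances rather than a "not enough data" shrug. The galaxy volume, habitable-planet count, and prior range here are all assumptions.

```python
import math
import random
import statistics

# --- Illustrative sketch: propagate prior uncertainty instead of dismissing
# the question for lack of data. All numbers are placeholder guesses. ---

GALAXY_VOLUME_LY3 = math.pi * 50_000**2 * 1_000   # crude galactic-disk volume
N_HABITABLE_PLANETS = 2e10                         # assumed habitable-planet count

def log_uniform(low: float, high: float) -> float:
    """Draw from a log-uniform prior between low and high."""
    return 10 ** random.uniform(math.log10(low), math.log10(high))

def sample_distance_ly() -> float:
    """One draw of the nearest life-bearing planet's distance, given a
    sampled fraction of habitable planets that develop life."""
    f_life = log_uniform(1e-6, 1e-1)               # wide prior on an unknown fraction
    density = N_HABITABLE_PLANETS * f_life / GALAXY_VOLUME_LY3
    return (3 / (4 * math.pi * density)) ** (1 / 3)

samples = sorted(sample_distance_ly() for _ in range(10_000))
median = statistics.median(samples)
lo, hi = samples[len(samples) // 20], samples[-len(samples) // 20]
print(f"nearest life-bearing planet: median ~{median:,.0f} ly "
      f"(90% interval ~{lo:,.0f}-{hi:,.0f} ly)")
```

It isn't a full Bayesian update (there's no likelihood until we detect something), but it at least states the uncertainty honestly instead of refusing to answer.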
Babyeater FAI would be very different, but could still give us big hints on how to make human FAI. It's the standard scientific process: instead of reinventing the wheel, stand on the shoulders of giants and learn what other smart people who came before have figured out.