I've got to admit, I look at most of these and say "you're treating the social discomfort as something immutable to be routed around, rather than something to be fixed by establishing different norms". Forgive me, but it strikes me (especially in this kind of community, with its high aspie proportion) that it's probably easier to tutor the... insufficiently-assertive... in how to stand up for themselves in Ask Culture than it is to tutor the aspies in how to not set everything on fire in Guess Culture.
Amusingly, the "rare earths" are actually concentrated in the crust relative to cosmic abundance, and thus make awful candidates for asteroid mining, while tellurium, literally named after the Earth, is a chalcophile/siderophile element extremely depleted in the crust and one of the best candidates.
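A quick back-of-the-envelope check of that contrast (a minimal sketch; the abundance figures are rough values from standard references, with CI chondrites standing in for "universal" abundance, and should be read as order-of-magnitude only):

```python
# Crustal enrichment/depletion relative to CI chondrites (a common proxy
# for solar-system abundances). All figures are rough literature values,
# good to an order of magnitude at best.
CRUST_PPM = {"cerium": 66.0, "tellurium": 0.001}      # ~66 ppm vs ~1 ppb
CHONDRITE_PPM = {"cerium": 0.62, "tellurium": 2.3}    # CI-chondrite values

for element, crust in CRUST_PPM.items():
    ratio = crust / CHONDRITE_PPM[element]
    print(f"{element}: crust/chondrite ~ {ratio:.0e}")

# cerium (a "rare earth"):  ~1e+02 -> enriched in the crust
# tellurium:                ~4e-04 -> depleted ~1000x, so asteroids win
```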
It strikes me that I'm not sure whether I'd prefer to lose $20,000 or have my jaw broken. I'm pretty sure I'd rather have my jaw broken than lose $200,000, though. So, especially in the case where the money cannot actually be recovered from the thief, I would tend to think the $200,000 theft should be punished more harshly than the jaw-breaking. And, sure, you've said that the $20,000 theft would be punished more harshly than the jaw-breaking, but that's plausibly just because 2 days is too long for a $100 theft to begin with.
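To put numbers on that last point (a hypothetical sketch: the linear 2-days-per-$100 scale below is my assumed reading of the proposal, not something stated above):

```python
# Hypothetical: theft sentences scaling linearly from a base of 2 days
# per $100 stolen (an assumed reading of the proposal, for illustration).
DAYS_PER_100 = 2  # assumed base rate

def theft_sentence_days(amount: float) -> float:
    """Linear sentence in days for a given amount stolen (assumption)."""
    return DAYS_PER_100 * amount / 100

for amount in (100, 20_000, 200_000):
    days = theft_sentence_days(amount)
    print(f"${amount:>7,}: {days:>6,.0f} days (~{days / 365:.1f} years)")

# $    100:      2 days (~0.0 years)
# $ 20,000:    400 days (~1.1 years)
# $200,000:  4,000 days (~11.0 years)
#
# Halving the base rate halves every sentence, so "the $20,000 theft beats
# the jaw-breaking" can be an artifact of the base rate rather than of the
# relative ordering of harms.
```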
I mean, most moral theories give one of three answers here: "zero", "as large as can be fed", or "a bit less than as large as can be fed". Given the potential to scale feeding in the future, the latter two round off to "infinity".
I think the basic assumed argument here (though I'm not sure where or even if I've seen it explicitly laid out) goes essentially like this:
The earliness of life's appearance on Earth isn't amazingly-consistent with that appearance being a filter-break. It suggests either that abiogenesis is relatively-easy or that panspermia is easy (and in the latter case, as I noted, abiogenesis could be as hard as you like, but that doesn't explain the Great Silence).
Frankly, it's premature to be certain it's "abiogenesis rare, no panspermia" before we've even got a close look at Earthlike exoplanets.
I'll note that most of the theorised catastrophes in that vein look like either "planet gets ice-nined", "local star goes nova", or "blast wave propagates at lightspeed forever". The first two of those are relatively-easy to work around for an intelligent singleton, and the last doesn't explain the Fermi observation since any instance of that in our past lightcone would have destroyed Earth.
I've read most of that paper (I think I've seen it before, although there could be something else near-identical to it; I know I've read multiple[1] papers that claim to solve the Fermi Paradox and do not live up to their hype). TBH, I feel like it can be summed up as "well, there might be a Great Filter somewhere along the line, therefore no paradox". I mean, no shit there's probably a Great Filter somewhere, that's the generally-proposed resolution that's been going for decades now. The question is "what is the Filter?". And saying "a Filter exists" doesn't answer that question.
We've basically ruled out "Earthlike planets are rare". "Abiogenesis is rare" is possible, but note that you need "no lithopanspermia" for that one to hold up since otherwise one abiogenesis event (the one that led to us and which is therefore P = 1, whether on Earth or not) is enough to seed much of the galaxy. "Intelligence is rare" is a possibility but not obviously-true, ditto "technology is rare". Late filters (which, you will note, the authors assume to be a large possibility) appear somewhat less plausible but are not ruled out by any stretch. So yeah, it's still a wide-open question even if there are some plausible answers.
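For concreteness, here's a Drake-style factorization that the candidate filters above slot into (a sketch with purely illustrative toy probabilities, chosen only to show how one tiny factor is enough to silence the sky):

```python
# Drake-style factorization: each candidate Great Filter corresponds to one
# factor being tiny. The numbers are illustrative toys, not estimates.
FACTORS = {
    "earthlike_planets": 0.1,    # "basically ruled out" as the filter
    "abiogenesis":       1e-10,  # candidate (needs "no lithopanspermia")
    "intelligence":      0.01,   # possible, not obviously tiny
    "technology":        0.1,    # ditto
    "late_filter":       0.1,    # "less plausible but not ruled out"
}

expected = 1.0
for name, p in FACTORS.items():
    expected *= p
print(f"~{expected:.0e} expected civilizations per candidate star")  # ~1e-15

# Any single tiny factor "resolves" the paradox; the open question is which
# factor is the tiny one, not whether one exists.
```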
[1] The other one I recall, besides Grabby Aliens, was one saying that Earthly life is downstream of a lithopanspermia event, so there's no Fermi paradox; I don't get it either.
There is also the possibility of the parties competing over it to avoid looking "soft on AI", which is of course the ideal.
To the extent that AI X-risk has the potential to become partisan, my general impression is that the more likely split is Yuddite-right vs. technophile-left. Note that it was a Fox News reporter who put the question to the White House Press Secretary following Eliezer's TIME article, and a Republican (John Kennedy) who talked about X-risk in the Senate hearing in May, while the Blue-Tribe thinkpieces typically take pains to note that they think X-risk is science fiction.
As a perennial nuclear worrier, I should mention that while any partisan split is non-ideal, this one's probably preferable to the reverse, insofar as a near-term nuclear war (which would fall disproportionately on cities, and hence on the Blue Tribe) would mean the culture war ends in Red Tribe victory.
>Second, I imagine that such a near-miss would make Demis Hassabis etc. less likely to build and use AGIs in an aggressive pivotal-act-type way. Instead, I think there would be very strong internal and external pressures (employees, government scrutiny, public scrutiny) preventing him and others from doing much of anything with AGIs at all.
I feel I should note that while this does indeed form part of a debunk of the "good guy with an AGI" idea, it is in and of itself a possible reason for hope. After all, if nobody anywhere dares to make AGI, then AGI X-risk isn't going to happen. The trouble is getting the Overton Window to the point where the bloodthirst needed to actually produce that outcome is seen as something other than insanity: i.e. nuclear-armed countries saying "if anyone attempts to build AGI, everyone who cooperated in doing it hangs or gets life without parole; if any country does not enforce this vigorously, we will invade; and if that country has nukes or a bigger army than ours, we pre-emptively nuke it, because its retaliation is still higher-EV than letting it finish". A warning shot could well pull that off.
This is not a permanent solution - questions of eventual societal relaxation aside, humanity cannot expand past K2 without the Jihad breaking down, unless FTL is a thing - but it buys a lot of breathing time, which you note is the key missing ingredient in a lot of these plans.