Why does SI/LW focus so much on AI-FOOM disaster, with apparently much less concern for things like
- bio/nano-tech disaster
- Malthusian upload scenario
- highly destructive war
- bad memes/philosophies spreading among humans or posthumans and overriding our values
- upload singleton ossifying into a suboptimal form compared to the kind of superintelligence that our universe could support
Why, for example, is lukeprog's strategy sequence titled "AI Risk and Opportunity", instead of "The Singularity: Risks and Opportunities"? Doesn't it seem strange to assume that both the risks and the opportunities must be AI-related, before the analysis even begins? Given our current state of knowledge, I don't see how we can reach such conclusions with any confidence even after a thorough analysis.
SI/LW sometimes gives the impression of being a doomsday cult, and it would help if we didn't concentrate so much on a particular doomsday scenario. (Are there any doomsday cults that say "doom is probably coming, we're not sure how but here are some likely possibilities"?)
That's a much lower standard than "should Luke make this a focus when trading breadth against speed in writing his document". If people get enthused about that, they're welcome to. I've probably put 50-300 hours (depending on how inclusive a criterion I use for relevant hours) into the topic, and saw diminishing returns. If I cross paths with Eric Drexler or similar folks at a venue I would inquire, and I would read a novel contribution, but given my alternatives I'm not going to be putting much more into it anytime soon.
I agree that it's a lower standard. I didn't mean to endorse Wei's claims in the original post, certainly not on the basis of nanotech alone. If you don't personally think it's worth more of your time to pay attention to nanotech, I'm sure you're right, but it still seems like a collective failure of attention that we haven't talked about it at all; you'd expect some people here to have a pre-existing interest. If you ever think it's worth further describing the conclusions of those 50-300 hours, I'd certainly be curious.