Hard nanotech (the kind usually envisioned in sci-fi) may be physically impossible, and at the very least is extremely difficult. The kinds of nanotech that are more feasible are 1) top-down lithography (i.e. chips) and 2) bottom-up cellular biology, or some combination thereof.
Biological cells are already near-optimal nanotech robots in both practical storage density and computational energy efficiency (approaching the Landauer limit). Even a superintelligence, no matter how clever, will not be able to design nanobots that are vastly more generally capable than biological cells. Nanoscale robots are fundamentally limited by energy efficiency and storage density, and biology is already operating near the physical limits on those key constraints. So plausible bottom-up nanotech just looks like more 'boring' advanced biotech.
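To make the energy-efficiency claim a bit more concrete, here's a rough back-of-the-envelope sketch in Python. It compares the Landauer limit at roughly body temperature with the free energy of hydrolyzing a single ATP molecule, which I'm using (as an assumption, not a precise model of cellular computation) as a crude proxy for one elementary biochemical operation; the numbers (310 K, ~30.5 kJ/mol) are standard textbook values:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # roughly body temperature, K (assumed)

# Landauer limit: minimum energy needed to erase one bit of information
landauer_J = k_B * T * math.log(2)

# Standard free energy of ATP hydrolysis, ~30.5 kJ/mol, used here as a
# crude proxy for the cost of one elementary biochemical operation
N_A = 6.02214076e23  # Avogadro's number, 1/mol
atp_J = 30.5e3 / N_A

print(f"Landauer limit at {T:.0f} K: {landauer_J:.2e} J per bit")
print(f"ATP hydrolysis:           {atp_J:.2e} J per molecule")
print(f"ATP / Landauer ratio:     ~{atp_J / landauer_J:.0f}x")
```

On these numbers a single ATP hydrolysis lands within a couple of orders of magnitude of the Landauer limit, which is the (loose) sense in which I mean that biology operates near the physical limits.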
It would make evolutionary sense for current cells to be near-optimal, and therefore for there to be little room for biotech/nanotech to do big, powerful new things. However, I notice that this leaves me confused about two things.
First, common rhetoric in the rationalist community treats biotech as a big risk. For example, Robin Hanson advocated banning mirror cells, and I regularly hear people suggest working on pandemic prevention as an x-risk priority, or talk about how dangerous gain-of-function research is.
Second, there's the personal experience that we just had t...
I don't know much about nanotech/biotech, but what little I know suggests that this will be the earliest failure point at which AI could cause doom for humans. Because of this, I thought I should start learning more about nanotech/biotech, and that asking LessWrong for direction might be a good place to start.
My heuristic for why nanotech/biotech is critical, and for why I am lumping them together:
Now, this was all a thought I came up with yesterday based on very little knowledge of nanotech/biotech, so it might be totally wrong and naive. But it seems very different from the common AI risk models, so I thought it would be strategically super important to consider whether it's true.