Import background assumption of AGI risk being really bad.
(Wild speculative ideas, babbling)
How hard would it be to make a drug or genetically engineered microorganism that made humans more afraid of AI? (Or some similar mental effect) Is it something that could reasonably be made without AGI and with the sort of resources such a project could reasonably access?
How hard would it be to distribute such a substance to at least the portion of humanity most likely to develop AGI? Various distribution methods suggest themselves: dosing food or water; making it also do something else that people might want, then giving or selling it (baldness cure / side effects may include excessive fear of AI / maybe don't mention that); making it literally infectious.
Would this actually result in AGI not being built until we know what we are doing?
In short, I am trying to tell if this is a wild but potentially workable idea, or just rubbish.
My (admittedly limited) knowledge of psychology and neurosciences suggests that this is not currently possible. Thankfully.
I feel like if you start seriously considering things whose implications are themselves almost as bad as AI ruin in order to address potential AI ruin, you've taken a wrong turn somewhere.
If you can create a virus or something of the sort that makes people genuinely afraid of some vague abstract thing, you can make them scared of anything at all. Do I really need to spell out how that would be abused?
On the other hand, do you really need to go that far?
Launch a media campaign and you can get most of the same results without making the world much more dystopian than it already is.
The main risk here is that it's easy to scare people so much that all of the research gets shut down. And I expect that to be the reason there's not much of a scare about it in the media yet. As far as I remember, that's why most researchers in the field were at first reluctant to admit there's a risk at all.
That's true if capability is there already.
If the capability is maybe, possibly there, but would require a lot of research to confirm the possibility and even more to get it going, I'd suggest that we might deal with it by assessing the risks and not going down that route.
I mean, that's precisely what this community seems to think about GoF research.