Greetings, LessWrong community. I have written this prepping guide to explain the threats of AGI in simple terms and to propose the best course of action on an individual level, based on rational and logical thinking and strategies.
I am very concerned with how the topic is handled in the public sphere and on a personal level. I have identified and tried to explain many logical fallacies that people commit, including many if not most experts who make statements on this topic.
One of those fallacies is remaining inactive or relying on herd mentality in a situation where the individual feels helpless. Another concerns how to handle threats, depending on their magnitude, in the absence of proof and certain knowledge. Another is basing your actions purely on faith in your personal beliefs about the outcome, when you cannot rule out other outcomes to any meaningful degree.
I have seen many people on this site argue very illogical things, such as that it makes no sense to concern yourself with societal collapse brought on by AGI because it would end the world as we know it anyway, or that we are incapable of knowing what to do because we have not yet established what will truly happen.
While it might be true that some people would be content to do nothing and invest nothing to address any risks, gambling as if playing Russian roulette, I don't think most people would recognize this suicidal mentality as a reasonable strategy for themselves and their families, if they only gave the topic sufficient thought and consideration.
I would love to hear your feedback on my guide and have it analyzed through a rigorous logical lens, in keeping with LessWrong's principles of logic and reason. I have been reading LessWrong from time to time, but I am not actually part of this community.
Unfortunately, I have not been able to support most of the logic proposed in this guide with a more academic framework for assessing and managing (existential) risk, because such a framework just doesn't seem to exist.
Here is the direct onion link for faster access:
http://prepiitrg6np4tggcag4dk4juqvppsqitsvnwuobouwkwl2drlsex5qd.onion/
Edit: If you downvote, can you please explain your rationale?
The audience is the general public: anyone who has the attention span and is smart enough to read what I wrote without feeling the need to disregard the idea out of comfort, personal conviction, laziness, etc. I was toying with the idea of writing a much shorter version for stupid people, but I think that would be an exercise in futility. I also don't really like the idea of stupid people gaining such an existential advantage.
Unfortunately, I am not allowed to create another post for the next 7 days due to low karma. However, I have written a new post that I will publish soon. It is quite long. If you are interested, you can read it here:
https://pastebin.com/7WR0P8ZM
Maybe you could tell me how much it is on track, based on your experience with this community. I would also like to understand any ways my perspective could be flawed. I have been reading quite a bit here and watching some videos. Although heavily influenced by your replies, the post is not meant to attack you, but rather those who follow Eliezer's authority and influence without thinking for themselves.
I don't write, and will not write, these things to please people, though. If people cannot jump over their own shadow and process dissonance in a healthy, constructive way, then so be it.