Let's do an experiment in "reverse crowdfunding". I will pay 50 USD to anyone who can suggest a new way of X-risk prevention that is not already mentioned in this roadmap. Post your ideas as a comment on this post.
Should more than one person post the same idea, the award will go to the person who posted it first.
The idea must be endorsed by me and included in the roadmap in order to qualify, and it must be new, rational and consistent with modern scientific data.
I may include you as a co-author in the roadmap (if you agree).
The roadmap is distributed under an open GNU license.
Payment will be made via PayPal. The total prize fund is 500 USD (10 prizes in total).
The competition is open until the end of 2015.
The roadmap can be downloaded as a pdf from:
http://immortality-roadmap.com/globriskeng.pdf
UPDATE: I have uploaded a new version of the map, with changes marked in blue.
Email: alexei.turchin@gmail.com
Simulation is an X-risk in the sense that we could stagnate our universal drive toward growth, live in a simulation for the rest of our lives, and ultimately extinguish ourselves from existence.
Bio-hacking is an X-risk because, if handled wrongly, you would encourage all these small biotech interests and end up with someone doing it unsafely.
The failure of small biohack groups could probably be classified as controlled regression -> small catastrophe, similar to the small nuclear catastrophes of recent history and their ability to discourage future risk-taking behaviour in the area.
The advantage of widespread bio-hack groups is less reliance on existing big businesses to save us with vaccines and the like.
Indeed the suggestion of "Invite the full population to contribute to solving the problem" might be a better description.
New suggestion: "lower the barriers of entry into the field of assistance in X-risk". Easy explanation of the X-risks; easier availability of resources to attempt solutions. Assuming your main x-risks are 1. biotech; 2. nanotech; 3. nuclear, 4. climate change and 5. UFAI)
New suggestion: Teach X-risk from age five upwards, so that the next generation of humans understands that when they play with these kinds of powerful forces, they risk a whole lot more than they realise (hopefully before an X-risky accident has to explicitly warn people about such things).
New idea, which I don't think you covered: lock down all risk areas beneath piles of bureaucracy, paperwork, safety requirements and bullshit. No one gets to work on nuclear technology or biotech without ridiculous safety standards, no one gets to create pollution without being arrested and charged, and no one gets to code learning machines without strict supervision.
I like the ideas about risk education and about bureaucracy. I think I should include them in the map and award you two prizes. How can I transfer them?