Let’s do an experiment in “reverse crowdfunding”: I will pay 50 USD to anyone who can suggest a new way of X-risk prevention that is not already mentioned in this roadmap. Post your ideas as a comment to this post.
Should more than one person have the same idea, the award will be made to the person who posted it first.
The idea must be endorsed by me and included in the roadmap in order to qualify, and it must be new, rational and consistent with modern scientific data.
I may include you as a co-author in the roadmap (if you agree).
The roadmap is distributed under an open GNU license.
Payment will be made via PayPal. The total prize fund is 500 USD (10 prizes in total).
The competition is open until the end of 2015.
The roadmap can be downloaded as a PDF from:
http://immortality-roadmap.com/globriskeng.pdf
UPDATE: I have uploaded a new version of the map, with changes marked in blue.
Email: alexei.turchin@gmail.com
I have an idea related to Plan B – Survive the Catastrophe.
The unfortunate reality is that we do not have enough resources to effectively prepare for all potential catastrophes. Therefore, we need to determine which catastrophes are more likely and adjust our preparation priorities accordingly.
I propose that we create/encourage/support prediction markets in catastrophes, so that we can harness the “wisdom of the crowds” to determine which catastrophes are more likely. Large prediction markets are good at determining relative probabilities.
Of course, the prediction market contracts could not be based on an actual extinction event because no one would be alive to collect the payoff! However, if the contracts are based on severe (but not existential) events, they would still help us infer more accurate estimates for extinction event probabilities.
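To make the inference step concrete, here is a minimal sketch of how contract prices on such markets could be turned into relative probability weights. The event names and prices below are purely hypothetical placeholders, not real market data; the sketch assumes each contract pays out 1 USD if a severe (but not existential) event of that type occurs before some deadline, so its price is a rough market estimate of that event's probability.

```python
# Sketch: inferring relative catastrophe likelihoods from
# hypothetical prediction-market contract prices.
# Assumption: each contract pays $1 if a severe (non-existential)
# event of the given type occurs, so price ~ implied probability.

hypothetical_prices = {  # illustrative numbers only, not real data
    "engineered pandemic": 0.012,
    "nuclear exchange":    0.008,
    "runaway AI incident": 0.004,
    "nanotech accident":   0.001,
}

# Normalize prices into relative weights: what fraction of the
# total perceived severe-event risk each category accounts for.
total = sum(hypothetical_prices.values())
relative = {name: price / total for name, price in hypothetical_prices.items()}

# Rank catastrophe types by their relative weight, most likely first.
for name, share in sorted(relative.items(), key=lambda kv: -kv[1]):
    print(f"{name:22s} relative weight: {share:.0%}")
```

The absolute prices would still need a separate (and contestable) extrapolation from "severe" to "existential" severity, but the ranking itself is exactly the prioritization signal the comment describes.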
I think you have two ideas here:
I don't buy the first one... though in fact the prizes I suggested in the opening post are something like it. I mean, the idea of using money to extract the wisdom of the crowd is good, but a prediction market is not the best vehicle, because the majority of people have a lot of strange ideas about x-risks, and such ideas would dominate.
The idea of preparing for the most probable catastrophe is better. In fact, we could build bio and nuclear refuges, but not AI and nanotech...