Elo comments on Roadmap: Plan of Action to Prevent Human Extinction Risks - Less Wrong Discussion

13 Post author: turchin 01 June 2015 09:58AM

Comment author: Elo 02 June 2015 09:56:01PM *  0 points [-]

Meta: I honestly didn't read the plan in full the first two times I posted. Instead I went to Wikipedia and looked up global catastrophic risk. Once I understood the definition of a global catastrophic risk, I thought up solutions ("How would I best solve X?") and checked whether they were on the map.

I share this because the first several things I thought of were not on the map. It also seems like several other answers are limited to "what's just outside the box" ("think outside the box" is a silly concept, because it often involves people telling you exactly where the box is and where to think outside of it) and are anchored by things near the existing map. I am not sure you are getting great improvements to the map from the way you have set up the problem.

New idea: if I were hosting the map, I would include a selection of known x-risk problems. Something like:

AI:

  • paperclippers
  • UFAI
  • oppressive AI (modifies our quality of life)
  • trickster AI (an AI built with limits, e.g. "maximise human happiness", that redefines its own reference terms for "human" and "happiness" and kills all old humans who are not happy)

Nanotechnology:

  • 2nd-generation molecular assemblers that escape containment (on purpose or by accident)
  • a race to profit that includes a game of chicken over who takes the highest risk

Biotechnology:

  • a new disease with high virulence and no known relationship to existing diseases (difficult to cure)
  • a new strain of an old disease (known effects, and a race to fight it off)

Nuclear:

  • catastrophic death of all life from nuclear war and ongoing radiation
  • reduction of lifespan due to radiation-induced cancer (possibly reducing us back to a pre-colonial civilisation)
  • a concerning rate of mutation due to radiation (either in humans, or in things that harm us or ensure our wellbeing, e.g. viruses, food supplies)

Global climate:

  • planet becomes uninhabitable to humans
  • planet becomes less habitable to humans, slowing the growth of science/technology
  • humans are forced underground, limiting the progress of scientific research or our ability to sustain food production
  • humans are cut off from each other and forced to live in small colonies

And, alongside this, a note of what each of the solutions on the solution map might help solve.

Edits: formatting. I still can't get the hang of formatting after all this time!

Edit: it looks like you are working on other maps at http://immortality-roadmap.com/.

Comment author: turchin 02 June 2015 10:19:52PM *  1 point [-]

Yes, the site is not finished, but the map "Typology of human extinction risks" is ready and will be published next week. Around 100 risks will be listed. Any roadmap has limitations because of its size and its basic 2D structure. Of course we could and should cover all options for all risks, but that should be done in more detail. Maybe I should make a map there in which ways of prevention are suggested for each risk.

Comment author: Elo 02 June 2015 10:32:45PM 0 points [-]

I didn't really know which x-risks you were talking about, which is why a map of x-risks would have helped me.

Comment author: turchin 02 June 2015 10:40:09PM 0 points [-]

Basically the same risks you listed here. I can PM you the map.