djm comments on Engaging First Introductions to AI Risk - Less Wrong Discussion

20 Post author: RobbBB 19 August 2013 06:26AM

Comment author: djm 19 August 2013 03:39:04PM 0 points

I would really like to see a "What can you do to help" section. In fact, maybe we should be seriously thinking about concrete ways to let non-mathematicians also contribute to solving this problem.

Comment author: RobbBB 19 August 2013 07:06:17PM 1 point

What can you do to help? In order of priority, I think the top choices for non-specialists are:

  1. Get more money funneled into MIRI and FHI.
  2. Memetically propagate informed worry about AI risk. Find ways to make the idea stick.
  3. Improve your own and others' domain-general rationality.
  4. Try to acquire math knowledge and skills that might be in demand by FAI researchers over the next few decades.

If this mix of sequences does its job, 2 is simple enough: tell people to begin by sharing the list. (Or other targeted articles and lists, depending on the target audience.)

3 is a relatively easy sell, and the primary way I expect to contribute.

4 is quite difficult and risky at this stage.

1 is hard to optimize at the meta level because good signaling is hard. Our default method for beginning to combat high-level biases, which is to write a Sequence focused on the specific issue to bring it to people's attention, is unusually tricky to pull off here. Something off-site and independent might be more effective: a petition, signed by lots of smart people, telling everyone that AI risk is important? An impassioned personal letter by a third party, briefly and shibbolethlessly laying out the Effective Altruism case for MIRI?