Cyan comments on The Future of Humanity Institute could make use of your money - LessWrong

52 Post author: danieldewey 26 September 2014 10:53PM




Comment author: Cyan 03 October 2014 05:53:31AM · 2 points

The stated core goal of MIRI (the old SIAI) is to develop friendly AI. With regard to that goal, the Sequences are advertising.

Kinda... more specifically, a big part of what they are is an attempt at insurance against the possibility that someone out there (probably young) has more innate potential for FAI research than EY himself possesses, but would otherwise never find out about FAI research at all.