Evan_Gaensbauer comments on Room For More Funding In AI Safety Is Highly Uncertain - Less Wrong Discussion

Post author: Evan_Gaensbauer 12 May 2016 01:57PM

Comment author: turchin 13 May 2016 11:11:55AM 1 point

I think that too much investment could result in more noise in the field. First, it would produce a large number of published materials, which could exceed the capacity of other researchers to read them; as a result, the genuinely interesting works would go unread. It would also attract more people to the field than there are actually clever and dedicated people available. If we have 100 trained AI safety researchers, which is an overestimate, and we hire 1,000 people, then the real researchers will be diluted. In some fields, like nanotech, overinvestment has even resulted in the expulsion of the original researchers, because they prevented less qualified ones from spending money as they wished. But the most dangerous outcome would be the creation of many incompatible theories of friendliness, and even AIs based on them, which could result in AI wars and extinction.

Comment author: Evan_Gaensbauer 14 May 2016 06:52:06AM 2 points

Yeah, I read Eliezer's chapter "Artificial Intelligence as a Positive and Negative Factor in Global Risk" in Global Catastrophic Risks, and I was impressed by how far in advance he anticipated reactions to the rising popularity of AI safety, what it might be like when the public finally switched from skepticism to genuine concern, and what that shift might start to look like. Eliezer also anticipated that even safety-conscious work on AI might increase AI risk.

The idea that some existing institutions in AI safety, perhaps MIRI, should expand much faster than others so they can keep up with all the published material coming out, and evaluate it, is a neglected one.