
Nick_Roy comments on Q&A #2 with Singularity Institute Executive Director

Post author: lukeprog, 13 December 2011 06:48AM (9 points)


Comment author: Nick_Roy, 13 December 2011 07:41:52PM (5 points)

Since it's difficult to predict the date of the invention of AGI, has SI thought about, or made plans for, how to work on the FAI problem for many decades, or perhaps even centuries, if necessary?

Comment author: Curiouskid, 15 December 2011 01:55:15AM, edited (5 points)

As a subset of this question, do you think that establishing a school with the express purpose of training future rationalists/AGI programmers from an early age is a good idea? Don't you think that people who've been raised with strong epistemic hygiene should be building AGI rather than people who didn't acquire such hygiene until later in life?

The only reasons I can see for it not working would be: (1) predictions that AGI will arrive before the next generation of rationalists comes along (which is also a question of how early to start such an education program), or (2) the belief that our current researchers are up to the challenge. (Even then, having lots of people who've had a structured education designed to produce the best FAI researchers would undeniably reduce existential risk, no?)

EDIT (for clarification): Eliezer has said:

"I think that saving the human species eventually comes down to, metaphorically speaking, nine people and a brain in a box in a basement"

Just as they would be building an intelligence greater than themselves, so too must we build human intelligences greater than ourselves.

Comment author: Normal_Anomaly, 26 December 2011 07:55:15PM (0 points)

"The only reasons I can see for it not working would be: (1) predictions that AGI will arrive before the next generation of rationalists comes along (which is also a question of how early to start such an education program), or (2) the belief that our current researchers are up to the challenge. (Even then, having lots of people who've had a structured education designed to produce the best FAI researchers would undeniably reduce existential risk, no?)"

I can't speak for the SIAI, but to me this sounds like a suboptimal use of resources and bad PR. It trips my "this would sound cultish to the average person" buzzer. Starting a school that claimed to "emphasize critical thinking" as a way to train rationalists might be a good idea for someone with administrative talents who wanted to work on x-risk, but I can't see SIAI doing it.

Comment author: Curiouskid, 27 December 2011 03:31:46AM (1 point)

How would you distribute resources? I think this is a natural response if one accepts the premise that the main bottleneck to AGI is a few key insights by geniuses (as Eliezer says).

Why do we care if people who aren't logical enough to see the reasoning behind the school think we're cultish?