MichaelGR comments on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (682)
That's what I've been told, but I'm not entirely convinced. Since predicted timelines vary so widely, and since fundamental breakthroughs are hard to anticipate, I think this still deserves attention as soon as possible, if only to know what to do if things start moving rapidly (an AGI team might not get many chances to recover from security mistakes).
I'll broaden my question a bit so that it applies to everyone working on AGI, not just the SIAI.