Johnicholas comments on Less Wrong Q&A with Eliezer Yudkowsky: Ask Your Questions - Less Wrong
If you thought an AGI couldn't be built, what would you dedicate your life to doing? Perhaps another formulation, or a related question: what is the most important problem/issue not directly related to AI?
At the Singularity Summit, this question (or one similar) was asked, and (if I remember correctly) EY's answer was something like: If the world didn't need saving? Possibly writing science fiction.
Cool. But say the world does need saving: would there be a way to do it that doesn't involve putting something smarter than us in charge?