Wix comments on Q&A #2 with Singularity Institute Executive Director - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (47)
How much do members' predictions of when the singularity will happen differ within the Singularity Institute?
Eliezer Yudkowsky wrote:
...
There is more there; it's best to start here and read all the way down to the bottom of that thread. I think that discussion captures some of the best arguments in favor of Friendly AI in the most concise form you can currently find.