turchin comments on Using the Copernican mediocrity principle to estimate the timing of AI arrival - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (17)
Yes, but we need to add "Humanity goes extinct before this date", which is also possible. ((( A sufficiently large catastrophe, like a supervirus or nuclear war, could prevent AI creation.
That would be another way for exponential growth in human AI research to stop, yes. You can think of it as one of the options under "(etc.)", or as a special case of "not enough resources".