Stuart_Armstrong comments on Against easy superintelligence: the unforeseen friction argument - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Yes, but extra positive noise doesn't make much difference in the strong FOOM scenarios. If a bad AI fooms in 30 minutes rather than an hour, or even in 5 minutes, we're still equally dead.
Positive noise might mean being able to FOOM from a lower starting base.
Point taken, though I've already increased my probability of an early FOOM ( http://www.youtube.com/watch?v=ad4bHtSXiFE )
And I stand by the point that most noise will be negative. Changing random things in, say, the Earth's ecosystem may open up great new opportunities, but is more likely to cost us than benefit us.