gwern comments on Open thread, October 2011 - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I estimate that the probability of a currently working and growing superintelligence is in the range of 1/1,000,000 to 1/1,000. I am at least 50% confident that this is so.
Not a big probability, but given the immense importance of such an object, it is already a significant possibility to consider. The very-near-term birth of a superintelligence is something to think about. It wouldn't be just another Sputnik, launched by people you thought were unable to build it but who turned out to be perfectly capable; we know that story well. It wouldn't be a minor blow to pride, as Sputnik was for some, and a triumph for the others who conceived and launched it.
No, it could be checkmate on the first move.
Nonetheless, people are dismissive of any short-term success in the field. I am not, and I want to say so in an open thread.
The probability is already just an expression of your own uncertainty. Giving a confidence interval over the probability does not make sense.
If you can have a 95% confidence interval, why can't you have a >50% confidence interval as well?
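To make the disputed notion concrete: one way to read "a confidence interval over a probability" is as a Bayesian credible interval over p under a second-order distribution. A minimal sketch, assuming (purely for illustration, not from either commenter) that log10(p) is uniform over the stated range of 1/1,000,000 to 1/1,000, estimated by Monte Carlo with the standard library only:

```python
import random

# Hypothetical second-order model: treat p itself as uncertain rather
# than a single number.  Assume log10(p) is uniform on [-6, -3], i.e.
# p between 1/1,000,000 and 1/1,000 -- the range stated in the comment.
random.seed(0)
samples = sorted(10 ** random.uniform(-6, -3) for _ in range(100_000))

def central_interval(xs, mass):
    """Central credible interval containing `mass` of the sorted samples."""
    lo = xs[int(len(xs) * (1 - mass) / 2)]
    hi = xs[int(len(xs) * (1 + mass) / 2) - 1]
    return lo, hi

lo50, hi50 = central_interval(samples, 0.50)
print(f"50% credible interval for p: [{lo50:.2e}, {hi50:.2e}]")
```

Under this (assumed) prior, the 50% interval is simply the middle half of the log-uniform distribution; whether such second-order probabilities are meaningful is exactly what the two comments dispute.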
50% confidence intervals are standard practice, but that is not the point and not what I questioned.
There is no reading of my comment on which your reply makes sense in context.