Kingreaper comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
The "default case" occurs when not specifically avoided.
The company making the OS upgrade is going to do its best to keep the computers it's installed on from crashing. In fact, it will probably hire quality-control experts to make certain of it.
Why should AGI not have quality control?
It definitely should have quality control.
The whole point of the 'Scary Idea' is that there should be effective quality control for AGI; otherwise the risks are too big.
At the moment humanity has no idea how to perform such quality control, which would require some way to check whether an arbitrary AI-in-a-box is Friendly.
Ergo, if an AGI is launched before the Friendly AI problem has some solutions, that AGI was launched without any quality control. Scary. At least to me.