Kingreaper comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong

32 Post author: ciphergoth 30 October 2010 09:31AM




Comment author: Kingreaper 31 October 2010 11:12:47AM

The "default case" occurs when not specifically avoided.

The company making the OS upgrade will do its best to keep the computers it's installed on from crashing. In fact, it will probably hire quality-control experts to make certain of it.

Why should AGI not have quality control?

Comment author: PeterisP 01 November 2010 09:16:10PM

It definitely should have quality control.

The whole point of the 'Scary Idea' is that there should be effective quality control for AGI; otherwise the risks are too big.

At the moment, humanity has no idea how to perform effective quality control, which would be some way to check whether an arbitrary AI-in-a-box is Friendly.

Ergo, if an AGI is launched before the Friendly AI problem has some solutions, it means that AGI was launched without quality control. Scary. At least to me.