XiXiDu comments on Muehlhauser-Wang Dialogue - Less Wrong
What makes you believe that his expected-utility calculation about reading Bostrom's paper suggests it is worth reading?
He answered that in the interview.
He wrote that AIs with fixed goal architectures can't be generally intelligent, and that AIs with unpredictable goals can't be guaranteed to be safe, but that we have to do our best to educate them and restrict their experiences.
He answered by asserting it, but he didn't explain it.
He answered, but didn't show that. (This does not represent an assertion that he could have, or that, in the circumstances, he should necessarily have tried.) (The previous disclaimer doesn't represent an assertion that I wouldn't claim he'd have no hope of showing that credibly, just that I wasn't making such a criticism right now.) (The second disclaimer was a tangent too far.)
The latter claim is the one that seems the most bizarre to me. He seems to assume not just that the AIs humans create will be programmed to respond desirably to 'education' regarding their own motivations, but that all AIs must necessarily do so. And then there is the idea that you can prevent a superintelligence from rebelling against you by keeping it sheltered. That doesn't even work on mere humans!
You're assuming that an AI can, in some sense, be (super)intelligent without any kind of training or education. Pei is making the entirely valid point that no known AI works that way.
Yes, but the answer was:
...which is pretty incoherent. His reference for this appears to be himself here and here. This material is also not very convincing. No doubt critics will find the section on "AI Ethics" in the second link revealing.
Nothing. That's what he should do, not what he knows he should do.