Perplexed comments on Ben Goertzel: The Singularity Institute's Scary Idea (and Why I Don't Buy It) - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Utility for what purpose? If we're talking about, say, a paperclip maximizer, then its utility for human beings will be measured in paperclip production.
It won't be as efficient at producing paperclips as specialized paperclip-production machines are.
Yes, but you're unlikely to be happy with it: read the Sequences, or at least the parts that deal with reasoning, the use of words, and inferential distances. (For now, at least, you can skip the quantum mechanics, AI, and Fun Theory parts.)
At minimum, this will help you understand LW's standards for basic reasoning, and how much higher a bar they are than what constitutes "reasoning" pretty much anywhere else.
If you're reasoning as well as you say, then the material will be a breeze, and you'll be able to make your arguments in terms that the rest of us can understand. Or, if you're not, then you'll probably learn that along the way.