Rukifellth comments on Why Politics are Important to Less Wrong... - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (96)
Wait, if you're regarding the elimination of war, famine and disease as consolation prizes for creating a wFAI, what are people expecting from a sFAI?
God. Either with or without the ability to bend the currently known laws of physics.
No, really.
Really. That really is what people are expecting of a strong FAI. Compared with us, it will be omniscient, omnipotent, and omnibenevolent. Unlike currently believed-in Gods, there will be no problem of evil because it will remove all evil from the world. It will do what the Epicurean argument demands of any God worthy of the name.
Are you telling me that if a wFAI were capable of eliminating war, famine and disease, it wouldn't be developed first?
Well, I don't take seriously any of these speculations about God-like vs. merely angel-like creations. They're just a distraction from the task of actually building them, which no-one knows how to do anyway.
But still, if a wFAI were capable of eliminating those things, why be picky and try for a sFAI?
Because we have no idea how hard it is to specify either. If, along the way, it turns out to be easy to specify a wFAI and risky to specify a sFAI, then that is the reasonable course to take. Doubly so, since a wFAI would almost certainly be useful in helping specify a sFAI.
Seeing as human values are a minuscule target, though, it seems probable that specifying a wFAI is harder than specifying a sFAI.
"Specify"? What do you mean?
Specifications, à la programming.