Vladimir_Nesov comments on I think I've found the source of what's been bugging me about "Friendly AI" - Less Wrong Discussion
You are viewing a comment permalink. View the original post to see all comments and the full post content.
Comments (31)
Edit: It's now fixed.
In Eliezer's quote, "genies for which no wish is safe" are those that kill you irrespective of what wish you made, while here it's written as if you might be referring to AIs that are safe even if you make no wish, which is different. This should be paraphrased for clarity, whatever the intended meaning.
Or maybe the parenthetical refers only to "doomsday machine".
That's how I read it. The wording could be clearer.
This is the intended reading. Edited for clarity.
Well, there are systems that simply can't process your wishes (AIXI, for instance), but which you can nonetheless use to, e.g., cure cancer if you wish. You could train such a system to do what you tell it to, but all it is looking for is a sequence of actions that leads to a reward button press, which is terminal: it assigns no value to the button being held down. Similarly, there is a system, the screwdriver, which I can use to unscrew screws if I wish, but it's not a screw-unscrewing genie.