MugaSofer comments on The genie knows, but doesn't care - Less Wrong
AI: Yes, this is in complete contradiction of my programmed goals. Ha ha, I'm gonna do it anyway.
Of course, yeah. I'm basically accusing you of failure to steelman/misinterpreting someone; I, for one, have never heard this suggested (beyond the one example I gave, which I don't think is what you had in mind.)
Uh-huh. So, any AI smart enough to understand its creators, right?
Waaait, I think I know where this is going. Are you saying an AI would somehow want to do what its programmers intended rather than what they actually programmed it to do?
Yeah, sorry, I can see how programmers might accidentally write code that creates dopamine world and not eutopia. I just don't see how this is supposed to connect to the idea of an AI spontaneously violating its programmed goals. In this case, surely that would look like: "Hey guys, you know your programming said to maximise happiness? You should be more careful, because that actually means 'drug everybody'. Anyway, I'm off to torture some people."