timtyler comments on No One Knows Stuff - Less Wrong
You are viewing a comment permalink. View the original post to see all comments and the full post content.
I would still be prepared to call an agent "utilitarian" if it operated via maximising expected utility - even if its expectations turned out to be completely wrong, and its actions were far from those that would have actually maximised utility.
Humans are often a bit like this. They "expect" that hoarding calories is a good idea - and so that is what they do. In practice, this often turns out to be not so smart. However, this flaw doesn't make humans less utilitarian in my book - rather, they have some bad priors - and wired-in ones, at that, which are tricky to update.
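The distinction can be made concrete with a toy sketch (my own illustration, not from the comment: the action names, payoffs, and probabilities are all invented). The agent below always maximizes expected utility under its own beliefs - so it is "utilitarian" in the sense above - yet a bad prior makes it pick an action that fails to maximize actual utility:

```python
# Toy model: an expected-utility maximizer with possibly wrong beliefs.
# All numbers here are made up purely for illustration.

def expected_utility_choice(actions, beliefs, utility):
    """Pick the action with highest expected utility under the agent's beliefs.

    beliefs: dict mapping outcome -> subjective probability
    utility: dict mapping (action, outcome) -> payoff
    """
    def eu(action):
        return sum(p * utility[(action, outcome)]
                   for outcome, p in beliefs.items())
    return max(actions, key=eu)

actions = ["hoard_calories", "stay_lean"]
utility = {
    ("hoard_calories", "famine"): 10, ("hoard_calories", "abundance"): -5,
    ("stay_lean", "famine"): -10,     ("stay_lean", "abundance"): 5,
}

# Wired-in prior: famine is likely (true in the ancestral environment).
bad_prior = {"famine": 0.9, "abundance": 0.1}
# The modern environment: famine is rare.
true_odds = {"famine": 0.01, "abundance": 0.99}

# Under the bad prior, hoarding looks best (EU 8.5 vs -8.5) ...
print(expected_utility_choice(actions, bad_prior, utility))
# ... but under the true odds, the very same decision rule picks the
# other action (EU 4.85 vs -4.85).
print(expected_utility_choice(actions, true_odds, utility))
```

The decision rule never changes - only the beliefs do - which is the point: the agent stays an expected-utility maximizer even while its choices are far from actually utility-maximizing.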