Will_Newsome comments on Rationality Quotes: June 2011 - Less Wrong

4 Post author: Oscar_Cunningham 01 June 2011 08:17AM



Comment author: Will_Newsome 14 June 2011 01:36:47AM 0 points [-]

consequentialism is correct.

What do you mean by this? I know it doesn't mean that humans should generally use consequentialist reasoning, for example.

Comment author: Will_Sawin 14 June 2011 01:40:50AM 0 points [-]

It means that the right way to come up with deontological rules for humans is by thinking of them in the framework discussed in that post.

Comment author: Will_Newsome 14 June 2011 01:45:50AM *  0 points [-]

Okay, that and your belief that rule-utilitarianism isn't consequentialism lead me to think that your version of consequentialism is roughly "if you're attempting to be an FAI and you're not doing lots of multiplication then you're doing it wrong". Too far off?

Comment author: nshepperd 14 June 2011 05:19:19AM 0 points [-]

Instrumental vs. terminal goals. Consequentialism is the ideal, but we can't implement it, so we have to approximate it with deontological rules due to the limitations of our brains. The rules don't get their moral authority from nowhere; they depend on being useful for reaching the actual goal. Or: the only reason we follow the rules is that we know we'll get a worse outcome if we don't.

Comment author: Will_Sawin 14 June 2011 02:00:08AM 0 points [-]

It's the difference between a priori rules and a posteriori rules, I guess?

I'm all for a posteriori rules, but not a priori rules.