Will_Sawin comments on Rationality Quotes: June 2011 - Less Wrong

Post author: Oscar_Cunningham 01 June 2011 08:17AM


Comment author: Will_Sawin 14 June 2011 12:40:33AM *  0 points

Is fun theory not relevant to Less Wrongers?

What is the difference between fun theory and political theory?

ETA: Did you edit your comment? I didn't see some of the stuff at first.

...his ethical theory doesn't really fit neatly into the deontological/consequentialist dichotomy anyway. Arguably, his ethics/political theory amounts to consequentialism with "side-constraints" (that can even be violated in extreme circumstances). It doesn't seem to be any less consequentialist than, say, rule-utilitarianism.

but it's still not consequentialist, whereas consequentialism is correct.

I think so, but I also think the Less Wrong ethical doctrine is wrong. At this point I think non-cognitivism is more probable than consequentialism (ask me next week and I might not, I go back and forth on the subject).

I still believe in consequentialism, as (presumably) do most people on Less Wrong.

Comment author: Will_Newsome 14 June 2011 01:36:47AM 0 points

consequentialism is correct.

What do you mean by this? I know it doesn't mean that humans should generally use consequentialist reasoning, for example.

Comment author: Will_Sawin 14 June 2011 01:40:50AM 0 points

It means that the right way to come up with deontological rules for humans is by thinking of them in the framework discussed in that post.

Comment author: Will_Newsome 14 June 2011 01:45:50AM *  0 points

Okay, that and your belief that rule-utilitarianism isn't consequentialism lead me to think that your version of consequentialism is roughly "if you're attempting to be an FAI and you're not doing lots of multiplication then you're doing it wrong". Too far off?

Comment author: nshepperd 14 June 2011 05:19:19AM 0 points

Instrumental vs. terminal goals. Consequentialism is the ideal, but we can't implement it, so we have to approximate it with deontological rules due to the limitations of our brains. The rules don't get their moral authority from nowhere; they depend on being useful for reaching the actual goal. Or: the only reason we follow the rules is that we know we'll get a worse outcome if we don't.

Comment author: Will_Sawin 14 June 2011 02:00:08AM 0 points

It's the difference between a priori rules and a posteriori rules, I guess?

I'm all for a posteriori rules, but not a priori rules.