
Kaj_Sotala comments on Are consequentialism and deontology not even wrong? - Less Wrong Discussion

15 Post author: Kaj_Sotala 02 June 2015 07:49AM




Comment author: Kaj_Sotala 08 July 2015 06:23:05PM 1 point [-]

I am not sure what you mean by "not even wrong".

I didn't answer this at first because I had difficulties putting my intuition to words. But here's a stab at it:

Suppose that at first, people believe that there is a God who has defined some things as sinful and others as non-sinful. And they go about asking questions like "is brushing my teeth sinful or not?", and this makes sense given their general set of beliefs. And a theologian could give a "yes" or "no" answer, which could be logically justified if you assumed some specific theology.

Then they learn that there is actually no God, but they still go about asking "is brushing my teeth sinful or not". And this no longer makes sense even as a question, because the definition of "sin" came from a specific theology which assumed the existence of God. And then a claim like "here's a theory which shows that brushing teeth is always sinful" would not even be wrong, because it wasn't making claims about any coherent concept.

Now consequentialists might say that "consequentialism is the right morality everyone should follow", but under this interpretation that claim would be no different from saying that "consequentialism is the right theory about what is sinful or not".

Comment author: Squark 09 July 2015 06:32:21PM 1 point [-]

Hi Kaj, thx for replying!

This makes sense as a criticism of versions of consequentialism which assume a "cosmic objective utility function". I prefer the version of consequentialism in which the utility function is a property of your brain (a representation of your preferences). In this version there is no "right morality everyone should follow", since each person has a slightly different utility function. Moreover, I clearly want other people to maximize my own utility function (so that it gets maximized), but this is the only sense in which that is "right". Also, in contexts in which the difference between our utility functions is negligible (or we have agreed by bargaining to use an average utility function of some sort), we sort of have a single morality that we follow. But there is no "cosmic should" here; we're just doing the thing that is rational given our preferences.

Comment author: UtilonMaximizer 09 July 2015 07:12:08PM *  1 point [-]

The "preferences version" of consequentialism is also what I prefer. I've never understood the (unfortunately much more common) "cosmic objective utility function" consequentialism which, among other things, doesn't account for nearly enough of the variability in preferences among different types of brains.