
shminux comments on "Stupid" questions thread - Less Wrong Discussion

40 Post author: gothgirl420666 13 July 2013 02:42AM


Comments (850)


Comment author: shminux 13 July 2013 05:20:38AM 9 points

In what sense can utilitarianism be true or false?

Comment author: [deleted] 13 July 2013 05:48:46AM 1 point

In the sense that we might want to use it or not use it as the driving principle of a superpowerful genie or whatever.

Casting morality as facts that can be true or false is a very convenient model.

Comment author: shminux 13 July 2013 06:46:23AM 1 point

I don't think most people agree that useful = true.

Comment author: [deleted] 13 July 2013 05:24:37PM 2 points

Whoa there. I think we might have a containment failure across an abstraction barrier.

Modelling moral propositions as facts that can be true or false is useful (same as with physical propositions). Then, within that model, utilitarianism is false.

"Utilitarianism is false because it is useful to believe it is false" is a confusion of levels, IMO.

Comment author: shminux 13 July 2013 06:22:30PM 0 points

Modelling moral propositions as facts that can be true or false is useful

Sure, sometimes it is, depending on your goals. For example, if you start a religion, modeling certain moral propositions as true is useful. If you run a country, proclaiming patriotic duty a moral truth is very useful.

In the sense that we might want to use it or not use it as the driving principle of a superpowerful genie or whatever.

I don't see how this answers my question. And certainly not the original question:

What experiences would you anticipate in a world where utilitarianism is true that you wouldn't anticipate in a world where it is false?

Comment author: [deleted] 13 July 2013 06:36:59PM 2 points

Sure, sometimes it is, depending on your goals. For example, if you start a religion, modeling certain moral propositions as true is useful. If you run a country, proclaiming patriotic duty a moral truth is very useful.

I meant model::useful, not memetic::useful.

I don't see how this answers my question. And certainly not the original question.

It doesn't answer the original question. You asked in what sense it could be true or false, and I answered that it being "true" corresponds to it being a good idea to hand it off to a powerful genie, as a proxy test for whether it is the preference structure we would want. I think that does answer your question, albeit with some clarification. Did I misunderstand you?

As for the original question, in a world where utilitarianism were "true", I would expect moral philosophers to make judgments that agreed with it, for my intuitions to find it appealing as opposed to stupid, and so on.

Naturally, this correspondence between "is" facts and "ought" facts is artificial and no more or less justified than, e.g., induction; we think it works.

Comment author: AspiringRationalist 13 July 2013 03:30:50PM -1 points

Not explicitly, but most people tend to believe whatever their evolutionary and cultural adaptations make it useful to believe, and don't think too hard about whether it's actually true.

Comment author: DanielLC 13 July 2013 08:49:28PM 0 points

If we use deontology, we can control the genie. If we use utilitarianism, we can control the world. I'm more interested in the world than the genie.

Comment author: [deleted] 14 July 2013 04:43:30PM 1 point

utilitarianism

Be careful with that word. You seem to be using it to refer to consequentialism, but "utilitarianism" usually refers to a much more specific theory that you would not want to endorse simply because it's consequentialist.

Comment author: [deleted] 14 July 2013 05:04:07AM 0 points

?

What do you mean by utilitarianism?

Comment author: DanielLC 15 July 2013 03:52:53AM 0 points

I mean that the genie makes his decisions based on the consequences of his actions. I guess consequentialism is technically more accurate. According to Wikipedia, utilitarianism is a subset of it, but I'm not really sure what the difference is.

Comment author: [deleted] 16 July 2013 02:24:04AM 2 points

Ok. Yeah, "consequentialism" or "VNM utilitarianism" is usually used for that concept, to distinguish it from the moral theory that says you should make choices consistent with a utility function constructed by some linear aggregation of "welfare" or whatever across all agents.
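The distinction can be sketched concretely (a toy illustration; the agents, welfare numbers, and the equal-weight sum are all invented for the example, not anything from the thread): a VNM-rational agent maximizes *some* utility function over outcomes, while classical total utilitarianism pins that function down to an aggregate of everyone's welfare.

```python
# Toy sketch (all numbers invented) of VNM coherence vs. classical
# utilitarianism: both agents below maximize a utility function, but only
# one aggregates welfare across all agents.

# Hypothetical welfare levels for three agents under two outcomes:
welfare = {
    "outcome_a": [10, 0, 0],  # great for agent 0, nothing for the others
    "outcome_b": [4, 4, 4],   # moderate for everyone
}

def total_utilitarian(outcome):
    """Classical total utilitarianism: equal-weight sum of all welfare."""
    return sum(welfare[outcome])

def selfish_vnm(outcome):
    """Also a perfectly coherent VNM utility function: only agent 0 counts."""
    return welfare[outcome][0]

# Both are consequentialist, yet they choose differently:
print(max(welfare, key=total_utilitarian))  # -> outcome_b (12 vs 10)
print(max(welfare, key=selfish_vnm))        # -> outcome_a (10 vs 4)
```

The point being that consequentialism only demands *a* coherent preference ordering; it doesn't say whose welfare goes into it or with what weights.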

It would be a tragedy to adopt Utilitarianism just because it is consequentialist.

Comment author: DanielLC 16 July 2013 04:40:40AM 1 point

I get consequentialism. It's Utilitarianism that I don't understand.

Comment author: Eugine_Nier 17 July 2013 02:54:24AM 0 points

Minor nitpick: Consequentialism ≠ VNM utilitarianism

Comment author: [deleted] 17 July 2013 04:36:32AM 1 point

Right, they are different. A creative rereading of my post could interpret it as talking about two concepts DanielLC might have meant by "utilitarianism".

Comment author: CoffeeStain 14 July 2013 12:03:27AM 1 point

It seems to me that people who find utilitarianism intuitive do so because they understand its strong mathematical underpinnings. Sort of like how Bayesian networks determine the probability of complex events: Bayes' theorem proves that a probability derived any other way forces a logical contradiction. Probability has to be Bayesian, even if that's hard to demonstrate; it takes more than a few math classes.
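The coherence claim can be made concrete with a tiny numerical check (the probabilities below are made up for illustration): Bayes' theorem is just the requirement that the two factorizations of a joint probability agree, so a posterior computed any other way contradicts the joint somewhere.

```python
# Illustrative numbers only. Bayes' theorem is forced by the identity
#   P(A and B) = P(A) * P(B|A) = P(B) * P(A|B)
# so any posterior derived "some other way" breaks this equality.

p_a = 0.01          # prior, e.g. P(disease)
p_b_given_a = 0.9   # likelihood, e.g. P(positive test | disease)
p_b = 0.05          # marginal, e.g. P(positive test)

# The only posterior consistent with the joint probability:
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # approximately 0.18

# Consistency check: both factorizations yield the same joint.
assert abs(p_a * p_b_given_a - p_b * p_a_given_b) < 1e-12
```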

In that sense, it's as possible for utilitarianism to be false as it is for probability theory (based on Bayesian reasoning) to be false. If you know the math, it's all true by definition, even if some people have arguments against it (or, to be LW-sympathetic, think they do).

Utilitarianism would be false if such arguments existed. Most people try to create them by concocting scenarios in which the results obtained by utilitarian thinking lead to bad moral conclusions. But the claim of utilitarianism is that each time this happens, somebody is doing the math wrong, or else it wouldn't, by definition and maths galore, be the conclusion of utilitarianism.