RichardChappell comments on Virtue Ethics for Consequentialists - Less Wrong

33 Post author: Will_Newsome 04 June 2010 04:08PM


Comment author: RichardChappell 05 June 2010 09:06:29PM 15 points [-]

Isn't this just Indirect Consequentialism?

It's worth noting that pretty much every consequentialist since J.S. Mill has stressed the importance of inculcating generally-reliable dispositions / character traits, rather than attempting to explicitly make utility calculations in everyday life. It's certainly a good recommendation, but it seems misleading to characterize this as in any way at odds with the consequentialist tradition.

Comment deleted 06 June 2010 08:27:25PM [-]
Comment author: Mass_Driver 09 June 2010 03:18:16AM 5 points [-]

This is a useful dilemma. What are some of the possible motivators for refusing to become a gangster?

  • You don't really care about saving the world; the only consequence that actually matters to you is being a nice person.

  • You don't trust your conclusion that Operation: Gangsta will save the world; you place so much heuristic faith in virtues that you actually expect any calculation that outputs a recommendation to become a gangster to be fatally flawed.

  • You don't trust your values not to evolve away from saving the world if you become a gangster: being a thug makes you care less about saving the world, so it might be impossible or extremely risky to save the world by thugging out; you might have a career of evil and then just spend the proceeds on casinos, hitmen, and mansions.

Comment author: SilasBarta 11 June 2010 03:30:49PM 3 points [-]

The second and the third are the most convincing reasons, but EY already explained how those follow from using deontology rather than virtue ethics as a heuristic for handling the fact that you are a consequentialist running on corrupt hardware. This calls into question how much insight Will_Newsome has provided with this article.

His point in that article, if you'll recall, is that deontology is consequentialism, just one meta-level up and with the knowledge that your hardware distorts your moral cognition in predictable ways.

Comment author: Jack 09 June 2010 04:29:02AM *  0 points [-]

The problem is that becoming a gangster strikes me, just on pragmatic grounds, as a very bad way to fund saving the world, so all these motivations are hard to evaluate.

Comment author: Mass_Driver 09 June 2010 04:52:20AM *  5 points [-]

Sure, but try to cope with the dilemma as best you can. If you can think of a better example, great! If not, try to imagine a situation where being a gangster would be pragmatic. Maybe you're the godfather's favorite child, recently returned from the military and otherwise unskilled. Maybe you live in a dome on a colony planet that is essentially one big corrupt city, and ordinary entrepreneurship doesn't pay off properly. Maybe you're a member of a despised or even outlawed ethnicity in medieval times, and no one will sit still to listen to your brilliant ideas about how to build better water mills and eradicate plague unless you first establish yourself as a powerful and wealthy fringe figure.

In general, when trying to evaluate an argument that you're initially inclined to disagree with, you should try to place yourself in The Least Convenient Possible World for refuting that argument. That way, if you still manage to refute the argument, you'll at least have learned something. If you stop thinking when the ordinary world doesn't seem to validate a hypothesis that you didn't believe in to begin with, you don't really learn anything.

Comment author: Will_Newsome 09 June 2010 05:25:20AM 0 points [-]

I would do what sounded like the consequentialist thing to do and become a gangster. Not only would I be saving the world, but I'd also be pretty badass if I was doing it right. Rationalists should win when possible, and whatnot. Consequentialism-ism is the key Virtue.

Comment author: Blueberry 09 June 2010 04:23:22PM 0 points [-]

Being badass is a close second.

Comment author: Eneasz 09 June 2010 03:56:57PM 0 points [-]

There isn't much of a dilemma if you assume there are some states worse than death. Eternal torture is less preferable than non-existence. A malicious world of pain and vice is less preferable than a non-existent world. By becoming a malicious, vice-filled person you are moving the world in the direction of being worse than non-existent, and thus are defeating your stated goal. You are doing more to destroy the world than to save it.

Comment deleted 09 June 2010 09:41:21PM [-]
Comment author: Eneasz 09 June 2010 10:14:32PM 1 point [-]

The least convenient possible world is one with superhumanly intelligent AIs that can have complete confidence in their source code, and predict with complete confidence that these means (thuggishness) will in fact lead to those ends (saving the world).

However in that world the world has already been saved (or destroyed) and so this is not relevant. In any relevant world the actor who is resorting to thuggishness to save the world is a human running on hostile hardware, and would be stupid not to take that into consideration.

Comment deleted 10 June 2010 11:55:40AM [-]
Comment author: Eneasz 10 June 2010 03:15:58PM 1 point [-]

I consider the "P" in LCPW to be important. If the agents in question are post-human then it's too late to worry about saving the world. If you still have to save the world, then standard human failure modes do apply.