Strange7 comments on Overcoming the mind-killer - Less Wrong

Post author: woozle 17 March 2010 12:56AM




Comment author: woozle 25 March 2010 10:42:21PM 2 points

Much discussion about "minimization of suffering" etc. ensued from my first response to this comment, but I thought I should reiterate the point I was trying to make:

I propose that the ultimate terminal value of every rational, compassionate human is to minimize suffering.

(Tentative definition: "suffering" is any kind of discomfort over which the subject has no control.)

All other values (from any part of the political continuum) -- "human rights", "justice", "fairness", "morality", "faith", "loyalty", "honor", "patriotism", etc. -- are not rational terminal values.

This isn't to say that they are useless. They serve as a kind of ethical shorthand -- guidelines, rules of thumb, "philosophical first aid": somewhat-reliable predictors of which actions are likely to cause harm (and which are not) -- memes which are effective at reducing harm when people are infected by them. (Hence society often works hard to "sugar-coat" them with simplistic, easily comprehended -- but essentially irrelevant -- justifications, and otherwise encourages their spread.)

Nonetheless, they are not rational terminal values; they are stand-ins.

They also have a price:

  • they do not adapt well to changes in our evolving rational understanding of what causes harm/suffering, so that rules which we now know cause more suffering than benefit are still happily propagating out in the memetic wilderness...
  • any rigid rule (like any tool) can be abused.

...

I seem to have taken this line of thought a bit further than I originally meant to -- so to summarize: I'd really like to hear whether anyone believes there are rational terminal values other than (or which cannot ultimately be reduced to) "minimizing suffering".

Comment author: Strange7 25 March 2010 10:48:11PM 5 points

I propose that the ultimate terminal value of every rational, compassionate human is to minimize suffering.

I disagree. I'll take suffering rather than death any day, thank-you-very-much.

Furthermore, I have reason to believe that, if I were offered the opportunity to instantaneously and painlessly wipe out all life in the universe, many compassionate humans would support my decision not to do so, despite all the suffering which is thereby allowed to continue.