# Yvain comments on Rationality Quotes September 2012 - Less Wrong

7 03 September 2012 05:18AM

Comment author: 01 September 2012 02:20:44PM 56 points [-]

Do unto others 20% better than you expect them to do unto you, to correct for subjective error.

-- Linus Pauling

Comment author: 01 September 2012 07:14:46PM 17 points [-]

Citation for this was hard; the closest I got was Etzioni's 1962 *The Hard Way to Peace*, pg. 110. There's also a version in the 1998 *Linus Pauling on Peace: A Scientist Speaks Out on Humanism and World Survival: Writings and Talks by Linus Pauling*; this version goes:

I have made a modern formulation of the Golden Rule: "Do unto others 20 percent better than you would be done by - the 20 percent is to correct for subjective error."

Comment author: 03 September 2012 07:15:29AM 3 points [-]

Did you take "expect" to mean as in prediction, or as in what you would have them do, like the Jesus version?

Comment author: 02 September 2012 07:13:46PM 1 point [-]

How about doing unto others what maximizes total happiness, regardless of what they'd do unto you?

Comment author: 02 September 2012 09:46:29PM 6 points [-]

The former (Pauling's rule) is computationally far more feasible.

Comment author: 13 September 2012 12:34:17PM 2 points [-]

Doing unto others that which causes maximum total happiness leaves you vulnerable to Newcomb problems. You want to do unto others that which logically entails maximum total happiness. Under certain conditions, this is the same as Pauling's recommendation.

Comment author: 13 September 2012 05:53:22PM 0 points [-]

I never mentioned causation. If you find a way to maximize it acausally, do that.

Comment author: 02 September 2012 11:56:56PM *  2 points [-]
Comment author: 03 September 2012 12:13:53AM *  0 points [-]

It's impossible to find a strategy that produces happiness better than trying to produce happiness does, since if you knew of such a strategy, trying to produce happiness would mean following it. If this method is what works best, then in doing what works best you'd be following this method.

Also, linking to TVTropes tends to fall under generalizing from fictional evidence.

Comment author: 03 September 2012 02:40:49AM 1 point [-]

Art imitates life. ;)

And it's not hard to think of real life examples of atrocities "justified" on utilitarian grounds that the rest of the world thinks are anything but justifiable. The Reign of Terror during the French Revolution, for example, is generally regarded as having gone too far.

Comment author: 05 September 2012 04:23:13AM 1 point [-]

Would it help if the link were aimed at the real life section?

Comment author: 05 September 2012 08:33:53PM 1 point [-]

It has been deleted to prevent an edit war.

Comment author: 02 September 2012 09:06:42PM 2 points [-]

It's a nice sentiment, but the optimization problem you suggest is usually intractable.

Comment author: 02 September 2012 10:51:15PM 2 points [-]

It's better to at least attempt it than to just find an easier problem and solve that instead. You might have to rely on intuition and such to get any answer, but you're not going to do well if you just find something easier to optimize.

Comment author: 02 September 2012 11:04:22PM 3 points [-]

Yes, but there's no way a pithy quote is going to solve the problem for you. It might, however, contain a useful heuristic.

Comment author: 02 September 2012 07:45:27PM 2 points [-]

By acting in a way that discourages them from hurting you, and encouraging them to help you, you are playing your part in maximizing total happiness.

Comment author: 02 September 2012 10:50:18PM 0 points [-]

Yeah, but it's not necessarily the ideal way to act. Perhaps you should act generally better than that, or perhaps you should try to amplify it more. Do what you can to find out the optimal way to act. At least pay attention if you find new information. Don't just make a guess and assume you're correct.

Comment author: 03 September 2012 05:21:38AM 0 points [-]

You don't think you should discourage others from hurting you? I think that seems sort of obvious. Now, if you could somehow give a person a strong incentive to help you/ not hurt, while simultaneously granting them a shitload of happiness, that seems ideal. This doesn't really exclude that, it's just on the positive side of doing/ being done unto.

Comment author: 03 September 2012 05:56:32AM 0 points [-]

You should probably discourage others from hurting you. It's just not clear how much.

Comment author: 03 September 2012 06:12:04AM 1 point [-]

As much as possible for the least amount of harm possible and the least amount of wasted time and resources, obviously. Which varies on a case by case basis.

I mean if it was practical, you'd give your friends 2 billion units of happiness, and then after turning the cheek to your enemies, grant them 1.9 billion units of happiness, but living on planet earth, giving you 80% of the crap you gave me seems about right.

Comment author: 04 September 2012 07:57:32AM *  3 points [-]

...living on planet earth, giving you 80% of the crap you gave me seems about right.

Consider the consequences if everyone follows your rule. Assume someone gives you one unit of crap, possibly accidentally. You respond with 0.8 units. (It's hard to measure this precisely, but for the sake of argument let's assume that both of you manage to get it exactly right). He, in turn, responds with a further 0.64 units of crap. You respond to this with 0.512 units.

This is, of course, an infinite geometric series. The end result (over an infinite time period) is that you receive 2 and 7/9 units of crap, while the other person receives 2 and 2/9 units of crap. He receives exactly 80% of the amount that you received, but you received over twice as much as you started out receiving.

If you return x% of the crap you get (for 0 ≤ x < 100), and everyone else follows the same rule, then the total crap you receive for every starting unit of crap is:

$\frac{1}{1-\left( \frac{x}{100}\right)^2 }$

This is clearly minimized at x=0.
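The series above is easy to verify numerically. Here is a minimal sketch (the function name, loop structure, and round count are mine, not from the thread) that simulates both parties returning the same fixed fraction r of the crap they most recently received:

```python
# Numeric check of the retaliation series: both parties return the same
# fixed fraction r of the crap they most recently received.

def crap_totals(r, rounds=200):
    """Return (your_total, their_total), starting from 1 unit sent to you."""
    you, them = 0.0, 0.0
    incoming = 1.0  # the initial unit of crap aimed at you
    for _ in range(rounds):
        you += incoming        # you receive it
        them += incoming * r   # you return fraction r of it
        incoming *= r * r      # they return fraction r of your return
    return you, them

you, them = crap_totals(0.8)
print(you)   # ≈ 2.777...  (2 and 7/9, matching 1 / (1 - 0.8**2))
print(them)  # ≈ 2.222...  (2 and 2/9, exactly 80% of your total)
```

With r = 0.8 this reproduces the closed form 1/(1 − (x/100)²) from above, and setting r = 0 confirms that never retaliating minimizes the total you receive.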

Comment author: 04 September 2012 05:13:38PM 3 points [-]

Alternatively: he could notice that he gave you 1 unit of crap and assume the 0.8 units of crap you gave him is an equal penalty.

If someone yells at you, you're likely to respond - but if someone yells at you because you just pushed them, you're less likely to respond.

Comment author: 04 September 2012 08:01:29AM 0 points [-]

Or he could know I was going to give him the .512 units, from prior experience, and not give .64, which is the whole point.

Comment author: 04 September 2012 08:06:29AM 1 point [-]

That assumes that he is following a different rule from the rule that you are following. Does knowing that he will give you the 0.64 units prevent you from giving him the 0.8 units?

Comment author: 03 September 2012 08:43:11AM 0 points [-]

Not necessarily. If I horribly torture Jim because Jim stepped on my toes, then I am not maximizing total happiness; the unhappiness given to Jim by the torture outweighs the unhappiness in me that is prevented by having no-one step on my toes.

Comment author: 03 September 2012 09:21:10AM 4 points [-]

That's a lot of effort and pain to prevent someone stepping on your toes.

Also, I'm not sure that'd be a terribly effective way to prevent harm to yourself. I mean, to the extent possible, once everyone knows you tortured Jim, people will be scared shitless to step on your toes, but Jim and Jim's family are very likely to murder you, or at least sue you for all your money and put you in jail for a long time.

Comment author: 04 September 2012 08:04:58AM 1 point [-]

You are correct; it is not terribly effective. However, any disproportionate response to a minor, or even an imagined, slight will reduce total happiness even while discouraging others from hurting me.

Comment author: 04 September 2012 08:29:22AM 2 points [-]

No. I just told you. Sometimes a disproportionate response encourages other people to hurt you. That's actually part of the rule.

Comment author: 03 September 2012 07:53:58AM 0 points [-]

How about doing unto others what maximizes total happiness, regardless of what they'd do unto you?

You may do that if you must, I recommend against it.

Comment author: 03 September 2012 05:38:39PM 0 points [-]

Why do you recommend against it? Do you have a more complicated utility function?

Comment author: 04 September 2012 02:15:56PM 2 points [-]

Most human utility functions give their own happiness more weight than others'. If you take into account that humans increase the happiness of others because it makes them happy, you could even say that human utility functions only care about the happiness of their corresponding humans - but that is close to a tautology ("the utility function cares about the utility of the agent only").