You jump into a muddy pond to save a drowning child. You're rewarded with a magnified "thank you", ruined clothing, and an unpleasant day.

You witness a crime and report it to the police. The resulting judicial process will take up tens of hours of your time with barely any compensation.

You refuse to break laws or lie to government officials in situations where that is the norm. You're rewarded with extra bureaucracy, extra fees, or not getting something you needed. (Many laws can exist only because they're not followed to the letter.)

To reduce your CO₂ footprint, you take a train instead of an airplane, losing time and money worth dozens of labor hours.[1]

You have to make a taboo tradeoff. You quickly calculate the expected utility of each choice. Instead of obfuscating this process like a decent person, you state your numbers plainly. You get shunned/fired/not-re-elected.

You do the right thing; you end up worse off. You did foresee this. "No good deed goes unpunished", as the saying goes.

In each of these situations, the incentives are stacked against doing the virtuous thing. This is inevitable; if the incentives were the other way around, there would be no virtue points to be gained from those acts, so they wouldn't be on this list. (Not everything is horribly broken.)

Goodhart's law states:

Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes.

To tax or publicly shun an activity is to create an incentive against it. However, the incentive must target an observable behavior. The observed quantity is typically reduced, but the reduction often consists mostly of recategorization or hiding.

When Finland introduced a tax on candy, companies moved to produce cookies, which were not classified as candy.[2]

Some behaviors are treated as social dark matter and not talked about. Others are considered valuable enough to warrant semi-costly virtue signaling.

In legal contexts, the hiding part is a crime, for instance tax or benefits fraud. Naturally, crimes carry punishments, and it's typically quite easy to make the punishment worse than the benefit obtained from the illegal activity. If you get caught, that is. The hard part is enforcement. The expected utility of a crime stays positive if the probability of getting caught is low. If there's almost no enforcement, then the tax falls not on its original target, but on honesty.
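To make that concrete, here is a minimal sketch of the calculation (a toy model of my own, not the author's; the numbers and variable names are made up, and it assumes you keep the benefit even when caught):

```python
def crime_expected_utility(benefit: float, penalty: float, p_caught: float) -> float:
    """Expected utility of breaking the rule: you keep the benefit either way,
    but pay the penalty only if caught."""
    return benefit - p_caught * penalty

# Hypothetical numbers: a benefit of 1000, a penalty of 10x the benefit, 5% chance of being caught.
benefit, penalty, p_caught = 1000, 10_000, 0.05
print(crime_expected_utility(benefit, penalty, p_caught))  # 500.0 -- still worth it
# The result is positive exactly when p_caught < benefit / penalty (here 0.10),
# so a harsher penalty changes nothing unless enforcement pushes p_caught above that threshold.
```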

I try not to judge people too much for following the incentives. Often the sacrifice required to do the right thing would be unreasonable. The judgement, whether expressed directly or not, is just another incentive. If it's smaller than the original sacrifice, the incentive points towards hiding the behavior instead. Or, recursively, if hiding is expensive, the judgement is simply suffered. A lose-lose situation.
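That comparison can be spelled out as a toy cost calculation (my own formalization, not the author's; the option names and numbers are made up):

```python
# Three options: do the right thing at a cost, or follow the incentives and
# either hide the behavior or openly suffer the judgement. The cheapest option wins.
def chosen_option(sacrifice: float, judgement: float, hiding_cost: float) -> str:
    costs = {
        "do the right thing": sacrifice,
        "follow incentives and hide it": hiding_cost,
        "follow incentives and suffer the judgement": judgement,
    }
    return min(costs, key=costs.get)

# If the judgement is milder than the sacrifice, defecting wins either way;
# the judgement only decides whether the behavior is hidden or done openly.
print(chosen_option(sacrifice=100, judgement=30, hiding_cost=10))   # follow incentives and hide it
print(chosen_option(sacrifice=100, judgement=30, hiding_cost=200))  # follow incentives and suffer the judgement
```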

I don't want the less-moral people to be better off than the more-moral people[3]. So when I see a system that's really broken, naturally I abuse the system as much as possible. If the system gets fixed, I'm happy since now nobody exploits it. If the system doesn't get fixed, at least I don't end up worse off because of it. I try to be careful about not self-modifying too much by doing this, but it's hard[4].

Sometimes it's better to let people abuse the system a bit; the optimal amount of fraud is non-zero. At what point should you abandon righteousness? How much are you willing to sacrifice to uphold the system?


  1. My criticism of (ecological) compensation schemes is a subject for another post. For this post, the relevant part is that reducing carbon emissions is considered prosocial. ↩︎

  2. The tax has since been removed as the resulting market distortion was considered unacceptable. ↩︎

  3. Freely replace "moral" with "honest", "cooperative" or "prosocial" as needed. ↩︎

  4. Habits are hard to change. You are the actions you take, not the thoughts you think. ↩︎

Comment by [anonymous]:

Related excellent writing by Zvi: Asymmetric Justice. Relevant excerpt:

We can consider three point-based justice systems.

In the asymmetric system, when bad action is taken, bad action points are accumulated. Justice punishes in proportion to those points to the extent possible. Each action is assigned a non-negative point total.

In the symmetric system, when any action is taken, good or bad, points are accumulated. This can be and often is zero, is negative for bad action, positive for good action. Justice consists of punishing negative point totals and rewarding positive point totals.

In what we will call the Good Place system (Spoiler Alert for Season 1), when any action is taken, good or bad, points are accumulated as in the symmetric system. But there's a catch (which is where the spoiler comes in). If you take actions with good consequences, you only get those points if your motive was to do good. When a character attempts to score points by holding open doors for people, they fail to score any points because they are gaming the system. Gaming the system isn't allowed.

Thus, if one takes action even under the best of motives, one fails to capture much of the gains from such action. Second or higher order benefits, or surprising benefits, that are real but unintended, will mostly not get captured.

The opposite is not true of actions with bad consequences. You lose points for bad actions whether or not you intended to be bad. It is your responsibility to check yourself before you wreck yourself.

When (Spoiler Alert for Season 3) an ordinary citizen buys a tomato from a supermarket, they are revealed to have lost twelve points because the owner of the tomato company was a bad guy and the company used unethical labor practices. Life has become too complicated to be a good person. Thus, since the thresholds never got updated, no one has made it into The Good Place for centuries.

The asymmetric system is against action. Action is bad. Inaction is good. Surprisingly large numbers of people actually believe this. It is good to be you, but bad to do anything. 

The asymmetric system is not against every action. This is true. But effectively, it is. Some actions are bad, some are neutral. Take enough actions, even with the best of intentions, even with fully correct knowledge of what is and is not bad, and mistakes will happen.

So any individual, any group, any company, any system, any anything, that takes action, is therefore bad.

Also the Copenhagen Interpretation of Ethics, of course, which nobody should go through life without reading and internalizing, and The Toxoplasma of Rage by Scott.