In a recent bonus episode of the Bayesian Conspiracy podcast, Eneasz Brodski shared a thought experiment that caused no small amount of anguish. In the hypothetical, some eccentric but trustworthy entity is offering to give you an escalating amount of money for your fingers, starting at $10,000 for the first one and increasing 10x per finger up to $10 trillion for all of them.[1] On encountering this thought experiment, Eneasz felt (not without justification) that he mostly valued his manual dexterity more than wealth. Then, two acquaintances pointed out that one could use the $10 trillion to do a lot of good, and Eneasz proceeded to feel terrible about his decision.
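For concreteness, here is a minimal sketch of how that escalation plays out under my reading of the setup (the episode gives only the $10,000 starting price and the 10x-per-finger multiplier, so whether the $10 trillion is the price of the tenth finger alone or the total for all ten is my guess):

```python
# Illustrative only: assumes each finger's price is 10x the previous one's,
# starting at $10,000 for the first finger.
offers = [10_000 * 10**i for i in range(10)]  # finger 1 through finger 10

for finger, amount in enumerate(offers, start=1):
    print(f"finger {finger:>2}: ${amount:,}")

# The tenth finger alone comes out to $10 trillion; if the offers stack,
# giving up all ten is worth a bit over $11 trillion.
print(f"all ten, if the offers stack: ${sum(offers):,}")
```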

I had several responses to this episode, but today I'm going to focus on one of them: the difference between cost and sacrifice.

How Ayn Rand Made Me a Better Altruist

But first, a personal anecdote. I was raised Catholic, and like the good Catholic boy that I was, I once viewed altruism through the lens of personal sacrifice. For the uninitiated, Catholic doctrine places a strong emphasis on this notion of sacrifice - an act of self-abnegation which places The Good firmly above one's own wants or needs. I felt obligated to help others because it was the Right Thing to Do, and I accepted that being a Good Person meant making personal sacrifices for the good of others, regardless of my own feelings. I divided my options into "selfish" and "selfless" categories, and felt guilty when choosing the former. Even as I grew older and my faith in Catholicism began to wane, this sense of moral duty persisted. It was a source of considerable burden and struggle for me, made worse by the fact that the associated cultural baggage was so deeply ingrained as to be largely invisible to me.

Then, in a fittingly kabbalistic manner, Atlas Shrugged flipped my world upside down.[2] 

Ayn Rand, you see, did not believe in sacrifice. In her philosophy, the only real moral duty is the duty to oneself and one's own principles. She happened to hold a great many other convictions about what those principles ought to be, some of which I now dispute; but in this, I believe, she was wholly correct.

My teenage self, at least, found this perspective incredibly freeing. (Perhaps a bit too freeing, as I've always been the sort of person who enjoys being smugly right about things, and taking the word "selfish" as a compliment for a couple years did not do my social life any favors.) But I emerged from this phase like the titular unburdened Titan himself, having thoroughly abandoned all thought of dutifully adhering to any principles besides my own.

Which of course led me to wonder, for perhaps the first time: What are my principles? If my morals are not to be guided by God nor by the expectations of others, but by my own reflectively endorsed desires, then what do I actually want?

It turns out that I want to help people. I want to ease suffering and promote wellbeing; I want to create things people value; I want to surround myself with a thriving civilization filled with flourishing people.

In abandoning the values that had been imposed on me, I discovered that my own values included a strong preference for the wellbeing of others. And that makes all the difference.

Cost vs Sacrifice

Let's return to the ten-finger demon. We'll set aside, for now, the argument that the money from selling fingers has lots of selfish benefits. That's not what this post is about. Let's focus specifically on the opportunity to Do Good, and what it means for us.

Here's the thing about thought experiments. They're not supposed to be traps for the unwary. In the best case, they are ways to notice problems in our thinking by making choices stark and binary. If a decision posed in a thought experiment makes you feel utterly miserable, that is a warning sign.

In the podcast, one person says something to the effect of: "[I don't like it], but if you really pressed me, I would make the [painful] sacrifice so that I could use the money to help others."

I applaud the sentiment, but this is the wrong way to think about the problem.

Buying something more valuable with something less valuable should never feel like a terrible deal. If it does, something is wrong.

If enough money to end world hunger, lift millions out of poverty, delay global warming, fund a bunch of medical research, outspend the lobbying efforts of multibillion-dollar companies, and do a half-dozen similar things seems more valuable to you than manual dexterity, then you may have discovered something interesting about your preferences.

If, however, your instinct is to keep your fingers and feel guilty about it, then perhaps you should ask: whence comes this guilt? Am I failing to live up to a standard I have set for myself? Or am I allowing the standards set by others to override my own preferences? 

If you value $10 trillion worth of improvements to the world more than you value ten fingers, then this transaction is not a sacrifice. It is a cost you are paying to get more of what you want.

If, on reflection, you actually value your fingers more than the leverage $10 trillion buys you, then you shouldn't pay that cost.

Own Your Values

It's a mistake to do as I once did, and divide the outcomes you are capable of achieving into buckets of "selfish" and "selfless", especially if doing so makes you inclined to always let one bucket win at the expense of the other. The universe does not distinguish between selfish goals and selfless ones.

When I was a Reliability Engineer, I donated some of my money to the Against Malaria Foundation. I did not donate everything and decide to live as a pauper. Setting aside how that would have made me worse at my actual job (and at making money to donate), I didn't do that because I don't want to live that way.

I'd take the $10 trillion, even if I couldn't use it to buy prosthetics or live in luxury or whatever, because $10 trillion is a massive amount of leverage that I likely can't match any other way. With it, I could steer the world in ways that, according to my own values, are better than having functional hands. It's a slam dunk. But this is not, in my view, taking a "selfless" option over a "selfish" one. I just want the leverage more than I want my hands.

For the glowfic fans out there, Alicorn's Bella characters embodied this philosophy with their Three Questions: What do I want? What do I have? How can I use what I have to get what I want?

Don't ask, "Am I a bad person?" Instead, ask "What do I want to achieve?" and make it so. The Replacing Guilt series has more to say on this topic as well.[3]

I implore all altruists, non-altruists, and aspiring altruists alike to make your choices and own them. Leave the hand-wringing to those with all their fingers.

  1. ^

    It was further stipulated that this would not cause inflation or have some other horrible monkey's paw effect; it's just $10 trillion worth of anything money can buy you.

  2. ^

    I don't claim it's a perfect book, but it does contain messages that some people - like young Joe - badly need to hear, and that less emphatic sources often fail to convey.

  3. ^

    For those wondering, I found these posts valuable well before I started working for the author.

Comments

Buying something more valuable with something less valuable should never feel like a terrible deal. If it does, something is wrong.

It's completely normal to feel terrible about being forced to choose only one of two things you value very highly. Human emotions don't map onto utility comparisons in the way you're suggesting.

True, it can always hurt. I note, however, that's not quite the same thing as feeling like you made a terrible deal, and also that feeling pain at the loss of a treasured thing is not the same as feeling guilty about the choice. 

Many deals in the real world have a lot of positive surplus. Most deals I would like to make have positive surplus. I would still make a deal to get something more valuable with something less valuable, but if the margins are very thin (or approaching zero), then I wouldn't like the deal even as I make it. I can feel like it's a terrible deal because the deals I want would have a lot more surplus to them, ideally involving a less painful cost.

Not the most important response to this essay but "Leave the hand-wringing to those with all their fingers" made me laugh. Thanks for the smile.

dr_s

I think there's one fundamental problem here, which is that not everything is fungible, and thus not everything manages to actually comfortably exist on the same axis of values. Fingers are not fungible. At the current state of technology, once severed, they're gone. In some sense, you could say, that's a limited loss. But for you, as a human being, it may as well be infinite. You just lost something you'll never ever have back. All the trillions and quadrillions of dollars in the world wouldn't be enough to buy it back if you regretted your choice. And thus, while in some sense its value must be limited (it's just the fingers of one single human being after all, no? How many of those get lost every day simply because it would have been a bit more expensive to equip the workshop with a circular saw that has a proper safety stop?), in some other sense the value of your fingers to you is infinite, completely beyond money.

Bit of an aside - but I think this is part of what causes such a visceral reaction in some people to the idea of sex reassignment surgery, which then feeds into transphobic rationalizations and ideologies. The concept of genuinely wanting to get rid of a part of your body that you can't possibly get back feels so fundamentally wrong on some level to many people that, for them, it pretty much seals the deal on its own: you must either be insane or have been manipulated by some kind of evil outside force.

Lots of things have a value that we might call "infinite" according to this argument. Everything from a human life to reading a book spoiler counts as "something you cannot buy back if you regret it later." 

Even if we choose to label some things as "non-fungible", we must often weigh them against each other nevertheless. I claim, not that the choice never hurts, but that there is no need to feel guilty about it. 

dr_s

Well, yes, it's true, and obviously those things do not necessarily all have genuine infinite value. I think what this really means in practice is not that all non-fungible things have infinite value, but that because they are non-fungible, most judgements involving them are not as easy or straightforward as simple numerical comparisons. Preferences end up being expressed anyway, but just because practical needs force a square peg into a round hole doesn't make it fit any better. I think this in practice manifests in high rates of hesitation or regret for decisions involving such things, and the general difficulty of really squaring decisions like these. We can agree in one sense that several trillion dollars in charity are a much greater good than someone not having their fingers cut off, and yet we generally wouldn't call that person "evil" for picking the latter option, because we understand perfectly how, to someone, their own fingers might feel more valuable. If we were talking about fungible goods we'd feel very differently. Replace cutting off someone's fingers with, e.g., demolishing their house.

I think the whole concept of labeling goods as "fungible" or "non-fungible" is a category error. Everything trades off against something. 

Either you value your fingers more than what [some specific amount of money] will buy you or you don't. If you value your fingers more, then keeping them is the right call for you. 

I agree with your first paragraph. I think the second is off-topic in a way that encourages readers, and possibly you yourself, to get mind-killed. Couldn’t you use a less controversial topic as an example? (Very nearly any topic is less controversial.) And did you really need to compound the problem by assigning motivations to other people whom you disagree with? That’s a really good way to start a flame war.

dr_s

I think it's a very visible example that right now is particularly often brought up. I'm not saying it's all there is to it, but I think the fundamental visceral reaction to the very idea of self-mutilation is an important and often overlooked element of why some people would be put off by the concept. I actually think it makes where the whole thing comes from a lot more understandable than the generic "well they're just bigoted and evil" stuff people come up with in extremely partisan arguments on the topic. These sorts of psychological processes - the fact that we may first have a gut-level reaction, and only later rationalize it by constructing an ideological framework to justify why the things that repulse us are evil - are very well documented, and happen all over the place. That does not mean everyone who disagrees with me does so because of it (nor that everyone who agrees doesn't do it!), but it would be foolish to just pretend this never happens because it sounds a bit offensive to bring up in a debate. The entire concept of rationality is based around the awareness that yeah, we're constantly affected by cognitive biases like these, and separating the wheat from the chaff is hard work.

And by the way, it's an excellent example of the reverse too. Just like people who are not dysphoric are put off by mutilation, people who are dysphoric are put off by the feeling of having something grafted onto their bodies that doesn't belong. Which is sort of the flip side of it. Essentially we tend to have a mental image of our bodies and a strong aversion to that shape being altered or disturbed in some way (which makes all kinds of sense evolutionarily, really). Ironically enough, it's probably via the mechanism of empathy that someone can see someone else do something to their body that feels "wrong" and cringe/be grossed out on their behalf (if you think trans issues are controversial, consider the reactions some people can have even to things like piercings in particularly sensitive places).

I think another problem with the hypothetical is scope insensitivity. I read $10 trillion and feel no difference from $10 million or less. And it is unclear whether $10 million is worth 10 of my fingers, while intellectually I think $10 trillion is supposed to be worth it. Hence the discomfort.

I think money is a relatively neat value-holder here, because we can map people and their options onto it.

I don't intuitively know how much money $1 million is, but I know a guy who is a millionaire, and I more or less know what he is capable of buying for himself or spending on charity.

I don't intuitively grasp how much $1 billion is, but we have examples of billionaires and their actions to guesstimate what that means.

Similarly, I have never lost a finger, but I can practice using one hand, or just a few fingers of one hand, to do everyday tasks and see how much worse it is. I know several people with 1-2 fingers missing, and they do not seem particularly inconvenienced; some even play guitar! I know a guy with just one hand (which I think is much worse than missing all the fingers on one hand), and he is limited in some things but does fine. So it seems even missing half of your fingers is not that bad if you have a decent middle-class career and wealth, and it would probably be less of a problem for a millionaire.

Even based on that imprecise financial intuition, I can guess it would not be worth it to sacrifice fingers for $1 million (because it's not that much money in the end), it would be worth it for $10 million (because it would set you up for life), and if I'm going for $1 billion I might just go all the way to tens of trillions.

No idea whether I'd really sacrifice all 10 of my fingers to improve the world by that much, especially if we add the stipulation that I can't use any of the $10,000,000,000,000 to pay someone to do all of the things I use my fingers for ( ͡° ͜ʖ ͡°). For me, I am quite divided on it, and it is an example of a pretty clean, crisp distinction between selfish and selfless values. If I kept my fingers, I would feel guilty, because I would be giving up the altruism I value a lot (not just because people tell me to), and the emotion that would result from that loss of value would be guilt, even though I self-consistently value my fingers more. Conversely, if I did give up my fingers for the $10,000,000,000,000, I would feel terrible for different reasons ( ͡° ͜ʖ ͡°), even though I valued the altruism more.

Of course, given this decision I would not keep all of my fingers in any case, as long as I could choose which ones to lose. $100,000,000 is well worth the five fingers on my right (nondominant) hand. My life would be better purely selfishly, given that I would never have to work again, and could still write, type, and ( ͡° ͜ʖ ͡°).