Imagine you had magical healing powers. Sitting quietly with someone and holding their hand, you could restore them to health. While this would be a wonderful ability to have, it would also be a hard one: any time you spent on something other than healing people would mean unnecessary suffering. How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

But you already have these powers. Through the magic of effective charity you can donate money to help people right now. The tradeoff remains: time you give yourself when you could be working means money you don't earn, which then can't go to help the people who would most benefit from it.

(I don't think this means you should try for complete selflessness; you need to balance your needs against others'. But the balance should probably be a lot further towards others' than it currently is.)

Update 2012-08-12: this is a response to hearing people offline saying that if they had magical "help other people" powers then they should spend lots of time using them, without having considered that they already have non-magical "help other people" powers.

I also posted this on my blog


This post is too low on interesting or useful new content, relative to reiteration of a standard ideological view about altruism. It would be different if the post described an interesting new argument, or a new more efficient means of helping people.

How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

Really, this is just a particular form of "what if you had super-high productivity, orders of magnitude higher than almost anyone else on Earth." If I had these powers, and knew they couldn't be duplicated through research and study, I would sell them at high prices. To extract as much of the surplus as possible, I would use the "financial aid" price-discrimination system employed by elite universities, demanding tax returns and other information to determine ability to pay, and then extracting a large portion of that potential in exchange for healing.

If one has to "lay on hands" only briefly, the expected annual revenue (provided one didn't get kidnapped or imprisoned) would be in the trillions of dollars. Even if a healing took a couple of hours, revenue would be in the tens of billions (driven primarily by the super-rich).
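
As a rough sketch of that arithmetic (every number below is an illustrative assumption chosen only to show how throughput times price produces the two orders of magnitude, not a figure from the comment):

```python
# Back-of-the-envelope revenue arithmetic for a single magical healer.
# All parameters are illustrative assumptions.

HOURS_PER_YEAR = 85 * 50  # roughly 70-100 hour weeks, ~50 weeks a year

def annual_revenue(minutes_per_healing: float, avg_price: float) -> float:
    """Revenue if all working hours go to healing at a given average price."""
    healings_per_year = HOURS_PER_YEAR * 60 / minutes_per_healing
    return healings_per_year * avg_price

# Brief "lay on hands" (~5 minutes each), price-discriminating aggressively
# among the ultra-wealthy: assume ~$30M extracted per healing on average.
print(f"Brief healings:    ${annual_revenue(5, 30e6):,.0f}/year")    # ~$1.5 trillion

# Two-hour healings sold mainly to the super-rich at ~$10M each.
print(f"Two-hour healings: ${annual_revenue(120, 10e6):,.0f}/year")  # ~$21 billion
```

Whether the brief-healing case really reaches the trillions depends almost entirely on the assumed average price, i.e. on how hard the price discrimination can squeeze ability to pay.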

In this situation I would probably work 70-100 hour work weeks, and use a budget of billions of dollars to make quality of life while working as high as feasible, e.g. healing on the beach while getting massages, eating gourmet meals, hearing reports of scientific research projects I had commissioned, and so forth. And then a majority of the revenues would go towards improving global prospects.

One of the research projects should definitely be finding out how more people can have healing powers.

[This comment is no longer endorsed by its author]

and knew they couldn't be duplicated through research and study

It would be different if the post described an interesting new argument

Hmm. I left out context that is more important than I thought it was: this post is a response to two different people in different conversations speculating in person about what they would do if they had powers like this. They both thought they would have an obligation to help people, but didn't think they had an obligation to do so in normal life.

I don't think it was a scale issue so much as being about magic or specialness.

This post is too low on interesting or useful new content, relative to reiteration of a standard ideological view about altruism.

The opportunity to help others is not taken advantage of by either altruists or egoists as much as it would fulfill their values. When people continue to act contrary to their values, it's good to continue to remind them.

In general, people don't act contrary to their values so much as their stated values are a simplified approximation of their actual values.

I am reminded of a particular SMBC involving Superman.

Somewhat more seriously, I can't help but think that this post starts with far-mode idealization: would it really be difficult to turn people away for personal reasons if you had magical healing powers? Or is it merely uncomfortable, and perhaps usually bad signalling, to admit you would?

I'm coming to think that "people are dying, every second you aren't stopping it you're murdering them" is a basilisk. And not too far removed from Roko's. Instead of fear of the future FAI torturing you, this basilisk does it by screaming at you inside your head.

How are you using "basilisk"?

My understanding of Roko's is that being exposed to it decreases global utility while exposure to "you should help other people and it's very important" increases it. But I don't know if that's relevant to basilisk status.

There's a difference between "you should help other people and it's very important" and "helping other people is so important that you should treat your quality of life as irrelevant". The latter leads to some combination of burnout, despair, and apathy, though possibly with some useful work done on the way to burnout.

I don't believe "helping other people is so important that you should treat your quality of life as irrelevant", because of the negative consequences you describe.

(I still don't see a basilisk here.)

You don't believe that, but

How could you justify a pleasant dinner with your family or a relaxing weekend at the beach when that meant more people living in pain?

that's how some people see the utilitarian calculation.

The latter leads to some combination of burnout, despair, and apathy, though possibly with some useful work done on the way to burnout.

The problem is precisely that people are reluctant to admit that they choose not burning out over helping others.

A basilisk, in this context, is a thought that kills you if you think it, which is excessive for this, and for Roko's. I mean a thought that breaks your cognitive processes in some way if you think it. Which I think is a fair way to describe someone who, on contact with the "you're murdering everyone you don't try to save" idea, is consumed with guilt that their every moment is not devoted with maximum effort to saving the world.

If I had magic healing powers, I'm pretty sure that rather than healing people all the time, my time would be better spent marketing myself and using my powers to become rich and famous, at which point I could help more people with my money than I could with a hands-on approach.

In fact, this is the most effective use of most superpowers.

Even after you factor in how long it takes to cure someone with money vs with magic? CarlShulman's idea sounds even more effective to me.

I would say that it's effectively the same idea, in more detail.

I had taken “at which point I could help more people with my money than I could with a hands-on approach” to imply ‘... and therefore no longer use the hands-on approach’; if you actually meant ‘... than I could with a hands-on approach alone’...

Shmi:

But the balance should probably be a lot further towards others' than it currently is.

Why? My should is different from your should. Who is to say that your should is better for me than mine?

And no, I don't accept your "idea that we have some obligation to try to help other people". I hate obligations. They piss me off. Whatever I do, I do because I want to, not because I owe it to others.

Who is to say that your should is better for me than mine?

This seems like a bad general heuristic that should be more restricted in its application. Who is to say that following your understanding of your goals is better for you than following someone else's? You have to consider specific arguments, not just origins of statements or beliefs. Think it possible that you may be mistaken, etc.

Shmi:

No disagreement there, specific arguments ought to be considered. However, in my experience, if someone tells you that you have an obligation to do something (pray to $god, donate to $cause, enlist in $military, vote for $candidate, ...), they are not to be trusted with putting forth arguments, or even estimating prior[itie]s. So, ignore people like that entirely and do your own research from scratch.

So the heuristic is to only consider arguments that don't claim to be leading to any (novel/actionable) conclusions? This rule decides at the bottom line, stopping consideration of arguments that don't conclude with uncertainty or close match to intuitively natural desires, which would be bad if conclusions not of that form turn out to be knowably correct.

If you remain specific, you may get rid of "pray to $god", but not other similar things, "donate to $cause that's known to be worthless", but not other similar things, etc. That should lift most of the load without as many false negatives.

Wait, so you're saying that your right to freedom is more important than making this world as good as possible? By all moral systems I know of, that's morally wrong (though I'll admit I don't know many). Do you have a well-defined moral system you could point me to?

Even an ethical egoist would cooperate with a copy of herself in the prisoner's dilemma, if she's using the ‘right’ decision theory.
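
A minimal illustration of that point, using the textbook payoff values and modeling "exact copy" as the copy's move always matching yours (both of which are assumptions for the sketch, not claims from the comment):

```python
# One-shot prisoner's dilemma against an exact copy of yourself.
# Standard illustrative payoffs: T=5, R=3, P=1, S=0.
PAYOFF = {  # (my move, copy's move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

# If the copy runs the same decision procedure, her move mirrors mine,
# so only the diagonal outcomes are reachable.
reachable = {move: PAYOFF[(move, move)] for move in ("C", "D")}
best = max(reachable, key=reachable.get)
print(reachable)  # {'C': 3, 'D': 1}
print(best)       # 'C' -- cooperating maximizes the egoist's own payoff
```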

Shmi:

I am saying nothing of the sort. My point is that I distrust anyone who tells me that I am obligated to do stuff that they think is "right".