As Peter Singer writes in his book The Life You Can Save: "[t]he world would be a much simpler place if one could bring about social change merely by making a logically consistent moral argument". Many people one encounters might agree that a social change movement is noble yet not want to do anything to promote it, or want to give more money to a charity yet refrain from doing so. Additional moralizing doesn't seem to do the trick. ...So what does?
Motivating people toward altruism is especially relevant for the optimal philanthropy movement. For a start on the answer, as with many things, I turn to psychology -- specifically, the psychology Peter Singer catalogues in his book.
A Single, Identifiable Victim
One of the most well-known motivations behind helping others is a personal connection, which triggers empathy. When psychologists researching generosity paid participants to take part in an experiment and then gave them the opportunity to donate to the global poverty-fighting organization Save the Children, different groups were given different kinds of information.
One random group of participants was told that "[f]ood shortages in Malawi are affecting more than three million children", along with additional information about how the need for donations was very strong and how these donations could help stop the food shortages.
Another random group of participants was instead shown a photo of Rokia, a seven-year-old Malawian girl who is desperately poor, and told that "her life will be changed for the better by your gift".
A third random group of participants was shown the photo of Rokia, told who she is and that "her life will be changed for the better", but ALSO given the general information about the famine, including that "food shortages [...] are affecting more than three million" -- a combination of the two previous groups.
Lastly, a fourth random group was shown the photo of Rokia, given the same information about her as the other groups, and then told about a second child, also identified by name, whose life their donation would improve as well.
It's All About the Person
Interestingly, the group told ONLY about Rokia gave the most money. The group told about both children reported feeling less overall emotion than those who only saw Rokia, and gave less money. The group told about both Rokia and the general famine information gave even less than that, followed by the group that got only the general famine information.1,2 It turns out that information about a single person was the most salient for creating an empathetic response and triggering a willingness to donate.1,2
This finding holds up in additional studies. In another generosity experiment, one group of people was told that a single child needed a lifesaving medical treatment that cost $300K, and was given the opportunity to contribute toward the fund. A second random group was told that eight children needed lifesaving treatment, that all of them would die unless $300K could be provided, and was likewise given an opportunity to contribute. More people opted to donate toward the single child.3,4
This is why we're so willing to chase after lost miners or Baby Jessica no matter the monetary cost, but turn a blind eye to the unknown masses starving in the developing world. Indeed, the person doesn't even need to be particularly identified, though it does help. In another experiment, people asked by researchers to make a donation to Habitat for Humanity were more likely to do so if they were told that the recipient family "has been selected" rather than that it "will be selected" -- even though all other parts of the pitch were the same, and the participants got no information about who the families actually were.5
The Deliberative and The Affective
Why is this the case? Researcher Paul Slovic thinks that humans have two different processes for deciding what to do. The first is an affective system that responds to emotion, rapidly processing images and stories and generating an intuitive feeling that leads to immediate action. The second is a deliberative system that draws on reasoning, and operates on words, numbers, and abstractions, which is much slower to generate action.6
To follow up, the Rokia experiment was run again with yet another twist. There were two groups, one told only about Rokia exactly as before, and one told only the generic famine information exactly as before. Within each group, half of the participants first took a survey designed to arouse their emotions by asking things like "When you hear the word 'baby', how do you feel?" The other half was given emotionally neutral questions, like math puzzles.
This time, the Rokia group again gave far more, and those within it whose emotions had been aroused gave even more than those who heard about Rokia after the math puzzles. On the other side, those who heard the generic famine information showed no increase in donations regardless of how heightened their emotions were.1
Futility and Making a Difference
Imagine you're told that there are 3,000 refugees at risk in a camp in Rwanda, and you could donate towards aid that would save 1,500 of them. Would you do it? And how much would you donate?
Now imagine that you can still save 1,500 refugees with the same amount of money, but the camp has 10,000 refugees. In an experiment where these two scenarios were presented not as a thought experiment but as realities to two separate random groups, the group that heard of only 3,000 refugees was more likely to donate, and donated larger amounts.7,8
Enter another quirk of our giving psychology, right or wrong: futility thinking. We feel that if we're not making a sizable difference, it's not worth making any difference at all -- it will only be a drop in the ocean while the problem keeps raging on.
Am I Responsible?
People are also far less likely to help if they're with other people. In one experiment, students were invited to participate in a market research survey. When the researcher gave the students their questionnaires to fill out, she went into a back room separated from the office only by a curtain. A few minutes later, noises strongly suggested that she had climbed onto a chair to get something from a high shelf and then fallen off it, loudly complaining that she couldn't feel or move her foot.
When a single student was taking the survey alone, 70% of participants stopped what they were doing and offered assistance. When two students were taking the survey together, that number dropped dramatically. Most strikingly, when the group was two students but one of them was a stooge who was in on the experiment and deliberately never responded, the response rate of the real participant was only 7%.9
This is diffusion of responsibility, better known as the bystander effect -- we help more often when we think it is our responsibility to do so, and -- again, for right or for wrong -- we naturally look to others to see whether they're helping before doing so ourselves.
What's Fair In Help?
It's clear that people value fairness, even to their own detriment. In a game called "the Ultimatum Game", one participant is given a sum of money by the researcher, say $10, and told they can split this money with an anonymous second player in any proportion they choose -- give them $10, give them $7, give them $5, give them nothing; everything is fair game. The catch is that the second player, after anonymously hearing of the proposed split, gets to accept it or reject it. Should the split be accepted, both players walk away with the agreed amounts. But should the split be rejected, both players walk away with nothing.
A Fair Split
The economist, expecting ideally rational and perfectly self-interested players, predicts that the second player would accept any split that gets them money, since anything is better than nothing. And the first player, understanding this, would naturally offer $1 and keep $9 for himself. At no point are identities revealed, so reputation and retribution are not an issue.
But the results turn out to be quite different -- the vast majority of proposers offer an equal split. Yet when an offer of $2 or less comes around, it is almost always rejected, even though $2 is better than nothing.10 This effect persists even when the game is played for thousands of dollars, and it holds across nearly all cultures.
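To make the contrast concrete, here is a minimal sketch in Python -- my own illustration, not anything from Singer's book or the studies cited here, and the 30% rejection threshold is purely an assumed figure -- comparing a perfectly self-interested responder with the fairness-sensitive responders the experiments actually find:

    # Ultimatum Game sketch: the "rational" responder economists once assumed
    # versus a fairness-sensitive responder. The 30% threshold is an
    # illustrative assumption, not a figure from the cited studies.
    POT = 10  # dollars the proposer can split

    def rational_accepts(offer):
        # A perfectly self-interested responder takes anything over nothing.
        return offer > 0

    def fairness_sensitive_accepts(offer, threshold=0.3):
        # Real responders tend to reject "insulting" offers, even at a cost.
        return offer >= threshold * POT

    for offer in range(POT + 1):
        for name, accepts in [("rational", rational_accepts),
                              ("fairness-sensitive", fairness_sensitive_accepts)]:
            if accepts(offer):
                proposer, responder = POT - offer, offer  # the split stands
            else:
                proposer, responder = 0, 0  # both walk away with nothing
            print(f"offer ${offer} to a {name} responder: "
                  f"proposer keeps ${proposer}, responder gets ${responder}")

Under pure self-interest the proposer's best move is to offer $1; once responders punish lowball offers, something close to an even split becomes the safer play -- roughly what the experiments find.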
Splitting and Anchoring in Charity
This sense of fairness carries over into helping as well -- people generally have a strong tendency not to want to help more than the people around them, and if they find themselves the only ones helping on a frequent basis, they start to feel like a "sucker". On the flipside, if others are doing more, they will follow suit.11,12,13
Those told the average donation to a charity nearly always tend to give that amount, even if the average they're told is a lie that has secretly been increased or decreased. The effect can be replicated even without lying -- those told about an above-average gift were far more likely to donate more, even attempting to match that gift.14,15 Overall, we tend to match the behavior of our reference class -- the people we identify with -- and this includes how much we help. We donate more when we believe others are donating more, and less when we believe others are donating less.
Challenging the Self-Interest Norm
But there's a way to break this cycle of futility, responsibility, and fairness -- challenge the norm by openly communicating about helping others. Many religious and secular traditions insist that the best giving is anonymous giving, but this turns out not always to be the case. While there may be other reasons to give anonymously, don't forget the benefits of giving openly -- being open about helping inspires others to help, and can challenge the norms of the culture.
Indeed, many organizations now exist to challenge the norms around donating and to create a culture where people give more. GivingWhatWeCan is a community of 230 people (including me!) who have all pledged to donate at least 10% of their income to organizations working on ending extreme poverty, and who submit statements proving so. BolderGiving collects inspiring stories of over 100 people who all give at least 20% of their income, with a dozen giving over 90%! And these aren't all rich people; some of them are ordinary students.
Who's Willing to Be Altruistic?
While people are not saints, experiments have shown that people tend to grossly overestimate how self-interested others are -- in one example, people estimated that males would overwhelmingly favor a piece of legislation to "slash research funding to a disease that affects only women", even though the male respondents themselves did not support such legislation.16
This also manifests itself in an expectation that people be "self-interested" in their philanthropic cause -- people express much stronger support for volunteers in Students Against Drunk Driving who have themselves known someone killed in a drunk-driving accident than for volunteers with no such personal experience who simply consider it "a very important cause".17
Alexis de Tocqueville, echoing the early economists who expected $9/$1 splits in the Ultimatum Game, wrote in 1835 that "Americans enjoy explaining almost every act of their lives on the principle of self-interest".18 But this isn't always the case, and by challenging the norm, people make it more acceptable to be altruistic. Being altruistic isn't just for "goody two-shoes", and being "too charitable" is praiseworthy.
A Bit of a Nudge
One pressing problem in getting people to help has been organ donation -- surely no one is inconvenienced by having their organs donated after they have died. So why would people not sign up? And how could we get more people to sign up?
In Germany, only 12% of the population are registered organ donors. In nearby Austria, that number is 99.98%. Are people in Austria just less worried about what will happen to them after they die, or just that much more altruistic? It turns out the answer is far simpler -- in Germany you must put yourself on the register to become a potential donor (opt-in), whereas in Austria you are a potential donor unless you object (opt-out). While people may be, for right or for wrong, worried about the fate of their body after death, they appear less likely to express these reservations in opt-out systems.19
Richard Thaler and Cass Sunstein argue in their book Nudge: Improving Decisions About Health, Wealth, and Happiness that we sometimes suck at making decisions in our own interest and could all do better with more favorable "defaults" -- and such defaults matter just as much when it comes to helping people.
While opt-out organ donation is a huge deal, there's another similar idea -- opt-out philanthropy. Back before 2008, when the investment bank Bear Stearns still existed, it listed philanthropy among its guiding principles, seeing it as fostering good citizenship and well-rounded individuals. To this effect, it required its 1,000 highest-paid employees to donate 4% of their salary and bonuses to non-profits, and to prove it with their tax returns. This resulted in more than $45 million in donations during 2006. Many employees described the requirement as "getting themselves to do what they wanted to do anyway".
Conclusions
So, according to this bit of psychology, what could we do to get other people to help more, besides moralize? Well, we have five key takeaways:
(1) present these people with a single and highly identifiable victim that they can help
(2) nudge them with a default of opt-out philanthropy
(3) be more open about our willingness to be altruistic and encourage other people to help
(4) make sure people understand the average level of helping around them, and
(5) instill a responsibility to help and an understanding that doing so is not futile.
Hopefully, with these tips and more, helping people more can become just one of those things we do.
References
1: D. A. Small, G. Loewenstein, and P. Slovic. 2007. "Sympathy and Callousness: The Impact of Deliberative Thought on Donations to Identifiable and Statistical Victims". Organizational Behavior and Human Decision Processes 102: p143-53.
2: Paul Slovic. 2007. "If I Look at the Mass I Will Never Act: Psychic Numbing and Genocide". Judgment and Decision Making 2(2): p79-95.
3: T. Kogut and I. Ritov. 2005. "The 'Identified Victim' Effect: An Identified Group, or Just a Single Individual?". Journal of Behavioral Decision Making 18: p157-67.
4: T. Kogut and I. Ritov. 2005. "The Singularity of Identified Victims in Separate and Joint Evaluations". Organizational Behavior and Human Decision Processes 97: p106-116.
5: D. A. Small and G. Loewenstein. 2003. "Helping the Victim or Helping a Victim: Altruism and Identifiability". Journal of Risk and Uncertainty 26(1): p5-16.
6: Singer cites this from Paul Slovic, who in turn cites it from: Seymour Epstein. 1994. "Integration of the Cognitive and the Psychodynamic Unconscious". American Psychologist 49: p709-24. Slovic refers to the affective system as "experiential" and the deliberative system as "analytic". This is also related to Daniel Kahneman's popular book Thinking, Fast and Slow.
7: D. Fetherstonhaugh, P. Slovic, S. M. Johnson, and J. Friedrich. 1997. "Insensitivity to the Value of Human Life: A Study of Psychophysical Numbing". Journal of Risk and Uncertainty 14: p283-300.
8: Daniel Kahneman and Amos Tversky. 1979. "Prospect Theory: An Analysis of Decision Under Risk." Econometrica 47: p263-91.
9: Bibb Latané and John Darley. 1970. The Unresponsive Bystander: Why Doesn't He Help?. New York: Appleton-Century-Crofts, p58.
10: Martin Nowak, Karen Page, and Karl Sigmund. 2000. "Fairness Versus Reason in the Ultimatum Game". Science 289: p1773-75.
11: Lee Ross and Richard E. Nisbett. 1991. The Person and the Situation: Perspectives of Social Psychology. Philadelphia: Temple University Press, p27-46.
12: Robert Cialdini. 2001. Influence: Science and Practice, 4th Edition. Boston: Allyn and Bacon.
13: Judith Lichtenberg. 2004. "Absence and the Unfond Heart: Why People Are Less Giving Than They Might Be". In Deen Chatterjee, ed. The Ethics of Assistance: Morality and the Distant Needy. Cambridge, UK: Cambridge University Press.
14: Jen Shang and Rachel Croson. Forthcoming. "Field Experiments in Charitable Contribution: The Impact of Social Influence on the Voluntary Provision of Public Goods". The Economic Journal.
15: Rachel Croson and Jen Shang. 2008. "The Impact of Downward Social Information on Contribution Decisions". Experimental Economics 11: p221-33.
16: Dale Miller. 1999. "The Norm of Self-Interest". American Psychologist 54: p1053-60.
17: Rebecca Ratner and Jennifer Clarke. Unpublished. "Negativity Conveyed to Social Actors Who Lack a Personal Connection to the Cause".
18: Alexis de Tocqueville in J.P. Mayer ed., G. Lawrence, trans. 1969. Democracy in America. Garden City, N.Y.: Anchor, p546.
19: Eric Johnson and Daniel Goldstein. 2003. "Do Defaults Save Lives?". Science 302: p1338-39.
(This is an updated version of an earlier draft from my blog.)
Intense stress can be constructive. You're totally right that people will not have any idea how to deal with it. This could be either very good or very bad for a charity presenting intensely distressing imagery -- the kind I think is needed to get people to react emotionally to statistics. If you present yourself as the solution to all of this, the guide who makes those feelings constructive, that could be very good. If people can't handle the stress, they'll shut off. If you ASSIST them in handling the stress, you will be seen as a leader in a hard situation, a source of comfort that gives a constructive outlet to emotion and meaning to pain. The difference could be this:
You see a crying child, you're a little sad, you give her 20 bucks.
You see a dying country, you are moved to act now, suddenly life has purpose, you experience a renewed sense of meaning.
It would have to be done very, very carefully to have the most constructive effect. Then again, what doesn't?