Beyond normal consequentialism (as discussed in other answers), there's a game theory angle: if you aren't trying to model a norm into existence, it's only worthwhile to follow the norm once it's agreed that violators will be punished.
See paulfchristiano's post on Moral Public Goods, which argues that you will often get into situations where people would be in favor of a norm were that norm enforced, while not being in favor of the behavior the norm calls for when the norm is not enforced.
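To make that structure concrete, here is a minimal sketch in Python with made-up numbers (one million people, a personal compliance cost of 1, and a total diffuse benefit of 10 per act of compliance); the only point is that the same person can rationally reject unilateral compliance while favoring the enforced norm.

```python
# Minimal sketch (invented numbers) of the "favor the enforced norm,
# but not unilateral compliance" structure described above.

N = 1_000_000          # people affected by the problem
COST = 1.0             # cost to me of following the norm
TOTAL_BENEFIT = 10.0   # total benefit my compliance creates, spread over everyone

# If I follow the norm alone, I pay the full cost but capture only my
# tiny share of the diffuse benefit:
unilateral_payoff = TOTAL_BENEFIT / N - COST      # ~ -1.0: not worth it for me

# If the norm is enforced and everyone complies, I still pay my cost once,
# but I receive my share of everyone's contributions:
enforced_payoff = N * (TOTAL_BENEFIT / N) - COST  # = 9.0: clearly worth it

print(f"unilateral compliance: {unilateral_payoff:+.6f}")
print(f"enforced norm:         {enforced_payoff:+.2f}")
```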
Everything has an opportunity cost. I'd claim that when impact is very small, the action is almost never worth its opportunity cost. In general, one can have far more impact by focusing on one or two high-impact actions than by spending the same aggregate time/effort on lots of little things.
Much more detail is in The Epsilon Fallacy; also see the comments on that post for some significant counterarguments.
(I'm definitely not claiming that the psychological mechanism by which people ignore small-impact actions is to think through all of this rationally. But I do think that people have basically-correct instincts in this regard, at least when political signalling is not involved.)
Your example about cell phones is a prisoner's dilemma. For each individual participant, continuing to use their cell phone has more utility than being the only person who stops using it. At the same time, there would be higher utility for everyone if everyone chose to cooperate in the prisoner's dilemma and stopped using their cell phones.
Having a government legislate that everyone picks "cooperate" is one way to solve a prisoner's dilemma.
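As a rough sketch (the payoff numbers below are invented for illustration, not taken from the parent comment), the dilemma looks like this: defecting is each individual's best response whatever the others do, yet mutual cooperation beats mutual defection, which is exactly the gap that legislation can close.

```python
# Rough sketch of the cell-phone case as a prisoner's dilemma
# (illustrative payoffs only). "cooperate" = give up the phone,
# "defect" = keep using it.

# payoffs[(my_choice, everyone_else_choice)] -> my payoff
payoffs = {
    ("defect",    "defect"):    1,  # everyone keeps their phone, planet suffers
    ("defect",    "cooperate"): 4,  # I keep mine while everyone else gives theirs up
    ("cooperate", "defect"):    0,  # I give mine up alone: full cost, negligible effect
    ("cooperate", "cooperate"): 3,  # everyone gives theirs up: problem solved
}

for others in ("defect", "cooperate"):
    best = max(("defect", "cooperate"), key=lambda me: payoffs[(me, others)])
    print(f"if others {others}: my best response is to {best}")
# Defecting is the best response either way, yet (cooperate, cooperate)
# gives everyone 3 instead of the all-defect payoff of 1, which is why
# enforcement (legislation) can make everyone better off.
```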
Even if a person wants to do something about a problem, it's often much more impactful to donate to an effective charity then to change personal behavior.
The recent Founders Pledge article on climate change illustrates that principle for climate change. Animal Charity Evaluators might not be the most trustworthy source, but when it comes to the numbers I see from EAs, the same principle seems true for that area as well.
Even if a person wants to do something about a problem, it's often much more impactful to donate to an effective charity then to change personal behavior.
Not sure if you meant "then" or if it was a typo for "than", but either way I have an observation:
One can do both things: donate to an effective charity and change personal behavior, no?
One example I like is: vegan lifestyle vs. vegan activism.
Activism is a lot more impactful than becoming vegan oneself. By far. Because of the potential number of people reached, and because activi...
There are good reasons for thinking nuclear power is part of the solution, in the short to medium term, but it's a major exaggeration to call it the only solution.
And no, not everyone concerned about climate change is anti-nuclear. NASA scientist James Hansen was one of the earliest scientists to warn about climate change and is pro-nuclear.
Possible explanations:
1) Many impacts are not just small, but effectively zero, or even slightly negative. Spending more effort/resources on things that APPEAR good but actually don't matter is a net harm.
2) Some items have threshold or nonlinear impact, such that it's near-zero unless everybody (or at least more people than are likely to) does them. This gets to second-order arguments of "my example won't influence the people who need to change", but the argument does recurse well.
3) The world is, in fact, full of irresponsible people. Unfortunately, it's mostly governed by those same people.
4) Reasons given for something don't always match the actual causality. "It wouldn't matter" is more socially defensible than "I value my comfort over the aggregate effect".
5) Relative rather than absolute measures - "I'm a sucker" vs "the world is slightly better".
6) The bystander effect (https://en.wikipedia.org/wiki/Bystander_effect) may not be a real thing, but there is an element of social proof in the idea that if most people are doing something, it's probably OK.
If my action has a zero or infinitesimal positive impact on the relevant problem, while having a negative and non-infinitesimal impact on me, cost-benefit analysis concludes that I should not do it. I think the OP needs to do more work to justify why they think this is not so.
I didn't claim that is not the case.
You seem to think that an altruistic action that harms me but benefits the whole planet should have at least a certain amount X of positive impact on the planet... otherwise it's not worth certain sacrifices. And to that, I say: Fair enough!
To give an absurd example: Giving up civilized life and starting to live in the middle of the forest without any technology would be a silly, disproportionate, ineffective sacrifice to make in order to help fight Climate Change. It's a nonsensical plan. And I agree with you.
I th...
Does this phenomenon have a name?
Laziness, apathy, indifference, lack of self-responsibility, weakness, stupidity, selfishness, herd mentality?
Ultimately, the only person's behaviour you can change is your own. Either you choose to do better things or you don't. Lead by example if you care; otherwise you don't care enough to change.
Why do we refuse to take action claiming our impact would be too small?
We don't. Manifestly so. We (or those of us with enough skill to do so) engage others to leverage the impact, increasing it manifold. Examples abound, most recently Greta Thunberg, but in general every time people organize for a cause. Of course, not everyone can do it; people are all different, and it often takes a passionate leader, a champion of a specific cause, to get the ball rolling. But it happens all the time; just check the news.
Of course, I was referring specifically to people who, in your words, cannot do it. :)
I worded it as "we" instead of "some people" in order to take my fair share of personal responsibility: even though I fully acknowledge the incredible importance of Climate Change, through my actions I am often part of this group of irresponsible people I refer to.
That being said, I found your answer really enlightening. Thank you. :)
To illustrate with a hypothetical example: If we suddenly found out that mobile phone frequencies were destroying the planet …
I find that phenomena like this are almost entirely pointless to illustrate with hypothetical examples, and much more fruitful to instead illustrate with actual examples.
Note, however, that if you do this, you may get responses protesting that actually, your supposed “actual examples” are not, in fact, examples of your claimed phenomenon. This, of course, is very much a feature, and not at all a bug—as it is quite possible that the phenomenon you thought was real, in fact… isn’t. In the latter case, what you would expect is precisely that all your attempts to provide actual examples would be met with skepticism and protest.
That's why I didn't focus on actual examples, and only briefly mentioned a couple in the beginning. :P
I'm more interested in the psychological phenomenon, rather than specific instances in real life, or whether its occurrence is a good or a bad thing.
e.g. Maybe it makes sense to not cooperate. I don't know. That's out of scope here.
I've seen this often in problems like climate change or animal exploitation:
"The solution is up to others. The powerful. The governments. The policy makers."
In this way people frequently delegate their share of responsibility to more powerful or visible entities.
To illustrate with a hypothetical example: If we suddenly found out that mobile phone frequencies were destroying the planet, many people, instead of giving up their phones, would say:
But the only reason the government needs to ban cell phones is that the world is full of irresponsible people who need to be coerced into doing the right thing!
Does this phenomenon have a name? Does anybody here know the underlying psychological mechanism? Is it a genuine blindness about the sea being made up of millions of small droplets? An excuse to avoid responsibility? Something else?