Well, in political economics, this phenomenon is called "concentrated benefits and diffuse costs."
I find the inverse more annoying: when people single something out as a negative because its costs are concentrated and its benefits diffuse.
Well, there's a broader effect based on the availability heuristic, where people accept explanations that come easily to mind even when (as in this case) the reasons for their availability have nothing to do with their truth. So you could consider this a special case of availability bias.
Though that is admittedly a very broad category, possibly uselessly so.
The problem with the availability heuristic is that once I get used to using it as an explanation, it comes more easily to mind, so I am more likely to accept explanations involving it.
I don't know of a standard name - maybe you could call it the problem of asymmetric feedback?
This gets discussed in various contexts - for instance, Marcel Zeelenberg has written about the role of feedback in regret aversion. Suppose you have a choice between a sure thing (win $10) and a risky gamble (50% chance you win $20, 50% chance you get nothing). If you choose the risky gamble and you lose, then you'll know for certain that the other option would've been better, which makes you feel regret. But if you choose the sure thing, that typically means that you'll never know how the other option would have turned out (maybe you would've won $20 but maybe you would've gotten nothing), so you won't feel much regret. The risky option produces more expected regret than the safe option, and so regret aversion leads people to prefer the safe option.

Zeelenberg and colleagues (1996) showed this by varying what feedback people got: if we arrange it so that you'll get feedback about the risky gamble regardless of what choice you make, then choosing the sure thing won't help you avoid regret (since it also carries a 50% chance of finding out that the other option would've been better). And indeed, a lower percentage of people chose the sure thing when they were guaranteed to get feedback regardless of their choice.
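To make the asymmetry concrete, here's a minimal sketch of the expected-regret arithmetic under the two feedback conditions. The payoffs match the example above, but the "regret = known foregone payoff minus your payoff, if positive" definition is my own crude simplification, not Zeelenberg's model:

```python
# Toy calculation of expected regret for the sure thing vs. the gamble
# under the two feedback conditions described above.

SURE = 10          # sure thing pays $10
WIN, LOSE = 20, 0  # gamble pays $20 or $0
P_WIN = 0.5

def regret(own, foregone_known):
    """Regret felt on learning the foregone option's outcome (None = never learn it)."""
    if foregone_known is None:
        return 0
    return max(foregone_known - own, 0)

def expected_regret(choice, feedback_on_gamble_always):
    if choice == "gamble":
        # You always know what the sure thing would have paid ($10).
        return P_WIN * regret(WIN, SURE) + (1 - P_WIN) * regret(LOSE, SURE)
    else:  # choice == "sure"
        if feedback_on_gamble_always:
            # You learn how the gamble would have resolved either way.
            return P_WIN * regret(SURE, WIN) + (1 - P_WIN) * regret(SURE, LOSE)
        # No feedback: you never find out, so no regret.
        return regret(SURE, None)

for feedback in (False, True):
    print(f"feedback on gamble regardless of choice: {feedback}")
    for choice in ("sure", "gamble"):
        print(f"  {choice:6s} -> expected regret = {expected_regret(choice, feedback):.2f}")
```

Without guaranteed feedback, the sure thing carries zero expected regret versus 5 for the gamble; with guaranteed feedback, both come out at 5, so the safe choice loses its regret advantage - which is the pattern the experiment found in people's choices.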
There are certain harmful behaviors people are tricked into engaging in because the benefits of the behavior are concentrated while the harms are diffuse or insidious. As a result, when you benefit, P(benefit is due to this behavior) ≈ 1, but when you're harmed, P(harm is due to this behavior) << 1, or, in the insidious form, P(you consciously notice the harm) << 1.
An example is when I install handy little add-ons and programs that, in aggregate, cause my computer to slow down significantly. Every time I use one of these programs, I consciously appreciate how useful it is. But when it slows down my computer, I can't easily pinpoint it as the culprit, since there are so many other potential causes. I might not even consciously note the slowdown, since it's so gradual ("frog in hot water" effect).
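As a toy illustration of that attribution asymmetry (all numbers are made-up assumptions, not measurements):

```python
# Hypothetical numbers illustrating the asymmetry: each add-on's benefit is
# felt at a moment when that add-on is the obvious cause, while its cost is
# just one of many candidate causes of a gradual slowdown.

n_addons = 12
slowdown_each = 0.03   # assume each add-on slows things ~3% (made-up figure)
noticeable = 0.10      # assume anything under ~10% goes consciously unnoticed

total_slowdown = 1 - (1 - slowdown_each) ** n_addons
print(f"total slowdown from all add-ons: {total_slowdown:.0%}")          # ~31%
print(f"any single add-on noticeable?   {slowdown_each >= noticeable}")  # False

# Attribution when something happens:
p_credit_given_benefit = 1.0        # the add-on you just used obviously helped
p_blame_given_harm = 1 / n_addons   # the slowdown has ~n equally plausible culprits
print(f"P(credit this add-on | benefit) ≈ {p_credit_given_benefit:.2f}")
print(f"P(blame this add-on | harm)     ≈ {p_blame_given_harm:.2f}")
```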