In the USA you're four times more likely to be struck by lightning than by terrorists
Our minds are actually picking up on a valid statistical issue here, which is that the number of people killed by terrorists is much more variable than the number of people killed by lightning. Since lightning strikes are almost completely uncorrelated random events, the distribution of deaths by lightning is governed by the Central Limit Theorem and so is nearly Gaussian. If X people died from lightning in 2014, then it is very unlikely that 2X people will die from lightning in 2015, and astronomically unlikely that 100X people will so die.
In contrast, if X people die from terrorism in 2014, you cannot deduce very much about the probability that 100X people will die from terrorism in 2015. Nassim Taleb would say that lightning deaths happen in Mediocristan while terrorism deaths happen in Extremistan.
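The Mediocristan/Extremistan contrast can be made concrete with a quick simulation. This is a toy sketch with made-up parameters, not real casualty statistics: yearly totals built from many small independent risks cluster tightly (the CLT at work), while totals driven by a heavy-tailed process swing wildly between years.

```python
import random

random.seed(42)

# Mediocristan: a year's lightning deaths modeled as a sum of many small,
# independent risks. The Central Limit Theorem makes the yearly totals
# nearly Gaussian and tightly clustered. (Toy parameters, not real data.)
def lightning_year(n_exposed=50_000, p=0.0006):
    return sum(random.random() < p for _ in range(n_exposed))

# Extremistan: a year's terrorism deaths dominated by rare large events,
# modeled here with a heavy-tailed Pareto draw (again purely illustrative).
def terrorism_year():
    return int(random.paretovariate(1.1) * 10)

lightning = [lightning_year() for _ in range(50)]
terrorism = [terrorism_year() for _ in range(50)]

# The spread between the best and worst simulated year tells the story:
print("lightning max/min:", max(lightning) / min(lightning))
print("terrorism max/min:", max(terrorism) / min(terrorism))
```

With these invented parameters the lightning totals typically stay within a factor of two or three of each other across fifty simulated years, while the Pareto-driven totals can differ by orders of magnitude: past years tell you much less about how bad the worst future year can be.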
If X people died from lightning in 2014, then it is very unlikely that 2X people will die from lightning in 2015,
This doesn't actually follow from (annual?) lightning deaths being nearly Gaussian. A Gaussian distribution can have a standard deviation not much smaller than its mean, in which case a fall of 50% or a rise of 100% from one year to the next wouldn't be so unlikely. Indeed, the last eight annual US lightning fatality counts span a factor of two.
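The rebuttal can be checked with a few lines of arithmetic (all numbers hypothetical). Model this year's count X and next year's count Y as independent Gaussians with the same mean and standard deviation; then Y − 2X is itself Gaussian, and P(Y ≥ 2X) is the chance that next year at least doubles this year:

```python
import math
from statistics import NormalDist

# For independent Gaussian yearly counts X (this year) and Y (next year),
# Y - 2X is Gaussian with mean mu - 2*mu = -mu and standard deviation
# sqrt(sigma**2 + 4*sigma**2) = sqrt(5)*sigma, so P(Y >= 2X) is one line.
def p_next_year_doubles(mu, sigma):
    diff = NormalDist(mu=-mu, sigma=math.sqrt(5) * sigma)
    return 1.0 - diff.cdf(0.0)

# Wide Gaussian: sd is half the mean (hypothetical numbers).
print(p_next_year_doubles(30, 15))  # non-negligible

# Narrow Gaussian: sd is a tenth of the mean.
print(p_next_year_doubles(30, 3))   # vanishingly small
```

In the wide case the doubling probability is on the order of one in five; in the narrow case it is effectively zero. So "nearly Gaussian" alone doesn't make doubling unlikely; the ratio of standard deviation to mean does. (A Gaussian also allows negative counts, but that approximation is the one the CLT argument itself appeals to.)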
Chris Nolan's Joker is a very clever guy, almost Monroesque in his ability to identify hypocrisy and inconsistency. In one of the film's most interesting scenes, he points out that people judge horrible things differently depending on whether they're part of what's "normal", what's "expected", rather than on how inherently horrifying they are or how many people are involved.
People soon extrapolated this observation to other apparent inconsistencies in human judgment, where a behaviour that was once acceptable becomes, with a simple tweak or change in context, the subject of a much more serious reaction.
I think there's rationalist merit in giving these inconsistencies a serious look. I intuit that there's some sort of underlying pattern to them, something that makes psychological sense, in the roundabout way that most irrational things do. I think that much good could come out of figuring out what that root cause is, and how to predict this effect and manage it.
Phenomena that come to mind include, from an Effective Altruism point of view, the expenses incurred in counter-terrorism (including some wars that were very expensive in treasure and lives) and the number of lives those expenses save, compared with the number of lives that could be saved by spending that same amount on improving road safety, increasing public healthcare expenditure where it would do the most good, building better lightning rods (in the USA you're four times more likely to be struck by lightning than by terrorists), or legalizing drugs.
What do y'all think? Why do people have their priorities all jumbled-up? How can we predict these effects? How can we work around them?