Related: Circular Altruism
One thing that many people misunderstand is the distinction between personal and societal safety. The two concepts are often conflated, despite calling for quite different mindsets.
Simply put, personal safety is personal.
In other words, the appropriate actions to take for personal safety are whichever actions reduce your chance of being injured or killed within reasonable cost boundaries. These actions are largely situational, because the risks that two given people face may be wildly disparate.
For instance, if you are currently a young computer programmer living in a typical American city, you may want to look at eating better, driving your car less often, and giving up unhealthy habits like smoking. However, if you are currently an infantryman about to deploy to Afghanistan, you may want to look at improving your reaction time, training your situational awareness, and practicing rifle shooting under stressful conditions.
One common mistake is to attempt to preserve personal safety against extreme circumstances such as nuclear war. Some individuals invest sizeable amounts of money into fallout shelters, years' worth of emergency supplies, and so on.
While it is certainly true that a nuclear war would kill you or severely disrupt your life if it occurred, this is not by itself a convincing argument for building a fallout shelter. One has to consider the cost of building the shelter, the chance that it will actually save you in the event of a nuclear war, and the odds of a nuclear war actually occurring.
Further, one must consider the quality-of-life reduction one would likely experience in a post-nuclear-war world. It's also important to remember that, in the long run, your survival is contingent on access to medicine and scientific progress. Future medical advances may even extend your lifespan dramatically, potentially providing very large amounts of utility. Unfortunately, full-scale nuclear war is very likely to impair medicine and science for quite some time, perhaps permanently.
Thus even if your fallout shelter succeeds, you will likely live a shorter and less pleasant life than you would otherwise. In the end, building a fallout shelter looks like an unwise investment unless you are extremely confident that a nuclear war will occur shortly-- and if you are, I want to see your data!
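To make that tradeoff concrete, here is a rough expected-value sketch in Python. Every number in it (the shelter's cost, the probability of war, the survival odds, the post-war quality-of-life discount, and the dollar value of a life-year) is a made-up placeholder for illustration, not an estimate I'm defending:

```python
# Rough expected-value comparison for the fallout-shelter decision.
# Every number here is an illustrative placeholder, not a real estimate.

shelter_cost = 50_000          # dollars to build and stock a shelter
p_war = 0.001                  # assumed probability of full-scale nuclear war
p_shelter_saves_you = 0.3      # chance the shelter actually saves you, given a war
life_years_saved = 30          # remaining life expectancy if it does save you
postwar_quality = 0.5          # quality-of-life multiplier in a post-war world
value_per_life_year = 100_000  # dollars assigned to one quality life-year

# Expected benefit: probability-weighted value of the (diminished)
# life-years the shelter buys you.
expected_benefit = (p_war * p_shelter_saves_you
                    * life_years_saved * postwar_quality * value_per_life_year)

print(f"Expected benefit: ${expected_benefit:,.0f}")  # $450
print(f"Shelter cost:     ${shelter_cost:,}")         # $50,000
```

With these placeholder numbers, the shelter costs roughly a hundred times its expected benefit; you would need to believe war is vastly more likely before it pencils out-- which is exactly the "show me your data" point above.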
When taking personal precautionary measures, worrying about such catastrophes is generally silly, especially given the risks we all take on a regular basis-- risks that, in most cases, are much easier to avoid than nuclear wars. Societal disasters are generally extremely expensive for the individual to protect against, and carry a large amount of disutility even if protections succeed.
To make matters worse, if there's a nuclear war tomorrow and your house is hit directly, you'll be just as dead as if you'd fallen off your bike and broken your neck. Dying in a more dramatic fashion does not, generally speaking, produce more disutility than dying in a mundane fashion does. In other words, when optimizing for personal safety, focus on accidents, not nuclear wars; buy a bike helmet, not a fallout shelter.
The flip side to this, of course, is that if there is a full-scale nuclear war, hundreds of millions-- if not billions-- of people will die and society will be permanently disrupted. If you die in a bike accident tomorrow, perhaps a half dozen people will be killed at most. So when we focus on non-selfish actions, the big picture is far, far, far more important. If you can reduce the odds of a nuclear war by one one-thousandth of one percent, more lives will be saved on average than if you can prevent hundreds of fatal accidents.
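As a quick sanity check on that arithmetic (assuming a death toll of one billion, consistent with the "hundreds of millions-- if not billions" range above):

```python
# Sanity check on the expected-lives arithmetic.
# Assumed death toll of a full-scale nuclear war; purely illustrative.
war_deaths = 1_000_000_000
risk_reduction = 0.00001  # one one-thousandth of one percent = 1e-5

expected_lives_saved = war_deaths * risk_reduction
print(f"{expected_lives_saved:,.0f}")  # 10,000 expected lives

# Compare: preventing several hundred fatal accidents,
# at up to half a dozen deaths each.
accidents_prevented = 300
deaths_per_accident = 6
print(accidents_prevented * deaths_per_accident)  # 1,800 lives, well short of 10,000
```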
When optimizing for overall safety, focus on the biggest possible threats that you can have an impact on. In other words, when dealing with societal-level risks, your projected impact will be much higher if you try to focus on protecting society instead of protecting yourself.
In the end, building fallout shelters is probably silly, but attempting to reduce the risk of nuclear war sure as hell isn't. And if you do end up worrying about whether a nuclear war is about to happen, remember that if you can reduce the risk of said war-- which might be as easy as making a movie-- your actions will have a much, much greater overall impact than building a shelter ever could.
No... Although I did see that it could be read that way, so I added the disclaimer. I admit the disclaimer does not add much, as there was no cost to me in writing it. I'm sorry if I sounded that way.
I will attempt to show my thought process on this as best I can. An answer like this is what my question was trying to elicit. Yes, I understand that drawing the line is fuzzy, but it can be good to take a somewhat deeper look.
Think of the people of the world. Think of all the things people go around doing in day-to-day life. The families, the enjoyment people get. I am sure this is something you value. Of course, you might weight the moral value of this higher for certain groups than for others, such as your family. But a weighting that much higher for your family members would have certain implications. If your weighting were high enough that you would commit genocide rather than have your family die, it must be very high indeed-- more than a billion to one. (Of course this depends on the size of your family. If you consider half the planet your family, we are discussing something else entirely.)
Let's repeat that for emphasis: a 1000000000:1 ratio. What does that actually mean? It means that, rather than accept a minor inconvenience to a family member, you would prefer something a billion times worse happen to a non-family member. To use a common example, you would rather have a stranger tortured for years than have a dust speck get in your family member's eye. This is very much at odds with the normal human perception of morality. That is, while it may be self-consistent, it absolutely contradicts what we normally consider morality. This is a strong indicator (though not a definite one, of course) that something fishy is going on with that argument.
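To spell out that implication numerically (the disutility units below are arbitrary placeholders; only the billion-to-one ratio matters):

```python
# Illustrative disutility units; the absolute numbers are arbitrary,
# only the ratio between them matters.
dust_speck = 1                 # disutility of a speck in a family member's eye
torture_years = 1_000_000_000  # disutility of years of torture for a stranger

family_weight = 1_000_000_001  # a weighting just over a billion to one

# With that weighting, the weighted speck outweighs the stranger's torture:
if family_weight * dust_speck > torture_years:
    print("Prefer the stranger's torture to the family member's dust speck")
```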
(There are some more points to be said, but this post is long enough already. For example, why do I assume that you can scale things this way? In other words, why is scope insensitivity bad? If you want to talk about that more, I will, but that is not the point of my comment.)
So basically, what I was asking might better be written this way: given the vastly different moral point of view such a system of ethics produces, how do you justify it? That is to say, you need to be able to come up with some other factor explaining how your system fits in with our moral intuitions, and I genuinely cannot think of such an explanation.
If I've followed your thought process correctly, you justify your moral intuitions because they are shared by most other humans, and since Kawoomba's intuitions aren't so popular, they require some other justification.
Yes?
Fair enough; that answers my question. Thanks.
For my own part, I think that's not much of a justification, but then I don't think that justifying moral intuitions is a particularly valuable exercise. They are what they are. If my moral intuitions are shared by a more powerful and influential group than yours, then our society will reflect...