An important, ongoing part of the rationalist project is to build richer mental models for understanding the world. To that end I'd like to briefly share part of my model of the world that seems to sit outside the explicit rationalist canon, but which I think is well known to most, and talk a bit about how I think it is relevant to you, dear reader. Its name is "normalization of deviance".
If you've worked a job, attended school, driven a car, or even just grown up with a guardian, you've most likely experienced normalization of deviance. It happens when your boss tells you to do one thing but all your coworkers do something else and your boss expects you to do the same as them. It happens when the teacher gives you a deadline but lets everyone turn in the assignment late. It happens when you have to speed to keep up with traffic to avoid causing an accident. And it happens when parents lay down rules but routinely allow exceptions such that the rules might as well not even exist.
It took a much less mundane situation for the idea to crystallize and get a name. Diane Vaughan coined the term as part of her research into the causes of the Challenger explosion, where she described normalization of deviance as what happens when people within an organization become so used to deviant behavior that they don't see the deviance, even if that deviance is actively working against an important goal (in the case of Challenger, safety). From her work the idea has spread to healthcare, aeronautics, security, and, where I learned about it, software engineering. Along the way the idea has generalized from being specifically about organizations, violations of standard operating procedures, and safety to any situation where norms are so regularly violated that they are replaced by the de facto norms of the violations.
I think normalization of deviance shows up all over the place and is likely quietly happening in your life right now just outside where you are bothering to look. Here's some ways I think this might be relevant to you, and I encourage you to mention more in the comments:
- If you are trying to establish a new habit, regular violations of the intended habit may result in a deviant, skewed version of the habit being adopted.
- If you are trying to live up to an ideal (truth telling, vegetarianism, charitable giving, etc.), regularly tolerating violations of that ideal draws you away from it in a sneaky, subtle way, such that you may still claim to be upholding the ideal when in fact you are not, and are not even really trying to.
- If you are trying to establish norms in a community, regularly allowing norm violations will result in different norms than those you intended being adopted.
That said, my purpose in this post is to be informative, but I know that some of you will read this and make the short leap to treating it as advice that you should aim to allow less normalization of deviance, perhaps by being more scrupulous or less forgiving. Maybe, but before you jump to that, I encourage you to remember the adage about reversing all advice. Sometimes normalized "deviance" isn't so much deviance as an illegible norm that is serving an important purpose, and "fixing" it will actually break things or otherwise make things worse. And not all deviance is normalized deviance: if you don't leave yourself enough slack you'll likely fail from trying too hard. So I encourage you to know about normalization of deviance, to notice it, and to be deliberate about how you choose to respond to it.
Specifically, the normalization of deviance Vaughan describes in The Challenger Launch Decision followed a five-step cycle:
The key thing that Vaughan identifies is that in every iteration of the above cycle, the standard that was compared against was the output of the previous iteration. Because of this, the notion of what was "acceptable" joint rotation subtly shifted over time, from a conservative standard in 1977 to a very risky one in 1986. The problem was that NASA was updating its beliefs about what was acceptable O-ring performance, but, as an organization, was not realizing that it had updated. As a result it drifted in an uncontrolled manner from its original standards, and thus signed off as safe a system that was, in retrospect, a disaster waiting to happen.
Normalization of deviance is a difficult problem to combat, because the process that leads to normalization of deviance is also the process that leads to helpful and beneficial updates about the state of the world. I would suggest that some normalization of deviance, within limits, is acceptable. The world is not always going to be what your model says it will be, and you have to have some leeway to adapt to circumstances that aren't what you were expecting. However, when doing so, it's important to ensure that today's exception remains an exception, and that the next time deviance occurs, it's checked against the original standard, not an updated standard that results from the exception process.
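The difference between these two ways of judging an exception can be made concrete with a toy simulation. This is purely illustrative (the function names, numbers, and tolerance are invented, not from Vaughan's data): one policy treats each accepted exception as the new baseline, the other always measures against the original standard.

```python
# Toy model of standards drift. Each "observed" value is a deviation we must
# accept or reject. The only difference between the two policies is what the
# deviation is compared against: the most recently accepted value, or the
# original standard. All numbers here are hypothetical.

def drifting_limit(original, observations, tolerance):
    """Each accepted exception quietly becomes the new standard."""
    limit = original
    for observed in observations:
        if observed <= limit + tolerance:   # judged against the *current* norm
            limit = max(limit, observed)    # the exception is now the baseline
    return limit

def anchored_limit(original, observations, tolerance):
    """Exceptions are always judged against the original standard."""
    limit = original
    for observed in observations:
        if observed <= original + tolerance:  # judged against the *original* norm
            limit = max(limit, observed)      # drift is bounded by original + tolerance
    return limit

# A sequence of gradually worsening deviations, each "close enough" to the last.
observations = [1.2, 1.6, 2.0, 2.4, 2.8, 3.2]

print(drifting_limit(1.0, observations, 0.5))  # 3.2: each step looked acceptable
print(anchored_limit(1.0, observations, 0.5))  # 1.2: later deviations get rejected
```

The drifting policy accepts every observation, because each one is within tolerance of the last one accepted; the anchored policy rejects everything beyond the original standard plus tolerance, no matter how gradual the approach. That gap between 3.2 and 1.2 is the uncontrolled drift described above.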