(Epistemic status: Pretty sure the premises are all correct and the metaphor is accurate. Exploring the implications and conclusions.)
A programmer was given an aging piece of software with which to complete a project. To the company, using the old software seemed like the path of least resistance. But, opening the program and looking at the code, the programmer saw it had been hacked, edited, and added to by twenty or more programmers before her, each with their own style and vision of what a good program should be, each according to the conventions and concerns of the day.
"Ahhh, legacy code," the programmer said. She frowned a little.
By now, the software was full of unused code and a maze of lines that referred to other lines that referred to yet other lines, and so on. The programmer found it almost impossible to sort out what many parts of the program were designed to do. Worse, though the company's computers were high-tech, parts of the software were written to run on machines from decades before. The program was a nightmare to work with: normally one had to doctor the finicky inputs, and the software often returned incorrect, puzzling, or even meaningless outputs.
When software like this does anything useful at all, it is a small miracle.
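For the programmers in the audience, here is a toy sketch of the kind of code she might have faced. It is entirely hypothetical (every name is invented for illustration), just a miniature of the smells described above: dead code, chained indirection, and inputs that callers learn to doctor rather than fix.

```python
# A contrived miniature of legacy-code smells. All names are hypothetical.

def _legacy_rate_1987(x):      # dead code: unused since a forgotten rewrite
    return x * 0.0425

def _step_a(value):
    return _step_b(value)      # indirection: a refers to b...

def _step_b(value):
    return _step_c(value)      # ...which refers to c...

def _step_c(value):
    # ...which finally does the work, but expects the string format
    # of a system from decades before, and chokes on anything else.
    if not isinstance(value, str):
        raise TypeError("expected an old-style string amount")
    return float(value.replace(",", ""))

def process(amount):
    """The finicky entry point: callers must pre-format ("doctor") input."""
    return _step_a(amount)

# Callers learn to doctor the input rather than fix the chain:
doctored = f"{1234.5:,.2f}"    # 1234.5 -> "1,234.50"
print(process(doctored))       # works; process(1234.5) would crash
```

Nobody in this sketch is stupid; each layer made sense to someone at some time. That is exactly the trouble.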
Most organizations, social hierarchies, and belief systems are heavy with garbage. There is no one to blame, because most of those systems are constructed from layers of intentions and interpretations spanning years, decades, or even centuries. Hence things like a legal system with thousands of pages of code and case law. Plenty of these systems evolved around social realities that have changed immensely since they were designed. Meanwhile, your own experience tells you that no matter how dysfunctional or downright harmful it is, no organization, belief system, or social structure will ever agree with you that it is wrong. For the most part, none of these systems were designed to eliminate themselves once they are no longer useful.
Once a system takes root in society and ages, it outlives the agency, intelligence, and will of the people who built it. Such a system is usually very hard to remove even once it starts to work against its original intention. Most people either thoughtlessly adopt or begrudgingly embed themselves in whatever systems are presented by their culture and society. Doing so usually seems to be the path of least resistance, even necessity. Thus, many social structures and systems are aging and barely functioning software, legacy programs running on human hardware.
The programmer with the messy program did what any sensible software engineer would have done in her shoes: she set aside the old software and built something new that did what was needed, using the full power of the company's modern computers.
Regarding Moral Obligations to Systems:
No one in history has ever died wishing they had paid more dues to hierarchies, bureaucrats, and society’s systems. Yet that is what those systems always seem to want: More. People treat them as basic truths, but do we owe such systems any more loyalty than we would give to old computer programs?
The next time old software gets hung up on a procedure unless you feed it the doctored inputs it likes to see, why even hesitate? When you recognize opportunities to bypass or delete such software, should your default be to seize them?
And finally, the clear implication is that we should be writing new software; attempts to "tweak" the old system will probably just pile on more garbage and spin it further out of spec. I think W. Edwards Deming would have argued strongly that this is the case. That could be an even stronger moral impetus to delete or bypass such a system.
I see the main conflict in my reasoning coming from people who have embedded themselves by default in the systems around them. It would be like all the people who accepted a bloated Windows install, because it's all their org and Best Buy ever gave them, being told we're all switching to Linux. Maybe then the moral obligation is to facilitate "soft landings" for those already deep in the current systems.
Is that really the conflict, or does it merely appear to be one because someone older is more likely to be deeply embedded?
My dad doesn't think Windows is better than Linux or Mac. He sees me with openSUSE and openly derides Windows all the time, but he figures he doesn't want to learn a whole new system. He's past EOL on Win7 at this point, but is so embedded in it, right down to Excel for his accounting (he was an actuary, on Excel from the '80s through the 2000s).
Also, I have not argued that every new way is good. Some older technologies are extremely good. Top-of-head example: no one who has ever used film and worked in a darkroom would say that working in Photoshop could ever fully replace that experience. Or another: I hate turning on my computer to do anything with music. The screen/mouse/keyboard interface does nothing nice for my creativity. And oh my goddess, how cool the whole thing can sound and come together on a four-track!