The problems described are applicable to corporations and collective action in general.
The book Moral Mazes chronicles a sociological study from the 1980s, in which a sociologist spent years living with the savages of corporate bureaucracy, learning their customs, behaviors, and morals. The irrelevancy of truth, the shifting of blame, the feudal relationship to those higher up, the obligation to keep knowledge from your bosses so that they could avoid responsibility, the situational ethics totally contradicting everything they otherwise profess in life - all of it is there, fully documented in the study.
Orwell and Hayek had it right. Organizations will get predictable results depending on how knowledge, power, and responsibility are aligned, despite the clear global suboptimality (read: abomination) of the solution to the individuals involved. Align those things wrong, and you get Big Brother and a boot stomping a human face, forever. Align them differently, and you get the modern corporation, or government bureaucracy, or the open source and maker movements.
Why Software Projects are terrible and how not to fix them (by Drew Crawford):
In other words, it's all about signaling, isn't it? Managers will take actions that actively harm the continued progress of the project if those actions make them look "decisive" and "in charge". I've seen this on many projects I've been on, and it took me a while to realize that my managers weren't stupid or ignorant. It's just that the organization I was working in put a higher priority on process than on results. My managers, therefore, quite rationally did things that maximized their apparent value in the eyes of their bosses, even if it meant that the project (and, as a result, the entire organization) was hurt.
Crawford then goes on to detail why organizations with such maladaptive practices survive:
I think this is something that we as rationalists sometimes forget: irrationality has momentum. Humans had been thinking intuitively for thousands (hundreds of thousands, even) of years before we figured out how to think with rigorous rationality. Even if rationality had a massive advantage over intuitive thinking in everyday situations (it doesn't, as far as I can tell), it would take a very long time for rational thought to propagate through society.
So the next time you encounter some bit of wanton irrationality, remind yourself, "Momentum," before you get frustrated.
EDIT: Fixed spelling as per RolfAndreassen's post.