Something that has fascinated me, in the way video of a major disaster fascinates me, is watching big organizations and seeing how they act like idiots. It happens often enough that I coined an acronym for it: BOTALI, Big Organizations That Act Like Idiots.
Can you think of any candidates? I certainly can: the U.S. government, most government agencies, NGOs, the UN... too many to count. The private sector as well: big companies, private charities, big churches. In fact, almost any large organization will act like it's being run by idiots, unless it instead acts like it's being run by a cabal devoted to opposing the very goal the organization is supposed to serve.
But what can we do about it? Are we simply doomed to suffer the slings and arrows of a platoon of incompetent morons?
For the moment, let's put aside the organizations that are a more or less obvious grift. We're seeing plenty of those — DOGE is digging them out, and the grifters are fighting back with everything from lawfare to firebombs and bullets, but they deserve an article of their own.
Instead, let me pose a proposition: the grifts succeed because the organizations charged with overseeing and auditing them, yes, act like idiots.
There are innumerable big organizations that act like idiots even though they are filled with individuals who wholeheartedly believe in the organization's goals. In some abstract way.
It is such a common phenomenon that during World War II it picked up its own acronym: SNAFU, "Situation Normal All Fouled Up". (Of course, that's not what the F really is, but Google doesn't like naughty words in websites for which it provides ads.)
Robert Shea and Robert Anton Wilson, in their novel "Illuminatus!," first published in three parts in 1975, proposed something they called "The SNAFU Principle." As they stated it, the SNAFU Principle is that "communication is only possible between equals." The point was that if two people stand in a superior-inferior relationship, there won't be effective communication between them.
It is a valid observation — everyone has had the experience of difficulty communicating with a boss or getting complete information from a subordinate — but I don't think it is quite sufficient. To explain it, I'm going to use a very tiny bit of Claude Shannon's information theory, but I promise I won't resort to any math.
Unless you want the math... Okay, no one's clamoring for more math.
Flashback: The Greatest Genius No One Has Heard Of
Basically, Shannon observed something very general about communication between two endpoints. The line of communication is called a channel, and Shannon's observation was that any channel has a limit on the maximum amount of information it can transmit in a given time. That limit is called the bandwidth of the channel, and it's measured in bits per second.
Here on the internet, the idea is actually familiar because we talk about it whenever we talk about our internet connections. Right now in Wichita, I get about 900 million bits per second to my home; back in the '70s, we had audio modems that got 300 bits per second and were thrilled to step up to 1,200 bits per second.
Information theory gives us a tool to talk about communication, though. As well as defining the maximum bandwidth of a channel, Shannon observed that some channels are imperfect. They can be noisy, which means just what it sounds like: an imperfect channel makes it harder to understand the message being transmitted, just like a bad phone connection or trying to talk in a noisy bar.
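To make the noisy-channel idea concrete, here's a toy sketch in Python. This is my illustration, not anything from Shannon's work; the function name and the flip probabilities are invented for the example. It models the simplest noisy channel there is, one that randomly flips each transmitted bit:

```python
import random

def noisy_channel(bits, flip_prob, rng):
    """A binary symmetric channel: each bit is flipped with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in bits]

rng = random.Random(42)                    # seeded so the demo is repeatable
message = [1, 0, 1, 1, 0, 0, 1, 0] * 125   # a 1,000-bit message

for p in (0.0, 0.05, 0.25):
    received = noisy_channel(message, p, rng)
    errors = sum(m != r for m, r in zip(message, received))
    print(f"flip probability {p:.2f}: {errors} of {len(message)} bits corrupted")
```

With no noise the message arrives intact; as the flip probability rises, more and more of what the receiver gets is not what the sender sent, exactly like the bad phone connection.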
So let's define a more refined SNAFU principle:
In any communication between people, the noise increases with the difference in their status or hierarchical position.
In other words, when a boss is talking to a subordinate, what the subordinate hears is filtered by, well, any number of things — fear, confusion, or simply because the subordinate doesn't know everything the boss assumes everyone knows. Too often, it's filtered by the subordinate trying to tell the boss what he thinks the boss wants to hear. And when the subordinate talks to the boss, what the boss hears is filtered by any number of things, of which the most important is that the boss tends to hear what the boss wants to hear. A second factor is that the boss doesn't really know the job of the subordinate.
Now think about a subordinate talking to the boss's boss: all of these factors come in, and probably more strongly. What's more, if the subordinate is talking to the boss, and the boss is talking to the boss's boss, then the message is filtered twice.
Of course, the SNAFU effect shows up in situations with no obvious formal hierarchy as well. There is a strong presumption on the part of the press that they know better, and thus that their coverage should reflect their superior knowledge and expertise.
This is a real problem in the world of intelligence. In my time in the intelligence community, we on the collection end would pass an intercept up the line, where it was turned into a gist and passed to the first-level analyst, who then wrote essentially a term paper that was collected by a second-level analyst and passed to the Branch Chief, who summarized it, or at least decided what was important, before it went on to the Director of National Intelligence to be collected into the President's Daily Brief (PDB). At each stage, the person at that layer decides what to say based on what he thinks the boss wants to see, or on what the DNI wants them to see.
The result is that the PDB says whatever the DNI thinks it's advantageous to say.
We saw the result of this back in the run-up to the Iraq War. It's often forgotten — conveniently — that the opinion of the intelligence services, not just in the U.S. but among the U.S.'s allies, was that Iraq had an active program to develop weapons of mass destruction. One of the results of this shared opinion was the Iraq Liberation Act of 1998, which President Clinton signed on Halloween.
Of course, after the Iraq War, we didn't find a lot of WMD. Once again, the SNAFU filter got involved, and it became common knowledge that President Bush had lied about WMD. A few years later, the New York Times reported that American soldiers had been injured when they discovered Iraqi troves of WMD, leaving the Times in the interesting position of having asserted both that Bush lied about the existence of Iraqi WMD programs and that such weapons existed after all.
The result in all these cases is that people on the receiving end of information don't get the whole story and often instead are getting the story as filtered through a bunch of people's slants and biases.
So part of the answer is simply that the people on top who make the decisions are certain not to be getting the best information, and often they're getting bad information, purposefully or not. It's no surprise that even the best, most well-intentioned people in a big organization act like idiots.
(This is part one of a two-part series. Part Two: The Climber Competence Paradox: How Idiots Get To Be In Charge.)