How can you trust the quality of your processes if your people can’t be honest with you?
Quality & Economics
The question of quality runs far deeper than business. When quality fails at the societal level, we fail each other. The real danger then is that we fail to govern efficiently and fairly.
When I talk about “honesty” in the context of quality, people often mistake my intent. I do not mean that people are dishonest, that poor quality is a sign of dishonesty, or that people do not know right from wrong. I most definitely do not suggest that quality is in any way elitist, or that the character of a people, and the societies to which they belong, can be measured by someone else’s impression of quality.
The kind of honesty I point to is the kind that helps people like you and me avoid mistakes in our businesses, or, when mistakes do happen, helps us correct them quickly. This kind of honesty pushes up good data, right actions, and a positive future.
Here are some examples of what happens when honesty in quality does not rise to the top:
- What happened that allowed the Deepwater Horizon drilling platform to explode and release millions of gallons of oil into the Gulf of Mexico? Detailed testimony in the court case against BP suggests that engineers and management were aware of possible problems but ignored the warnings. Management put enormous pressure on people to get the well producing as fast as possible. Management never said, “do this at all costs,” but the managers of the drilling platform felt they had no choice. So they pushed as hard as they could and took many risks along the way.
- The politicians who enacted the Budget Control Act of 2011 were swamped by data. Supporters claimed that budget sequestration was necessary to reduce America’s dependence on deficit spending. All sides agreed, however, that such a law could harm America’s military readiness. The deciding factor was a political impasse that forced everyone’s hand. Less than a year after the law was enacted, a top-of-the-line nuclear attack submarine was severely damaged during a drydock refit, in part because budget limits left too few resources to secure it. After the accident, the US Navy found it didn’t have the budget to repair the damage, so a $1 billion piece of military equipment ended up in the scrap yard instead.
- The Japanese government and a national power company claimed that the Fukushima nuclear reactor disaster was the result of unforeseen circumstances. Reports have since come out that they had received many warnings about the Fukushima installation, from the time it was planned in the 1960s through the day it was commissioned in 1972. The warning: the plant was vulnerable to tsunamis and earthquakes. None of the members of the management team had ever lived through a great earthquake and tsunami. Such events had occurred throughout Japan’s history, but so infrequently that management considered the possibility extremely small.
These are examples of people misusing data because they chose to ignore what they believed were fringe possibilities, not realizing how their decisions endangered themselves and others. When such decisions produce disastrous results, people often make the irrational claim that management was deliberately dishonest. Nothing could be further from the truth. If anything, a massive failure like the Deepwater Horizon disaster shows how large organizations make bad decisions based on how they manage risk. In the case of the Budget Control Act of 2011, “management” (the legislature) knew there were problems but chose to ignore them because there was no political will to reach for better solutions. Fukushima was again an error in managing risk, compounded by ignoring the fact that their engineers had good solutions that could have been retrofitted to the plant.
Large corporations, and especially governments and government agencies, rely heavily on management consensus for their big decisions. The agreement may rest on several criteria: risk assessments, mitigation, cost factors, profitability, and so on. No single factor drives any decision, especially when political elements are involved. And that is exactly how disasters can happen.
In the cases above, it is very likely that few people thought the risk levels were high enough to change their minds or to offer alternative solutions. More likely, they never heard the warnings at all because, by then, the signals had been filtered out or diluted to the point that they didn’t stand out. Ultimately, the decision makers did what many people do: they just went with the flow.
To me, “going with the flow” means that you are not listening to the data, you are not enriching your actions, and you are not optimizing your future. When you make decisions the way they did at BP or Fukushima, you are taking an average of the possible answers and hoping nobody notices you took a shortcut. The problem is that, these days, even the most innocuous-looking shortcut can set off a terrible global disaster.
When you have real honesty in management, information flows freely: you get good data, right actions, and a positive future. If you do not have honesty from everyone in your organization, how can you make the right decisions? How can you trust the quality of your processes if your people can’t be honest with you?