
Data, Action, and Future



Quality & Economics

The question of quality runs far deeper than business. When quality fails at the societal level, we fail each other, and the real danger is that we fail to govern efficiently and fairly.



When I talk about “honesty” in the context of quality, people often mistake my intent. I do not mean that people are dishonest, that poor quality is a sign of dishonesty, or that people do not know right from wrong. Nor do I suggest that possessing quality is in any way elitist, or that the character of a people, and of the societies to which they belong, can be measured by someone else’s impression of quality.

The kind of honesty I point to is the kind that helps people like you and me avoid mistakes in our businesses, or, when mistakes do happen, helps us correct them quickly. It is the kind of honesty that pushes up good data, right actions, and a positive future.

Here are some examples of what happens when honesty in quality does not rise to the top:

    • What allowed the Deepwater Horizon drilling platform to explode and release millions of gallons of oil into the Gulf of Mexico? Detailed testimony in the court case against BP suggests that engineers and management were aware of possible problems but ignored the warnings. Management put enormous pressure on people to get the well producing as fast as possible. No one ever said “do this at all costs,” but the managers of the drilling platform felt they had no choice. So they pushed as hard as they could and took many risks along the way.
    • The politicians who enacted the Budget Control Act of 2011 were swamped by data. Supporters claimed that budget sequestration was necessary to reduce America’s dependence on deficit spending. All sides agreed, however, that such a law could affect America’s military readiness. The decider was a political impasse that forced everyone’s hand. Less than a year after the law was enacted, a top-of-the-line nuclear attack submarine was severely damaged during a drydock refit, in part because budget limits had cut back the resources to secure it. After the accident, the US Navy found it did not have the budget to repair the damage, so a $1 billion piece of military equipment ended up in the scrap yard instead.
    • The Japanese government and a national power company claimed that the Fukushima nuclear reactor disaster was the result of unforeseen circumstances. Reports have since emerged that they received many warnings about the Fukushima installation, from the time it was planned in the 1960s through the day it was commissioned in 1972. The warning: the plant was vulnerable to tsunamis and earthquakes. No member of the management team had ever lived through a great earthquake and tsunami. Such events had occurred throughout Japan’s history, but so infrequently that management considered the possibility extremely small.

These are examples of people misusing data because they chose to ignore what they believed were fringe possibilities, not realizing how their decisions endangered themselves and others. When such decisions produce disastrous results, people often make the irrational claim that management was deliberately dishonest. Nothing could be further from the truth. If anything, a massive failure like the Deepwater Horizon disaster exposes serious flaws in how large organizations manage risk and, as a result, make bad decisions. In the case of the Budget Control Act of 2011, “management” (the legislature) knew there were problems but chose to ignore them because there was no political will to reach for better solutions. Fukushima was again a failure of risk management, compounded by ignoring the fact that their engineers had good solutions that could have been retrofitted.

Large corporations, and especially governments and government agencies, rely heavily on management consensus for their big decisions. The agreement may rest on several criteria: risk assessments, mitigation, cost factors, profitability, and so on. No single factor drives any decision, especially when political elements are involved. And that is how disasters can happen.

In the cases I mentioned above, it is very likely that few people thought that the risk levels were high enough to change their minds or offer alternative solutions. More than likely, they never heard the warnings because, by that time, the signs had been filtered out or diluted to the point that they didn’t stand out. Ultimately, the decision makers did what many people do: they just went with the flow.

To me, “going with the flow” means that you are not listening to the data, you are not enriching your actions, and you are not optimizing your future. When you make decisions as they did at BP or Fukushima, you are taking an average of the possible answers and hoping nobody notices that you took a shortcut. The problem is that, these days, even the most innocuous-seeming shortcut can set off a terrible global disaster.
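The danger of “taking an average of possible answers” can be made concrete with a small sketch. The numbers below are entirely hypothetical, not drawn from any of the cases above: they simply show how a probability-weighted average can look tolerable while the decision quietly accepts a catastrophic worst case.

```python
# Hypothetical risk assessments for a project decision:
# (probability, loss in $ millions) for each scenario.
scenarios = [
    (0.40, 1),     # minor delay
    (0.30, 2),     # rework required
    (0.20, 5),     # partial failure
    (0.09, 20),    # major failure
    (0.01, 5000),  # catastrophic blowout
]

# "Going with the flow": look only at the probability-weighted average.
expected_loss = sum(p * loss for p, loss in scenarios)

# Listening to the data: also ask what worst case you are accepting.
worst_case = max(loss for _, loss in scenarios)

print(f"expected loss: ${expected_loss:.1f}M")  # a modest-looking number
print(f"worst case:    ${worst_case}M")         # the disaster hiding in the tail
```

The average here comes out to about $54 million, which a committee might deem acceptable, while the same set of estimates contains a $5 billion outcome that the averaging step has effectively filtered out, exactly the dilution of warnings described above.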

When you have real honesty in management, information flows freely. You have actual data, compelling action, and a positive future. If you do not have honesty from everyone in your organization, how can you make the right decisions? How can you trust the quality of your processes if your people can’t be honest with you?

A Moment of Truth for the Solar Panel Industry

I recently read a commentary in the New York Times (“Solar Industry Anxious Over Defective Panels,” May 25, 2013), and something sounded familiar. Solar panels expected to have a 25-year life span are failing. Coatings are disintegrating, and other defects have caused fires. Reports are coming in from around the world. The $77 billion solar photovoltaic industry is facing a quality crisis.

A Little Salmonella May Not Kill You, but it May Kill your Economy

After salmonella was discovered in a flavor-enhancing ingredient, a wide range of processed foods was recalled, including soups, snack foods, dips, and dressings, the result of poor quality control. Food and Drug Administration officials noted that the ingredient, hydrolyzed vegetable protein, was used in thousands of food products. The FDA and the Centers for Disease Control and Prevention said no illnesses or deaths had been reported, so far.

Subir Chowdhury Fellowship on Quality and Economics at Harvard University

Expanding the outreach of Subir Chowdhury's global call for quality throughout society, at all levels, a Fellowship on Quality and Economics has been established at the Harvard University Graduate School of Arts and Sciences. The goal: to explore the impact of quality and economics in the United States.

Whose political crisis is this, anyhow?

I am deeply troubled by the increasing pace of self-inflicted crises in our government and economy. We have witnessed one event after another over the last several years, each with seemingly greater consequence and damage. Not surprisingly, all of this has happened under the watchful eyes of two of the least productive congressional sessions in history.