One of the key challenges of business is figuring out what happens when “things go wrong.” The product release isn’t ready, the new feature doesn’t perform as expected, the cost of production exceeds the budget, employee turnover climbs: when these things happen, stakeholders ask questions, and business leaders need a coherent response. A fancy dashboard with robust, detailed metrics doesn’t guarantee an easily identifiable root cause. And though root cause analysis may yield a factual explanation (a mistake or a breakdown in the process, for example), that still may not explain the “why” in enough detail to prevent repeat errors.
Consider these two examples. Recently, the door plug blew off a Boeing jet in flight, and investigators discovered that bolts were missing from the door. That’s a definitive, factual explanation, but knowing this doesn’t explain how a jet made it all the way through production and into service with missing bolts. There’s obviously a lot more to this story.
Here’s another example, shared with me by an engineer who asked to remain anonymous. In the early 2000s, a global semiconductor corporation upgraded its fabrication rooms. When the floors were excavated, hundreds of ruined wafer lots were discovered underneath. In the decades before electronic tracking, employees had been secreting their mistakes under the floorboards to avoid the backlash and blame that came with a failed batch. A single failed batch cost the company a million dollars. The engineer noted that anyone brave enough to report a mistake simply disappeared and was presumed fired, which made disclosure a rare exception.
The organization in this example will never know what actually happened. These failures carried a much higher price tag than the cost of the destroyed lots, because secrets forfeit the opportunity to learn from failure. To learn from mistakes, organizations must both catch and correct errors. When mistakes are unacceptable, failure goes underground, literally in this particular case. The result is further loss from repeated errors and no chance of discovering the tactical innovations that could lower error rates or even lead to entirely new production processes.
If it’s not safe to report errors, mistakes simply become closely held secrets. Forbidding failure doesn’t stop it; the moratorium just means no one will learn from it.