Navigating organizational dysfunction: lessons from Boeing

“If it ain’t Boeing, I ain’t going”: at one time, that was public opinion of an engineering company run by engineers. Today, perception is rather different. The multinational conglomerate and defense contractor owns some staggering recent mishaps: failed safety systems, a blown-out door plug, a string of unexplained technical issues. A disconnect between senior management and hands-on employees is widely reported, and analysts consider these events symptoms of a longer-standing organizational pathology finally coming to light.

In American business, executives often win leadership positions in part by prioritizing superficial targets ahead of customers’ (or patients’) best interests. When dysfunction is this common, it is seldom singled out. Boeing may have amassed too much evidence to ignore, but it is not alone.

When it comes to safety, we focus on outcomes. But beneath outcomes lie processes, and beneath processes, culture. Most bad outcomes don’t result from individual error; they arise from deeper systemic flaws.

Whether the company is a hospital system or an aerospace manufacturer, a strong culture looks the same: people are trained, empowered, and mentored in ways that allow them to grow as contributors. Individuals form high-functioning teams with complementary skill sets, driven by a shared mission and common goals. Work aligns with beliefs. Waste is minimized. Long-term perspectives come naturally. Education serves to advance knowledge and skills. People are encouraged to think, to ask questions, and to offer suggestions that are considered and, when they’re good, implemented. Risk is proactively sought out and minimized. The whole is better than the sum of its parts.

In a weak culture, training looks like computer modules that mainly shift blame onto frontline staff when the inevitable error comes. Individuals don’t cooperate well because their goals are not the same. Work might align with beliefs and values; it might not. Focus is narrow—sometimes as narrow as a few spreadsheet cells. Waste is everywhere because nobody sees the whole picture. Managers police compliance because they aren’t trained to do any better. Policies can be incongruous with the company’s purpose and may create only the appearance of something meaningful. Highly educated people function as technicians. Risk flourishes. Departments blame one another. The whole is worse than the sum of its parts—sometimes impressively so.

A 2022 review by the Office of Inspector General at the Department of Health and Human Services found that more than 25 percent of Medicare patients experience preventable harm while hospitalized. By Medicare’s definition (harm to a patient as a result of medical care or in a health care setting, including the failure to provide needed care), this must be a conservative estimate. Every elder who loses strength and endurance in the hospital because administrators won’t staff enough people to deliver simple care, such as keeping patients walking, is a patient harmed. Every person whose diagnosis is delayed by systems staffed only to find an acute problem for which to admit and bill them is a patient harmed.

How do aviation and health care connect? In both, we report surface outcomes when we should be looking deeper, into culture and process, into systems and risk. These are problems not merely of unfortunate results but of preventable risk: of allowing, through insufficient training, monitoring, and feedback, a certain likelihood of an adverse event.

One danger of dysfunctional systems is that they insulate the people actually at fault from accountability and push solutions further out of reach. Systems come to be treated as too complex and remote to grasp, rather than as ideas created and sustained daily by people. Hospitals are not this way by accident: it is easier and, in the short term only, more profitable to function this way than to recruit, train, and empower skilled managers and leaders who can start the engines of continuous improvement and build great companies. Those solutions demand real commitment, and commitment cannot be faked.

Safety is not a set of processes that managers can force compliance with and expect great results. When the underlying culture is right, safety is the natural result of letting systems function; when it is not, safety is perpetually out of reach. This is as true in any industry as it is in aviation or health care, and sooner or later it affects each of us: through dissatisfaction with our work, through the care we ourselves must pursue, or through wasteful spending of our tax dollars.

The root failure has been here all along, but at Boeing a few visible errors became the catalyst that drew attention to fundamental inadequacies. Many health systems are just one public mistake away from the kind of scrutiny that will bring deep and long-standing insufficiencies to light. Few are doing things as well as patients and health care workers deserve, and brand prestige is not strongly correlated with the quality of the underlying culture. The good news is that it doesn’t take an accident or injury to start fixing these problems: beginning today is good for patients and good for business.

John Corsino is a physical therapist who blogs at his self-titled site, Health Philosophy.
