Summary: accepting objections is one of the most valuable skills a manager can learn, and yet the role models we are given fail to highlight it.
Like many other geeks, I have always been fascinated by aviation and its history. One thing that kept intriguing me was a simple question: what caused the dramatic decline in the number of plane crashes attributable to pilot error?
I recently found an answer to that question in a book by Jonah Lehrer, which credits two different factors for the increase in safety.
The first is technical progress: the introduction of commercial flight simulators allowed pilots to practice handling difficult situations before encountering them in real life.
However, the other cause is more interesting as it depends exclusively on human factors:
There was one other crucial factor in the dramatic decline of pilot error: the development of a decision-making strategy known as Cockpit Resource Management (CRM). The impetus for CRM came from a large NASA study in the 1970s of pilot error; it concluded that many cockpit mistakes were attributable, at least in part, to the “God-like certainty” of the pilot in command. If other crew members had been consulted, or if the pilot had considered other alternatives, then some of the bad decisions might have been avoided. As a result, the goal of CRM was to create an environment in which a diversity of viewpoints was freely shared.
― Jonah Lehrer, How We Decide
Similar strategies were later adopted by other professions in which decisions play a crucial role, such as fire-fighters, naval officers and surgeons, to the point that Cockpit Resource Management was eventually renamed Crew Resource Management to encompass its applications outside the airline industry.
The CRM program is based around a mantra: “See it, say it, fix it”, which implies that:
- anyone is able to identify problems, issues or warning signs, not just the person in charge;
- whenever some warning sign is spotted, the person making the discovery has the duty to report it;
- the reporter of a problem should suggest (or start looking for) a possible solution immediately.
What is even more interesting is that, having been born in a context where hierarchy is extremely important, CRM training puts significant stress on the communication between supervisors and subordinates. It is so easy for supervisors to mistake objections for insubordination that they are trained to accept them, and specific training is given on how to question authority without sounding threatening. For example, fire-fighters are advised to follow these five steps when advocating their own position:
- An opening statement using the addressed person’s name (“Dave,” “Captain,” “Chief”)
- Stating your concern as an owned emotion (“I think we are heading for a problem…” )
- Stating the problem as you see it (“It looks like that building is getting ready to flash”)
- Offering a solution (“I think we should evacuate the interior crews right now”)
- Obtaining agreement (“Do you agree?”)
― International Association of Fire Chiefs, Crew Resource Management, A positive change for the fire service
Software engineering is a different context, as no lives are at risk during the development of most software projects. However, I believe we could benefit greatly from this kind of attitude.
It is often the case that bad decisions made early in the life of a project carry significant consequences, consequences that could have been avoided if team members had voiced their opinions more effectively.
We are trained to care for the members of our teams and to respect their opinions, but we are rarely taught to accept objections. If we are leading other people, we should do whatever we can to make them feel at ease expressing their dissent with what we say, even when it hurts our ego.
If we are part of the team, we have a responsibility to object when we see an incoming disaster, but our objections should never sound like attacks.
Most of the role models we venerate highlight uncompromising leadership, vision and strength. But it is better to be proven wrong earlier than to be judged in hindsight after a catastrophe.