Uncovering Hidden Bias: The Silent Killer Of Projects, Products, and Transformations

A large naval ship powers steadily through a fog bank. Suddenly, a light emerges in the distance, and it appears to be coming straight for them.

Alarmed, the captain raises the other ship on the radio.

“Change course immediately,” he says.

“I think you’ll have to change course,” comes the reply.

“Absolutely not!” shouts the captain. “This is a US Navy vessel. You change course!”

“That’s great, captain. But I’m afraid this is a lighthouse.”

Decisions, Decisions…

Decision making is a key facet of leadership. But leadership doesn’t mean making all of the decisions yourself. It means knowing which decisions to make and which to delegate to others. It also means knowing how hidden cognitive bias can affect any individual’s decision making, and setting up systems that take advantage of a diversity of perspectives to arrive at the best possible decisions.

Take budgets and planning for IT projects as an example. In 2011, McKinsey research reported that large IT projects run, on average, a whopping 45 percent over budget.

This is the planning fallacy at work.

The planning fallacy is a phenomenon in which predictions about how much time will be needed to complete a future task display an optimism bias and underestimate the time needed. — Wikipedia

Estimates have always been a thorn in the side of tech leaders. There is a lot of uncertainty involved in technology projects, and business stakeholders are keen to know exactly when new systems will be available, when certain bugs will be fixed, and when integrations or migrations will be complete, freeing up critical staff for other important work.

Estimates are often made under pressure and without enough solid data to make an accurate prediction. Teams under pressure tend to fall back on unreliable methods of estimating, or to rely on the opinion of the most senior person on the team. Failing to take into account a wide range of perspectives is just about guaranteed to produce a bad estimate.

Under Pressure

The widely acclaimed book “Thinking, Fast and Slow,” by Nobel Prize winner Daniel Kahneman, presented concrete experimental evidence showing how stress from either physical or mental exertion dramatically impairs the brain’s ability to make decisions.

The mind, explains Kahneman, can be thought of as using two distinct systems for cognition. The first system, System One, is fast and performs its work automatically or instinctively. It cannot be directly observed or modified by the thinker because it happens so quickly. This is the part of the brain that evolved on the African savanna, where quick evaluation often meant the difference between life and death.

System Two is the part of the brain that is able to do advanced planning and complex calculation. It is responsible for analytical thinking. Unfortunately, System Two is also energy intensive and gets easily tired. Kahneman even goes so far as to call it “lazy.”

System One is useful in a crisis situation because it enables us to size up a threat in milliseconds, and react to it. Unfortunately, in complex human societies, it is quite often wrong. System One is where our cognitive biases live. It feeds us erroneous information that our slower, more deliberate System Two must carefully sort out.

However, when System Two is placed under stress, it quickly loses its ability to do its work of analyzing data and coming up with correct, or at least reasonable, answers. Meanwhile, System One is constantly feeding it erroneous warnings about external threats. It’s like a tired and overworked scientist being assisted by a frenetic and excitable assistant who is afraid of everything.

What Kahneman’s research implies is that the more stress you and your team are under, the greater the likelihood that they will be unable to counter their own biases in tough debates or discussions. It will also be harder for them to find creative and novel solutions in complex systems or scenarios.

Feelings Are Data

It is critical not to mistake emotional responses for bias.

Emotions matter considerably in conflict-rich environments. As former FBI hostage negotiator Chris Voss describes vividly in his excellent book “Never Split The Difference,” emotions are everything in negotiation, even negotiations with international terrorists. Voss emphasizes that the key to successful negotiation is building empathy with your counterpart. Empathy requires being in touch with both your own emotions and those of the other party.

Emotions are part of your thinking process, and they contain valuable information if you know how to interpret them. The leading-edge neuroscience of Lisa Feldman Barrett, illustrated in her book “How Emotions Are Made,” shows that what we tend to think of as “feeling” and “thinking” are actually part of the same cognitive process. Feelings are simply responses by your body in anticipation of either positive or negative outcomes based on data coming in from your senses. The brain is constantly readjusting its model of the world, and your emotions are a barometer representing your body’s physiological responses to that data.

It’s not possible to remove all emotion from your judgment. Emotions are part of the decision-making process. Instead, it’s better to distribute the load of decision making across more minds, from a variety of perspectives. That will get you a better result than simply trying to be “logical.”

Successful leaders are able to create the conditions for good decision making. They protect their teams from undue stress and pressure. They involve a wide range of perspectives in discussion and planning. They are slow and deliberate in making decisions, and careful to avoid hidden bias whenever possible.

Want to learn more? Check out our Mindful Leadership Accelerator, where we work with leaders on these and other techniques, now accepting applicants.
