A Project Decision
Imagine you are in charge of a project to upgrade your library’s computer systems. After a year of searching, you made the decision to buy $700,000 worth of new computers. So far, $500,000 has been spent over the past year on non-refundable equipment purchases. As you prepare to place the last order for $200,000, another vendor approaches with a new offer. They can sell you the same number of computers, but these are twice as fast with twice as much memory, for only $300,000. They are not compatible with the first set and would completely replace them.
Do you keep purchasing the initial set of computers or do you switch over to the new set? Think carefully about your choice.
Known and Unknown
“Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tend to be the difficult ones.” – Donald Rumsfeld – US Secretary of Defense – 2001-2006
When Secretary Rumsfeld initially made this statement he became the subject of much ridicule. At first glance it seemed like bureaucratic nonsense with no connection to the real world. Yet over time the statement has been re-evaluated. In fact, it has become so connected with him that Secretary Rumsfeld titled his own memoir Known and Unknown.
The act of leadership requires decisions be made on a constant basis. The challenge is that leaders must often do so in the face of limited knowledge and varying degrees of uncertainty. Rumsfeld’s analysis breaks down the types of challenges facing decision makers.
What is a Decision?
What was the last tough decision you made? Why was it hard? What were the stakes? Who was impacted?
Merriam-Webster online dictionary defines the word decision in a few ways:
- the act or process of deciding (“the moment of decision has come”)
- a determination arrived at after consideration
We make decisions constantly every day of our lives. Most of our decisions are small and only affect ourselves. However, leadership decisions directly affect others, ranging from a small few to an entire organization. Therefore, it is important for leaders to understand the art and science of decision making not only for themselves, but for their colleagues and customers.
Decision making can be very challenging for leaders because of its impact. Since leadership decisions often have public implications, leaders are subjected to second guessing regularly. Psychologically, the very act of making a decision is stressful because of the risk of making the wrong choice. Every decision we make means all other options were rejected in favor of one choice. This sense of finality can be scary. Sometimes people avoid making a decision altogether. However, not making a decision is itself a form of decision making, with consequences of its own. Deferring decisions may be useful at times, but often there comes a point where making a decision is unavoidable.
The Search for Homo Economicus
Although it is easy to view the discipline of economics as simply about money and prices, it is at its heart the science of how decisions are made. That is, the economy is the sum total of the decisions made by the people participating in it. Any market, whether it be for houses, concert tickets, or collectable shoes, has its price largely decided by supply and demand and bounded by regulations and laws. Economics is full of models that seek to determine how people operate in markets, along with understanding the obstacles in the process. The sub-field of Behavioral Economics specifically examines how people actually make decisions.
At the core of classical economics is a fictitious being called Homo Economicus. He is the completely rational man who makes optimal decisions based on his own well-defined set of interests. For example, when Homo Economicus is shopping for a used car, in theory his preference criteria would be clearly ranked in advance, such as:
- Fuel Economy
- Repair History
He would not be confused by dealer sales, car color, stylish floor mats, or the time of day, week, or month. Homo Economicus would move dispassionately through his criteria in a quest to make the optimum decision between Car A, Car B, or Car C. However, when economists search for Homo Economicus in the real world, it is like hunting for Bigfoot. Rumored sightings abound but he is never found.
The reason why Homo Economicus is a myth is simple: humans do not think rationally without a lot of effort! Our emotions and embedded cognitive misunderstandings constantly lead us astray. We are often driven by fear, anger, love, and beauty. These emotions pass quickly, yet can strongly influence our decisions in the heat of the moment. Many people buy a car based on its color and styling, attributes that have little economic impact but lots of emotional weight. One need only remember how the pandemic led to a rush on toilet paper. The fear of being without those little white squares drove people to stockpile the product even though supply lines were functioning normally. Homo Economicus we are not!
Your Brain on Fallacies
Our minds’ ability to think rationally is a developed skill, one that even in experts can quickly falter. Our brains were built in an age when life and death decisions were made on the savannas of Africa. Unfortunately, they struggle to adapt to the complex modern world, where evolutionary cognitive shortcuts lead to faulty decision making. These modes of thinking are known as fallacies. Out of the hundreds of documented fallacies, there are a few major ones that seriously impact decision making.
Here are five examples that have a huge effect on us:
Sunk Cost Fallacy – A cost that has already been incurred and cannot be recovered. The fallacy is continuing on a path based on the amount of time and resources already expended rather than weighing the value of stopping now. Summed up in the phrase, “Throwing good money after bad.”
How to Avoid It: Think like a new boss. “If I had no attachment to this project would I keep it going?”
In the midst of the doldrums of 1985, Intel leadership was debating moving the core business from memory chips to microprocessors. President Andy Grove posed a hypothetical question to his colleague Gordon Moore: “If we got kicked out and the board brought in a new CEO, what do you think he would do?” Moore answered without hesitation, “He would get us out of memories.” To which Grove responded, “Why shouldn’t you and I walk out the door, come back and do it ourselves?”
Grove also stated the following to his leadership: “If existing management want to keep their jobs when the basics of the business are undergoing profound change, they must adopt an outsider’s intellectual objectivity.”
Let’s look back at the opening question regarding computer purchases. When making the decision, you must ignore what has already been spent; it cannot be recovered. The real question is: would you spend $200,000 on an inferior product or $300,000 on a better one?
Confirmation Bias – Seeking only data that supports the decision maker’s preferred choice and rejecting all evidence to the contrary.
“I only get my news from the Alphabet News Channel because Beta News Channel is biased against my candidate.”
How to Avoid It: Seek out alternative viewpoints. “What would it take to prove my idea is wrong?”
A prime example of the confirmation bias can be found in social media. Author Eli Pariser, in his book The Filter Bubble, showed how confirmation bias is amplified by algorithmic editing, which displays only information people are likely to agree with while excluding opposing views. This leads to increased polarization and disagreement.
Availability Bias – Making a decision based only on the evidence that is readily available. The decision maker does not dig deeper or look broader to find contradictory evidence.
“My five friends in the room all liked my product idea, so it must be great!”
How to Avoid It: Do more research. “What other sources of information can I look at beyond what is available now?”
Here is a simple way to understand the availability bias. Quickly, name three large quick-service food or drink franchises. I suspect some or all of the businesses on your list were McDonald’s, Subway, and Starbucks. The reason they come to mind first is the number of locations: there are 37,000+ McDonald’s, 41,500+ Subway, and 30,000+ Starbucks outlets. Odds are one or even all three are located within three miles of your home. That is why these franchises come to mind when you think about getting something to eat or drink.
Outcome Bias – An error made in evaluating the quality of a decision when the outcome of that decision is already known.
“Playing the lottery was a great decision since I won.”
How to Avoid It: Consider the probabilities of it happening again. “Did this outcome happen because of pure chance?”
Competitive sports are very susceptible to the outcome bias. When games come down to a final shot, the winning team gets all the accolades while the losing team is left to consider what went wrong. In the end, pure chance may be the actual determinant of success. As an example, consider Super Bowl XLIX between the Seattle Seahawks and the New England Patriots. In the final minute of the game, Seattle was on the New England goal line, one yard away from scoring the go-ahead touchdown. Instead, New England defender Malcolm Butler intercepted a Russell Wilson pass to seal the win for the Patriots. The media and the entire Pacific Northwest were shocked, because they assumed the Seahawks would give the ball to their powerful running back Marshawn Lynch. In an interview after the game, Seahawks coach Pete Carroll defended the decision this way:
(He) saw a front stacked against a power run and a match-up he felt he could exploit with a short route against a rookie corner who had zero career interceptions. And he didn’t want to run, get stopped short, burn his final timeout and be boxed into calling a pass on third down.
“You could run on 2nd down, call timeout, have to throw on third and score, or incompletion and have to choose (run or pass) on the final down,” Carroll texted. “That’s ball logic, not 2nd guess logic … you never think you’ll throw an interception there, just as you don’t think you would fumble.”
Essentially, Carroll’s defense of the play call was that it was the right one to make in that situation even though the result came out badly. For sports fans, however, the interception colored their view of the coach’s decision.
Anchoring Bias – Relying too heavily on an initial piece of information, the “anchor,” when making subsequent judgments.
“The salesman says this car retails for $20,000. It must be worth at least that much.”
How to Avoid It: Weigh the first piece of information very lightly.
“If I made the first bid/offer/choice, what would it be?”
In a study by Dan Ariely, an audience was first asked to write down the last two digits of their social security numbers and consider whether they would pay that number of dollars for items whose value they did not know, such as wine, chocolate, and computer equipment. They were then asked to bid on the items. Participants with higher two-digit numbers submitted bids between 60 percent and 120 percent higher than those with lower numbers. A random number had easily become their anchor.
A common situation where we encounter the anchoring bias is car shopping. The dealer creates an anchor with the sticker price. Most people negotiate from that number even though they have no idea how it was determined. I was enlightened as to how meaningless the sticker price is when I was in the market to lease a new vehicle. After being given an initial offer, I happened to meet a friend at a dinner party who owns his own car dealership. When I ran the offer past him, my friend advised that I counter at almost half the dealer’s opening number. By doing this, I dislodged the initial anchor and replaced it with my own. In the end, the negotiated price was closer to my counter offer than to the sticker price.
Have you ever heard someone say, “Show me the numbers,” when wrestling with a decision? Oftentimes numerical data can be a big asset for making a decision. “Numbers don’t lie,” is a common saying, but statistics are one of the easiest things to manipulate. As the saying popularized by Mark Twain and attributed to former U.K. Prime Minister Benjamin Disraeli goes, “There are three kinds of lies: lies, damned lies, and statistics.” However, even when presented honestly, interpreting statistical data can lead us straight into the same set of fallacies listed above. It may also be unclear which statistic is actually useful for solving a problem. Take the COVID-19 statistics. Which ones are the most important for understanding the spread of the virus?
- Confirmed cases
- Number of tests completed
- Rate of infection
- Comparisons to other states or countries
All these stats provide valuable data, but ranking their importance is challenging for fully understanding the problem.
Additionally, humans are not built to comprehend probability. Our minds prefer definite answers over ones based on chance. Studies have shown we have difficulty understanding probabilistic results. Here is a simple example. Which form of transportation is more dangerous, cars or planes? Based on sheer probability, a person is far more likely to die in a car accident. Yet thanks to the availability bias, we ignore the probabilities and focus on the news reports of a few big plane crashes rather than the underreported but far more numerous daily road accidents.
A classic puzzle that demonstrates how our minds fail to grasp probability is the Monty Hall Problem. This puzzle imagines a player on the game show Let’s Make a Deal. The contestant is asked to pick one of three doors to win a new car. After their initial pick, the host, who knows where the car is, opens one of the other doors to show a worthless “zonk” prize. The host then offers the player a chance to switch doors. The puzzle asks whether switching doors increases the player’s chance of winning, makes no difference, or reduces their chances. Most people assume it does not matter, but in reality, the probability of winning rises from one-third to two-thirds when the player switches. It is counter-intuitive to how we regularly consider probabilities, but it is proven through a simple charting of the results.
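For readers who remain skeptical, the “simple charting of the results” can also be done by brute force. Here is a minimal simulation sketch (names and trial count are my own choices, not from the puzzle’s original formulation) that plays the game many times with each strategy:

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the player wins."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host, knowing where the car is, opens a door that is
    # neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning under a stay or switch strategy."""
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials

print(f"Stay:   {win_rate(switch=False):.3f}")  # hovers near 1/3
print(f"Switch: {win_rate(switch=True):.3f}")   # hovers near 2/3
```

Running this repeatedly shows the switching player winning about twice as often as the one who stays, exactly as the probability argument predicts.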
As the past few months have demonstrated, all leaders are at the whim of unexpected events. Crises coming out of left field challenge our standard decision-making processes and upset prior conclusions very quickly. In his book The Black Swan, author Nassim Nicholas Taleb reflects on how the unexpected has a more outsized impact on our lives than most people appreciate. He believes that “our blindness with respect to randomness, particularly large deviations” causes us to naively believe that present conditions will exist for the foreseeable future, changing only gradually. In fact, they change more rapidly than we expect and in ways we cannot foresee.
Taleb’s argument encapsulates many of the topics expressed in this article, including our inability to comprehend probability, the availability and confirmation biases, and a failure to appreciate that most of our decisions are made in “wicked” environments.
“Kind” vs “Wicked” Environments
Decision making is also challenging because a choice made one day may not be the best choice on a subsequent day. The environmental conditions surrounding a choice have a direct effect on our ability to choose. In the book Range, David Epstein shares research on “kind” versus “wicked” environments. A “kind” learning environment is one where the rules are fixed, allowing for easy feedback and adjustment. Think of chess, where the game stays the same no matter where it is played. Players can improve quickly since the parameters are fixed and the lessons of a prior game can be directly applied to future games. This is contrasted by a “wicked” learning environment, where the rules are either vague or change over time. For example, imagine an entrepreneur who grows an Internet start-up. They think they can repeat the success with a new company, but even after following their prior model the company is derailed by a change in the law, or an upstart competitor, or even a shift in user tastes.
Fear and Decisions
Perhaps the worst place to make important decisions from is a state of fear. Fear narrows our focus and forces us to make quick judgments based on incomplete evidence. The Nobel Prize-winning economist Daniel Kahneman has spent his career understanding how the mind works when faced with choices. We tend to think that our minds have only one way of thinking. Kahneman proposes that we actually have two.
As described in an article from Psychology Today: “In his 2011 book, Thinking, Fast and Slow, Kahneman proposed two different modes of decisional thinking—an automatic, fast judgment based on instinct and emotion and a slower, more rational and deliberative process—that optimally work together, but often come into conflict.”
Fear runs along the instinctual line of thinking. This is understandable from an evolutionary perspective, when our ancestors’ survival relied on the ability to detect a snake in the grass or a tiger behind a bush. In our modern world, where threats are far more ambiguous, these kinds of snap decisions may lead us in the wrong direction. Recall the statistics shared earlier about the dangers of driving versus flying.
For example, in the same Psychology Today article, law professor Cass Sunstein “has often argued that after 9/11, the fear-based implementation of burdensome airport screening measures may have resulted in people avoiding air travel and instead dying in car accidents as a result.”
Fear can lead us to play it safe. Even professionals accustomed to risk management often err on the side of caution when faced with a difficult decision. In the book Scorecasting: The Hidden Influences Behind How Sports Are Played and Games Are Won, authors Tobias Moskowitz and L. Jon Wertheim provide a clear example in a chapter called “Tiger Woods is human (and not for the reason you think): how Tiger Woods is just like the rest of us, even when it comes to playing golf.” The authors point out that the data shows most professional golfers miss their putts short, not long. This indicates that golfers are fearful of overshooting, which in turn leads them to miss more putts than needed. Even Tiger Woods at the height of his game was susceptible to this mistake.
There is an old saying, loosely translated from the writings of Voltaire, that “perfect is the enemy of good.” It points to a major fear-based trap in the decision-making process: the desire to make the “right” decision. This is where someone is so afraid of making the wrong choice that they engage in an endless search process that continually delays the decision. This is known as analysis paralysis. As described in Wikipedia:
Analysis paralysis (or paralysis by analysis) describes an individual or group process when over-analyzing or overthinking a situation can cause forward motion or decision-making to become “paralyzed”, meaning that no solution or course of action is decided upon.
The first step to avoid this problem is to acknowledge that no real-world decision can be perfect due to a lack of information. Essentially, it is an understanding that most decisions are made in “wicked” environments. To move forward, making a decision provides the only way to gain additional information. By trying a solution and seeing the results the decision maker gains valuable data which they use to adjust and make the next decision. This moves them step by step towards an optimal solution.
An expression of this form of leadership decision making is the Design Thinking Model. It is a process centered on continuous experimentation and feedback. There are five stages to design thinking.
- Empathize – Research the problem
- Define – Clearly state the problem
- Ideate – Create ideas and challenge assumptions
- Prototype – Identify possible solutions
- Test – Implement a solution and gain feedback
During the process, the designer will cycle between some or all of the five stages until they reach a point where the problem is effectively resolved. For an experimental leader, a process like Design Thinking is a good way to resolve the fear of mistakes. Not only are mistakes expected in the process, they are required to gain important feedback.
Many professions use this approach, including software engineering. Venkatesh Rao, in his collected online essays entitled “How Software is Eating the World,” notes that when creating new code, software engineers try their best to make it fail. It is only through breaking the code that problems can be identified and improvements made. For example, when coding a new video game, the designer will test unusual keyboard combinations to try to make it crash. If the game does crash, the code is examined and then improved for the next test. This is important because it is better for the designer to find the bugs early rather than outsource that process to customers after they buy the game.
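To make the “try to break it” habit concrete, here is a minimal, hypothetical sketch (the function and key names are my own invention, not from Rao’s essays): a small game-input handler, followed by tests that deliberately feed it the strange input a designer hopes will expose a crash before a customer ever sees one.

```python
def handle_keypress(combo: str) -> str:
    """Map a key combination to a game action (hypothetical example)."""
    actions = {"w": "move_up", "s": "move_down", "ctrl+q": "quit"}
    # Defensive handling: empty or unknown input must never crash the game,
    # so anything unrecognized maps to a harmless "noop".
    if not combo:
        return "noop"
    return actions.get(combo.lower(), "noop")

# "Try your best to make it fail": probe empty strings, odd casing,
# absurdly long input, and keys no normal player would press.
for weird in ["", "W", "ctrl+alt+del", "w" * 1000, "💥"]:
    result = handle_keypress(weird)
    # Every input, however strange, must resolve to a known action.
    assert result in {"move_up", "move_down", "quit", "noop"}, weird
```

The point is not the handler itself but the second half: each failed assertion here is a bug found by the designer instead of by a paying customer.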
Bonus Fallacy – The Fundamental Attribution Error
Perhaps one of the greatest fears we have as humans is the fear of others, especially those who are different from us in even the slightest of ways. Certainly this election season is a textbook example of how fear of the other side is used to drive people’s decisions at the ballot box. It also shows how easy it is to think the worst of others. In fact, it is the basis of one of the most pernicious fallacies:
Fundamental Attribution Error – When we assign our own mistakes to outside circumstances while assigning others’ mistakes to their personal failings. “I dropped the ball because the sun got in my eyes. He dropped the ball because he is a lazy bum!”
How to Avoid It: Practice humility and empathy.
“If it was me who failed like they did, how would I explain it?”
One of the worst ways this error manifests is through racism. After all, racism assumes the worst about another person or group based largely on their skin color or ethnic background. This is especially relevant this year, with the summer protests triggered by the murder of George Floyd sparking a huge discussion of institutional racism and what it means to be an anti-racist. The story that hit home for me the most was told by inspirational speaker and author Shola Richards, who shared it on social media in the midst of the protests.
Twice a day, I walk my dog Ace around my neighborhood with one, or both, of my girls. I know that doesn’t seem noteworthy, but here’s something that I must admit:
I would be scared to death to take these walks without my girls and my dog. In fact, in the four years living in my house, I have never taken a walk around my neighborhood alone (and probably never will).
Shola’s worry is that if he walks alone, his white neighbors will only see a black man scoping out houses. However, when he is with his daughters or his dog, the perception changes to that of a dog owner or dad. This example shows how the fundamental attribution error affects not only the decision maker but also those who are impacted by the decision itself.
So how do we get past this endless cycle of misunderstanding? I believe that kindness is the key. To be kind to others forces us to consider them as human beings. It makes us think differently by having us take time to reflect on that person or decision rather than be trapped in reaction. Kindness makes everyone involved from the receiver, the giver, to any observers feel good. Consider that the next time you must make a decision that will affect others.
The field of decision making is immense. The skill is not one that is mastered in a short time. Instead, leaders must commit to improving their abilities continually over the course of their careers. Leaders must be willing to remain perpetual students, always learning and never satisfied with their current state. That is the burden and the reward of leadership.