Battling Bias


Decision-making is one of the few variables that crosses all business functions, affects all companies regardless of size, and determines success or failure wherever it occurs. However, decision-making is not a science: it is highly subjective and has been shown to be susceptible to cognitive biases.

In an organization, cognitive bias works at two levels—the individual and the group—influencing decision-making and its subsequent outcomes for the organization. The results are often errors or simply bad choices. Examples are plentiful: Lehman Brothers’ excessive risk-taking and subsequent bankruptcy; Toyota’s response (or inadequate response) to emergent quality problems, resulting in innumerable recalls; VW’s decision to employ programming that manipulated the results of vehicle emissions testing; Target’s rushed and probably oversized entry into Canada; and BP’s decisions to cut costs and skimp on safety practices, leading to the Deepwater Horizon oil spill.

Some errors, of course, are unavoidable, but systemic ones can be prevented, largely by addressing the propensity for bias.

RATIONALITY

Rationality—working with reason or logic to find an optimal choice given the information available—is the most prevalent approach to making decisions. In the 1985 American Political Science Review article “Human Nature in Politics: The Dialogue of Psychology with Political Science,” Herbert Simon, one of the early researchers to examine the economic concept of maximizing utility as the basis for decision-making, claimed that “individuals have a strong need to maintain a consistent cognitive system that produces stable and simplified cognitive structures.” He qualified this position with the concept of bounded rationality, suggesting that perfectly rational decisions are often not possible in practice. Rather, rationality in decision-making is limited by the tractability of the decision problem, the cognitive limitations of individuals’ minds, and the time available to make the decision.

Rationality powerfully explains both individual and business behaviour. But as research published by Amos Tversky and Daniel Kahneman demonstrated in the 1974 Science article “Judgment Under Uncertainty: Heuristics and Biases,” there is an array of decision-making patterns that deviate from the rational model. They suggested that people make decisions, especially those involving risk, based not on the potential value of the outcome but on the perceived value of the losses and gains. Further, those risks and values are assessed with heuristics—mental shortcuts based on experience that focus on one or a few aspects of a problem and ignore the others. Sometimes the process works, but often what we believe to be true and what is actually true are different. Rationality structures both the data we receive and the data we select, but relying on heuristics often results in cognitive bias.

DISTORTIONS & OVERCONFIDENCE

According to Tversky and Kahneman’s research, a key distortion or cognitive bias in decision-making is overconfidence. There is a distinct lack of humility in business decisions. The best illustration is entrepreneurs who overestimate the value of their company and its ability to grow. This is the “hockey stick illusion” often seen in funding presentations.

Their research also demonstrated that investors suffer from the same syndrome when it comes to their belief that they can pick winners. Company planners tend to be overly optimistic, which is why annual revenue projections frequently look like a series of hockey sticks while actual performance is closer to a straight line. Forecasters’ ambitions fail time and time again. In a blog post entitled “Hockey Stick Dreams, Hairy Back Reality,” McKinsey’s Chris Bradley (2017) suggests that decision-making is underpinned by the fallacy, “I will get better results next year by trying to do roughly the same thing as last year but just a little bit better.” As the adage goes, “Hope is not a strategy.”

“Just as inputs in manufacturing should be tested for quality, the inputs to decisions should also be tested for quality.”

Confirmation Bias: An outcome of overconfidence is often complacency. More information is not always the answer, because we tend to seek information that supports our existing view—an approach known as confirmation bias. Or, put another way, we see what we want to see. Experience (heuristics) is frequently a good counterweight, and so is the question “What if?”

Anchoring Bias: We all start our decision-making from reference points, which serve as anchors and bias our thinking. We frequently use past performance as an anchor even though it often does not reflect the future. Investment analysts are prone to this bias, basing their chart evaluations on repeating patterns.

Analogy Bias: This form of bias is similar to anchoring bias: in both, we approach a new problem by referring to similar problems we have solved successfully in the past. But in a fast-changing industry, experience is not always a good yardstick. We recently consulted with a Canadian company that had expanded into the United States using the same marketing strategies that had been successful in Canada. The company did not fully understand why it was unsuccessful and losing money in the United States. Its strategy in Canada had been based on a near technological monopoly and had generated high returns, but in the United States the company faced fierce competition from long-established, conservative rivals. Further, that competition was growing rapidly, using innovative strategies the company was not familiar with.

Availability Bias: This form of bias is often a product of convenience or low cost. Universities use students as subjects in social science research because students are available and affordable. Similarly, businesses may rely on shallow evidence because it is easy to obtain or because it fits their budget. The recency effect—giving the most weight to the information received most recently—also skews decisions, and short-termism—concentrating on short-term outcomes for immediate profit at the cost of long-term interests—is a perennial criticism of business.

Optical Bias: Optics can preoccupy decision-makers, especially those in the public policy domain, and will often overshadow substance or evidence. Another consideration is the pressure to seize the moment: when an attractive acquisition is pursued by many suitors, thorough and time-consuming analysis is often short-changed. Did Sobeys (Empire Co. Ltd.) rush its acquisition of Safeway in 2013, spurred by the battle with Metro Inc. for the same purchase? The deal looked sound at the time, but only a few years later Sobeys wrote down approximately 50 per cent of the purchase price (CA$2.9 billion). If an acquisition looks too good to be true, it might just be.

GROUPS & HERD MENTALITY

Groups add layers of complexity and cognitive bias to decision-making. Because many business decisions are made by teams or groups, awareness of these bias traps and of how they may be layered becomes even more important. In theory, groups should make better decisions than individuals because they bring diverse views to issues. But groups suffer the same biases as individuals, with each member bringing baggage to the decision-making process.

Just as information is at the root of individual decision-making, it is also at the root of group decision-making. However, in groups, the value of information is a reflection of who brings it to the table and their status. Status can both block information and enhance its value beyond what it deserves. Where you sit at the table matters.

Every group has its own culture, and that culture structures the outcome and controls information. For example, some groups are open to new ideas and some are less so. Culture often structures the discussion and determines who takes part. Norms influence not only the level of discussion but also the time allocated to individuals. Understanding the culture and norms of any group is key to being an effective participating member.

Leadership also adds a layer of complexity to group decision-making. Leadership is a small factor in individual decision-making, but it is a strong influence on group decisions and dynamics. Leadership can compound bias, but it can also be the starting point for a culture of bias recognition. In a public company, the board of directors leads the organization and must take responsibility for raising awareness of bias and addressing it.

Irving Janis, known for the concept of “groupthink”—the systematic errors made by groups when making decisions—suggested that groups favour interpretations and decisions that lead to stability. While harmony is an acceptable and desirable group norm, group members interpret information through this lens, avoiding controversial issues or alternative solutions. The result is irrational or dysfunctional decision-making. Collective sense-making streamlines decision-making, but it does not necessarily lead to better decisions.

In Guide to Decision Making: Getting It More Right Than Wrong, Helga Drummond, an expert on decision-making, lists the key characteristics of group thinking as being self-serving, superficial, and consensus-driven. These characteristics leave little space for dissenters in a cohesive group, and they lead to what Drummond labels “a conspiracy of optimism.” Groups succumb to the same overconfidence observed in individual decision-making and, like individuals, close the door to new and sometimes contrary information. They also close the door to contrarians. The decision by Sobeys to acquire Safeway was validated and applauded, internally and externally, by the market, the financial press, and retail experts. No awkward questions were raised; no one asked (or heard) the question, “What if we cannot execute?”

AVOIDING BIAS

Completely removing bias is probably impossible, but avoiding systemic errors is possible by de-biasing the organization. The following steps will help achieve this:

  1. Force Awareness of Potential for Bias

Awareness of the issues and their impact will go a long way. It must start at the board level and trickle down, and it must extend to the information used in decisions: how it was collected and how it is weighed.

  2. Conduct a Frank Self-Examination

There must be clarity surrounding the objectives and strategy being pursued. Although this seems obvious, objectives and strategy are often not well defined or understood, even in smaller organizations. There must also be an acceptance of trade-offs; very few decisions are optimal.

  3. Critically Evaluate the Knowledge Used to Make Decisions

Drummond proposes a simple set of questions that individuals may use to suss out bias:

  • What do I know?
  • What do I think I know?
  • How do I know?

These questions can help your organization evaluate whether decisions are made using a solid foundation of facts.

Just as inputs in manufacturing should be tested for quality, the inputs to decisions should also be tested for quality. Assess whether the data or analysis that informs the decision is accurate and unbiased. Kahneman recommends viewing “decisions as a product manufactured in an organization.” Review your decisions as you would review a production process.

  4. Diagnose the Cultural Codes

Diagnosing the cultural code of groups will lead to a greater understanding of their decision-making and the biases informing the process.

  5. Reframe Problems

How an issue is framed or a problem defined affects the decision-making process. Kahneman and Tversky’s research suggests that people prefer avoiding losses to acquiring equivalent gains. Therefore, when a problem is framed negatively, as a loss, decision-makers will embrace risk to avoid the loss; when it is framed positively, as a gain, they are more likely to be risk-averse. This means, among other things, that considerable time and effort should be spent on framing. Yet we usually take the problem as given, when reframing it might offer a simple resolution.
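For readers who want the loss-aversion finding in more formal terms, the following is a minimal sketch using the prospect-theory value function. The functional form and the parameter estimates come from Tversky and Kahneman’s later published work, not from this article, so treat the numbers as illustrative only.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
% A minimal sketch with illustrative parameters: the prospect-theory value
% function, which formalizes the finding that losses loom larger than gains.
% The parameter estimates (alpha, beta, lambda) are Tversky and Kahneman's
% later published figures, not values from this article.
\[
  v(x) =
  \begin{cases}
    x^{\alpha}, & x \ge 0 \text{ (outcome framed as a gain)} \\
    -\lambda(-x)^{\beta}, & x < 0 \text{ (outcome framed as a loss)}
  \end{cases}
  \qquad \alpha \approx \beta \approx 0.88, \quad \lambda \approx 2.25.
\]
% With lambda of about 2.25, a \$1,000 loss feels roughly 2.25 times as bad as
% a \$1,000 gain feels good: the same decision framed as avoiding a loss
% invites risk-seeking, while framed as securing a gain it invites risk
% aversion.
\end{document}

Rough as the sketch is, the asymmetry is the point: the frame, not the underlying economics, can flip a group’s appetite for risk.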

  6. Actively Seek Contrary Viewpoints

Balance in decision-making can also be achieved by seeking specifically contrary data, or establishing contrary decision-making groups. If there are two teams, one the usual internal team and the other a completely different external team, both teams will suffer from cognitive biases, but their biases are likely to be different.

In the 2007 Harvard Business Review article “Performing a Project Premortem,” Gary Klein offered an interesting solution for addressing cognitive bias: conduct a “premortem.”

Instead of assessing what went wrong at the end of a project, Klein recommends briefing the team on the plan, then asking the members to assume the project failed, write a brief history of the disaster, and evaluate what went wrong. A premortem creates an environment that makes it not just safe but also inviting for dissenters to speak up. The exercise overall reduces uncritical optimism.

Whatever way you choose to deal with cognitive bias, begin with acknowledgement that bias exists in all organizations and on all teams.

Cognitive bias is not just an academic construct. Left unaddressed, it will influence decision-making, usually for the worse. Addressing bias should be recognized as part of necessary organizational change, leading to a stronger organizational culture and, ultimately, better decision-making.

REFERENCES:

Herbert A. Simon, “Human Nature in Politics: The Dialogue of Psychology with Political Science,” American Political Science Review 1985, vol. 79, no. 2, pp. 293–304, doi:10.2307/1956650.

Amos Tversky and Daniel Kahneman, “Judgment Under Uncertainty: Heuristics and Biases,” Science 1974, vol. 185, no. 4157, pp. 1124–1131.

Chris Bradley, “Hockey Stick Dreams, Hairy Back Reality,” McKinsey & Company (blog), January 31, 2017, www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/the-strategy-and-corporate-finance-blog/hockey-stick-dreams-hairy-back-reality.

Helga Drummond, Guide to Decision Making: Getting It More Right Than Wrong, Economist Books, London, 2012.

Gary Klein, “Performing a Project Premortem,” Harvard Business Review, September 2007, https://hbr.org/2007/09/performing-a-project-premortem.
