9.4 Fixing Our Decisions

Ample evidence documents that even smart people are routinely impaired by biases. Early research demonstrated, unfortunately, that mere awareness of these problems does little to reduce bias (Fischhoff, 1982). The good news is that more recent work identifies interventions that do help us overcome our faulty thinking (Bazerman & Moore, 2013).

One critical path to fixing our biases comes from Stanovich and West’s (2000) distinction between System 1 and System 2 decision making. System 1 is our intuitive system; its processing is typically fast, automatic, effortless, implicit, and emotional. System 2 refers to decision making that is slower, conscious, effortful, explicit, and logical. The six logical steps of decision making outlined earlier describe a System 2 process.

Clearly, a complete System 2 process is not required for every decision we make. In most situations, System 1 thinking is quite sufficient; it would be impractical, for example, to reason logically through every choice we make while shopping for groceries. Ideally, though, System 2 logic would guide our most important decisions. Nonetheless, we rely on System 1 processing for most decisions in life, even the important ones.

The key to reducing the effects of bias and improving our decisions is to move from trusting our intuitive System 1 thinking toward engaging in more deliberative System 2 thought. Unfortunately, the busier and more rushed people are, and the more they have on their minds, the more likely they are to rely on System 1 thinking; the frantic pace of professional life suggests that executives often default to System 1 (Chugh, 2004).

Fortunately, it is possible to identify conditions where we rely on intuition at our peril and to substitute more deliberative thought. One fascinating example of this substitution comes from journalist Michael Lewis’s (2003) account of Billy Beane, the general manager of the Oakland Athletics. Beane improved the outcomes of the struggling baseball team after recognizing that the intuition of baseball executives was limited and systematically biased, and that those intuitions had been driving important decisions in ways that produced enormous mistakes. Lewis (2003) documents that baseball professionals tend to overgeneralize from their personal experiences, to be overly influenced by players’ very recent performances, and to overweight what they see with their own eyes, even though players’ multiyear records provide far better data. By substituting valid statistical predictors of future performance for intuition (System 2 thinking), the Athletics were able to outperform expectations despite their very limited payroll.
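
To make the statistical logic concrete, here is a minimal simulation sketch. Everything in it is an illustrative assumption rather than anything from Lewis’s account: the skill distribution, the game-to-game noise, and the sample sizes are invented simply to show why a multiyear record outpredicts a recent streak.

```python
import random

random.seed(42)

def correlation(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def per_game_outcomes(true_skill, n_games):
    """Assumed model: stable skill plus large game-to-game noise."""
    return [true_skill + random.gauss(0, 1.0) for _ in range(n_games)]

multiyear, last_ten, next_season = [], [], []
for _ in range(1000):                        # 1,000 hypothetical players
    skill = random.gauss(0, 0.3)             # stable underlying ability
    history = per_game_outcomes(skill, 450)  # roughly three seasons
    future = per_game_outcomes(skill, 150)   # next season
    multiyear.append(sum(history) / len(history))
    last_ten.append(sum(history[-10:]) / 10)  # "what the scout just saw"
    next_season.append(sum(future) / len(future))

print("multiyear average vs next season:", round(correlation(multiyear, next_season), 2))
print("last-10-games avg vs next season:", round(correlation(last_ten, next_season), 2))
```

With these made-up parameters, the three-season average correlates far more strongly with next-season performance than the last ten games do, simply because the larger sample averages out game-to-game noise that an observer’s eyes cannot.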

[Image: money in a jar labeled “retirement.” Nudges can be used to help people make better decisions about saving for retirement.]

Another important direction for improving decisions comes from Thaler and Sunstein’s (2008) book Nudge: Improving Decisions about Health, Wealth, and Happiness. Rather than setting out to debias human judgment, Thaler and Sunstein outline a strategy by which “decision architects” can change environments in ways that account for human bias and thereby trigger better decisions. For example, Beshears, Choi, Laibson, and Madrian (2008) tackle the failure of many people to save for retirement and show that a simple change to defaults can dramatically influence enrollment in 401(k) programs. In most companies, when you start your job, you must proactively sign up for the company’s retirement savings plan, and many people take years to get around to doing so. When companies instead automatically enroll their employees in 401(k) programs and give them the opportunity to “opt out,” the net enrollment rate rises significantly. By changing defaults, we can counteract the human tendency to live with the status quo.
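
A toy model shows why flipping the default matters so much. In the sketch below, all parameter values (the share of employees who prefer enrollment, the probability of never overriding the default, the workforce size) are illustrative assumptions, not estimates from Beshears et al. (2008); the point is only the mechanism.

```python
import random

random.seed(0)

def enrollment_rate(default_enrolled, prefer_enroll_prob=0.8,
                    inertia_prob=0.6, n_employees=10_000):
    """Fraction enrolled when an employee acts on their own preference
    only if they overcome status-quo inertia; otherwise they keep the
    default. All parameter values are illustrative assumptions."""
    enrolled = 0
    for _ in range(n_employees):
        prefers_enrollment = random.random() < prefer_enroll_prob
        overcomes_inertia = random.random() >= inertia_prob
        if overcomes_inertia:
            enrolled += prefers_enrollment  # follows own preference
        else:
            enrolled += default_enrolled    # sticks with the default
    return enrolled / n_employees

print("opt-in  (default = not enrolled):", enrollment_rate(default_enrolled=False))
print("opt-out (default = enrolled):    ", enrollment_rate(default_enrolled=True))
```

With these made-up numbers, identical preferences produce roughly 32% enrollment under opt-in and over 90% under opt-out. The available options never change; only the default does.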

Similarly, Johnson and Goldstein’s (2003) cross-European organ donation study reveals that countries with opt-in policies, where the default is not to harvest people’s organs without their prior consent, sacrifice thousands of lives compared with countries that have opt-out policies, where the default is to harvest organs. The United States and many other countries require citizens to opt in to organ donation through a proactive effort; as a consequence, consent rates in these countries range from 4% to 44%. Changing the decision architecture to an opt-out policy raises consent rates to between 86% and nearly 100%. Designing the donation system with knowledge of the power of defaults can dramatically change donation rates without changing the options available to citizens; a more intuitive design, such as the one in place in the United States, yields defaults that result in many unnecessary deaths.
