The May issue of Harvard Business Review has a spotlight on making better decisions, which focuses primarily on behavioral economics. Leaders as Decision Architects discusses how to mitigate the effects of cognitive biases and low motivation on decision making. People have two main modes of processing information: (1) "automatic, instinctive and emotional" and (2) "slow, logical and deliberate." The authors argue that "engaging System 2 requires exerting cognitive effort, which is a scarce resource... as the cognitive energy needed to exercise System 2 is depleted, problems of bias and inadequate motivation may arise." The authors seem to favor System 2, but in a business world heavy on interpersonal skills, there is also value in trusting your instincts if you sense a potential partner is not trustworthy. That is not a cognitive bias. For instance, many people have a telltale sign right before or while they lie. You can spend your "cognitive energy" trying to articulate what that sign was - and perhaps, if you are new to this, it is not a bad thing to articulate what gave you pause - or you can go with your instincts and try to formalize it all some other time.
(But the next article, Outsmart Your Own Biases, actually advocates precisely against that sort of behavior, because "unless you occasionally go against your gut, you haven't put your intuition to the test." My take is that if you don't follow your instincts and your instincts turn out to have been right, you'll associate much more intense pain with the experience - in addition to having to deal with a problem employee you didn't want to hire in the first place - than you would have if you had simply gone with another qualified candidate from the beginning. Again, I don't think intuition counts as a cognitive bias.
A cognitive bias would be something like this: School A distorted something the star varsity athlete of School B had posted on social media, getting him suspended from a key game in the rivalry between the two schools and unfairly ruining his reputation, so now someone who graduated from School B cringes whenever he sees the name of School A on a resume and has sworn never to hire anyone who has ever been affiliated with School A. Yes, it would be good for that person to pause and analyze his aversion to School A logically, because he may be depriving his team of star performers.
Another cognitive bias would be this: you have a rags-to-riches personal story, or at least an underdog-who-beat-the-odds story, so you are more inclined to give a chance to other people who claim to be underdogs too, but you are blind to the possibility that they are simply presenting themselves in the way they think will make you more likely to hire them. More on how to outsmart your own biases below.)
The authors of Leaders as Decision Architects identify two root causes of poor decision making: insufficient motivation (you know the right thing you're supposed to be doing but don't do it) and cognitive biases (you decide on the wrong course of action). Again, they seem to think that System 2 is the solution to everything ("Because problems of motivation and cognition often occur when System 2 thinking fails to kick in..."). Well, if the issue is motivation, maybe it'd be helpful to first have the employee connect with a positive emotional response - even if it is just discussing their favorite sports team - and then, once they are in a productive state, push them along the path to completing the task. The funny thing is, a key example they give is that of a company a co-author got involved with, which was plagued by severe attrition within months of onboarding. "The training failed to build an emotional bond between new hires and the organization and caused them to view the relationship as transactional rather than personal."
A quick mention is made of the 2008 book Nudge: Improving Decisions About Health, Wealth, and Happiness by Thaler and Sunstein. "Step 4: Design the solution" in the HBR article was particularly informative, but what I found most helpful was the sidebar on common biases that affect business decisions (click here for the chart in one picture, with helpful explanations for each, courtesy of HBR's Twitter feed):
- Action-oriented biases: excessive optimism and overconfidence
- Biases related to perceiving and judging alternatives: confirmation bias, anchoring and insufficient adjustment, groupthink, egocentrism
- Biases related to the framing of alternatives: loss aversion, sunk-cost fallacy ("we pay attention to historical costs that are not recoverable when considering future courses of action"), escalation of commitment ("we invest additional resources in an apparently losing proposition because of the effort, money and time already invested") and controllability bias ("we believe that we can control outcomes more than is actually the case")
- Stability biases: status quo bias and present bias
The authors of Outsmart Your Own Biases advocate the use of checklists and algorithms to bypass strong emotional attachments and stay focused on the right things; they also recommend "trip wires" at key points in the decision-making process. In thinking about the future, they advise making three estimates for every forecast, thinking twice (making two forecasts and taking their average), taking an outside view (viewing the project you are involved in as an outsider would) and - my favorite activity - using premortems, i.e., imagining a future failure and then explaining its cause. This technique is also called prospective hindsight and was the topic of an old blog post of mine. The authors also recommend thinking about objectives and thinking about options. In particular, they refer to Chip Heath, Dan Heath and their book (which I love, as stated here) Switch: How to Change Things When Change Is Hard, and describe the "vanishing options" test: what would you do if you couldn't choose any of the options you're currently weighing?
Finally, Fooled by Experience argues that "we view the past through filters that distort reality." Such filters include the business environment, our circle of advisors (who may censor information) and our own "focus on evidence that confirms our beliefs." Although the advice is a bit trite ("we can base our decisions on a clearer view of the world if we... surround ourselves with people who will speak frankly [and] search for evidence that our hunches are wrong"), it is a good reminder that our view of the past is imperfect at best.
Overall, the issue was a good read. I'm still not sold on behavioral economics but I'm having fun thinking about cognitive biases.