Leading With Your Gut

“No business cliché is more worthy of repudiation, annihilation and eradication than ‘You’ve got to trust your gut.’” So wrote Michael Schrage, a research fellow with the MIT Center for Digital Business at the MIT Sloan School of Management, in a recent blog post, reacting to what he sees as an unfortunate trend of managers being exhorted to rely more on their intuition. “Everyone knows [the saying] that ‘Good judgment comes from experience and experience comes from bad judgment,’” Schrage continued. “But where does bad judgment come from? My answer, and the replicable answer from Nobel Prize-winning research, is that it comes from trusting gut instincts.”

The research he refers to is that of Daniel Kahneman, arguably the godfather of the study of flawed judgment in decision making. Currently professor emeritus of psychology and public affairs at the Woodrow Wilson School of Public and International Affairs at Princeton University, Kahneman won the 2002 Nobel Prize in economics for his work on judgment and decision making.

Much of the research conducted by Kahneman and his late colleague Amos Tversky focused on “expert intuition,” such as might be applied by an emergency room doctor making a diagnosis. Expert intuition is not a decision per se, nor is it a hunch; it is a reflexive, instantaneous reaction to a familiar situation, grounded in vast domain-specific experience, knowledge and practice.

An expert’s “intuitive impressions come to mind without explicit intention, and without any confrontation,” said Kahneman, in an interview with the Association for Psychological Science Observer, an online journal. Sometimes, he said, these intuitive impressions lead to a good outcome, but just as often they can lead to overconfidence and, ultimately, bad judgment. “Accessibility, or the ease with which thoughts come to mind, defines intuition,” said Kahneman. “And once people make decisions, they tend to suppress alternative interpretations.”

A recent Wall Street Journal article, “The Yes Man in Your Head,” cited studies with nearly 8,000 participants that showed people are twice as likely to seek information that confirms what they already believe as they are to consider evidence that would challenge those beliefs. “We’re all mentally lazy,” Scott O. Lilienfeld, a psychology professor at Emory University, told The Journal. “It’s simply easier to focus our attention on data that supports our hypothesis, rather than to seek out evidence that might disprove it.”

That phenomenon, known in psychology as “confirmation bias,” is just one of a disturbingly long list of well-documented cognitive biases that routinely skew human beings’ judgment, memory, perception and motivation. For example, “anchoring” is our tendency to rely too heavily on one piece of information when making decisions; “negativity bias” causes us to give more weight to negative information than positive; “normalcy bias” is our disinclination to plan for an outcome that has never happened before; “omission bias” causes us to see action as potentially more harmful than inaction; “consistency bias” is our tendency to incorrectly assume that what we currently think is consistent with our past views; “self-serving bias” describes our inclination to give ourselves credit but not blame; and “clustering illusion” describes our tendency to see patterns where none exist.

That small sample, not to mention the full panoply of human cognitive tics, is more than enough to blow a gaping hole in any claim we may make to being objectively rational. But the question is whether intuitive decisions are any more likely to be irrational than analytical ones. The studies cited in The Wall Street Journal suggest perhaps not: They showed that more data and analysis did not necessarily make people’s judgments more accurate, but simply further entrenched their views.

There are those who believe that intuition is neither a cause nor a symptom of our irrational biases, but rather a highly evolved way of working around them. Gerd Gigerenzer, director at the Max Planck Institute for Human Development, is a proponent of the idea of “bounded rationality,” the notion that any attempt to make a rational decision is limited by the decision maker’s available information, cognitive biases and finite time frame. Within these constraints, optimal decisions necessarily involve heuristics — essentially shortcuts, inferences or rules of thumb derived from prior experience and analysis. Gigerenzer argues that the use of heuristics — or intuition by another name — represents an adaptive solution to the problem of having to make choices under constraints.

In his 2005 book, “Blink,” Malcolm Gladwell referred to this process as “rapid cognition.” What we normally call “thinking” is a conscious strategy, he said. It is analytical, logical and definitive, but it is also slow, information intensive and, evolutionarily speaking, not always an appropriate survival strategy. Rapid cognition is a second strategy for making decisions, one that operates a lot faster but, at least initially, entirely below the surface of consciousness. “It’s a system in which our brain reaches conclusions without immediately telling us,” said Gladwell. He does not use the word intuition to describe this phenomenon because he believes that term connotes emotional, not rational, reactions. “Rapid cognition is thinking,” he said. “It’s just thinking that moves a little faster and operates a little more mysteriously than deliberate, conscious decision making.”

Dan Ariely, a professor of behavioral economics at Duke, is somewhat more circumspect about the uses of intuition. In his latest book, “The Upside of Irrationality,” as in his 2008 book, “Predictably Irrational,” he makes the case that human beings are not and cannot be rational in any absolute sense. Irrationality is programmed in, but our irrational behaviors are neither random nor senseless and, if we are aware of them, they can be harnessed to produce good outcomes, he says. Intuition can be an important part of our decision-making tool kit, but it serves us best when we employ it with a healthy skepticism. Note the results of your intuitive choices, and “trust [your intuition] only after you have evidence that it’s useful,” he suggested.

Like Ariely, Daniel Kahneman counsels self-awareness, and he cautions against using intuition as an easy rationale for any random snap judgment. But even he acknowledges that in real situations where time is limited and there is too much information, “most of the time, we just have to go with our intuition.”

Perhaps that explains the appeal of intuitive decision making. We live in a world in which our ability to digitize, measure, share and track everything offers us a welter of information and data on any subject, yet somehow leaves us feeling less in control and less confident in our decisions. It is possible that in such a world, “going with your gut” is the only rational response.
