Do we really know how good decisions get made? Gary Klein’s Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making takes aim at commonly held beliefs about the best way to make decisions. Do any of these claims sound familiar to you?
They certainly struck a chord with me. I found myself agreeing, more or less, with all ten or eleven of the claims that Klein lists in the first chapter of his book. Apparently most of us do — Klein surveyed more than 160 people to compile his list of received wisdom about decision making. All of the claims sound perfectly reasonable. But as Klein shows through numerous case studies, interviews, and examples, the situations most decision makers face in real-world conditions are unreasonable.
Which is why what sounds like a bunch of unobjectionable statements from a first-year course in business administration can get us into such trouble when we leave the controlled world of the classroom or psychology lab.
Klein deliberately doesn’t call these statements myths. All of the claims provide useful guidance in specific — and usually quite controlled — situations. But they are not the universal truths that decision scientists, behavioral psychologists, and business gurus often make them out to be.
Take the claim that “decision biases distort our thinking”. (There have been a lot of books written about this subject lately. Dan Ariely’s Predictably Irrational is one of the best.) You can devise all sorts of clever experiments to demonstrate that human beings do have ingrained biases that affect our thinking. But do these biases cause trouble for us in real-world settings? Klein argues that many human biases can also be viewed as sensible defaults.
For example, one of the key experiments used to demonstrate “irrationality” is a game where two players divide a pot of money. The players are anonymous and never meet. One player is asked to divide the pot any way he likes and make an offer to the other player, who can accept the deal or refuse it. If the second player accepts, both players get to keep the money. If the second player refuses, neither player gets anything.
The rational thing for the first player to do is to offer the second one the smallest amount possible, say a dollar. The second player, also being rational, ought to accept that deal, since a dollar is better than nothing at all. In practice, most people in the second player’s shoes routinely reject stingy offers, and most people in the first player’s shoes offer something close to a 50-50 split. Certainly some sort of bias is at work here; why would two people who have never met and are unlikely to ever meet cooperate in such a fashion?
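The payoff structure of this game, known in the literature as the ultimatum game, is simple enough to sketch in a few lines. (This is my own toy illustration; the function name and dollar amounts are made up, not from Klein's book.)

```python
def ultimatum(pot, offer, accepts):
    """Payoffs (proposer, responder) in a one-shot ultimatum game.

    The proposer offers `offer` out of `pot` to the responder.
    If the responder accepts, the split stands; if not, both get nothing.
    """
    if accepts:
        return pot - offer, offer
    return 0, 0

# The "rational" minimal offer, if accepted:
print(ultimatum(100, 1, True))   # (99, 1)
# In practice, responders routinely reject it, leaving both with nothing:
print(ultimatum(100, 1, False))  # (0, 0)
```

The textbook analysis stops at the first case: any positive offer beats zero, so a rational responder should always accept. Klein's point is that real people price in factors the payoff table leaves out, such as reputation and the chance of meeting again.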
Klein points out that in real-world conditions, away from the psych lab, you never know when you’ll bump into somebody, or whether you’ll turn out to share a mutual acquaintance who likes to gossip. And even though the experimenter has assured the players that their identities will remain strictly confidential, can the players really trust him? Maybe not. In conditions of uncertainty, it makes sense to play it safe and offer something fair. Better to guard your reputation than to be thought a cheat or a jerk. Word gets around, you know.
This may be a bias, Klein says, but it’s a useful one. In most situations, it would cause us to behave in our best interest anyway. Maybe our gut instincts are trying to tell us something.
The analogy Klein uses is vision. Our eyes are remarkably sensitive organs, and our brains carry out amazing feats of information processing to make sense of the massive visual input they receive every second. To keep up with all the data, our visual system follows rules of thumb that let it take shortcuts. Most of the time this works just fine, but it is possible to “fake out” our brains with a cleverly designed optical illusion. It’s rare, though, that you encounter such illusions in the wild. They’re the exception, not the rule.
Another example is the claim that gathering information can help reduce uncertainty. (As an information junkie myself, I strongly agreed with this claim at the start of the book. But I’ve since changed my mind.) Klein points out that in situations with a low signal-to-noise ratio, where accurate and relevant information is hard to find, spending more time gathering information also means more time gathering distractions. Those distractions can slow you down or cause you to focus on the wrong things. Rather than spending time gathering more data, you’d be better off trying to make sense of what little trustworthy information you have.
In other words, if your problem is information overload, gathering more information isn’t going to help!
Most of Streetlights and Shadows is devoted to a discussion of each claim, why it gets made, when and where it gets misapplied, and what we can do about it. For each, Klein proposes a more nuanced replacement. So “establish clear goals at the start of any project” becomes “when facing difficult problems, we must refine our goals as we try to reach them.”
His new phrasings don’t have the unambiguous certainty of the sound bites they replace, but that’s the point. Rigorous analytical methods can help us in well-ordered situations. But in our unstructured world we often can’t arrive at definitive answers, and we have to rely on our experience and expertise instead.
I especially liked the contrast Gary Klein drew between puzzles and mysteries. With a puzzle, you know what the solution looks like and can recognize its pieces, even when you’re not quite sure how to put it all together. It has set rules that don’t change.
With mysteries, you’re often not sure what “solved” looks like. It’s tricky to identify relevant clues, and difficult to see how they fit together. You face ambiguous or conflicting information. And there’s often a twist to it — a bit of context that can put the whole situation in a new light.
Once you know the trick, a puzzle is fairly easy to solve. Puzzles have clear rules, and you can develop a process for solving others of the same kind.
Mysteries are open-ended, and each one is different. You can develop expertise in solving mysteries, but you can’t really create a routine or checklist for doing it.
The claims Klein discusses in Streetlights and Shadows work well for puzzles, but not for mysteries. Knowing which kind of problem you have can help you pick an effective decision making strategy. Do we need to drill people on a procedure checklist or do we need to help them explore and develop expertise?
Because puzzle-like problems yield to rational, analytical methods, and are thus prime candidates for automation, Klein would argue that most of the knowledge work left to humans today is of the mystery-solving kind, and it is only becoming more so as our computers grow more capable.
Which means that the ten claims he dissects, while true and useful in some situations, aren’t nearly as relevant as they once were. So it’s important that we learn to see past them and embrace other ways of solving difficult issues.
But what are these other ways?
The title Streetlights and Shadows comes from an old joke.
A policeman sees a drunk staring at the ground beneath a streetlight. “What are you doing?” the cop asks.
“Looking for my keys,” says the drunk. “I dropped them in the dark alley over there.”
“Then why are you over here?” asks the policeman, confused.
“Because the light’s so much better over here.”
So far, Klein says, we’ve been searching for the keys to adaptive decision making under streetlights: in classrooms, labs, and other controlled environments with clear metrics and fixed timetables. But the real world isn’t structured like that, and most of what we’re taught about making tough choices falls apart outside those controlled settings. In the ambiguous, dynamic, and shadowy situations we often find ourselves in, basing our actions on these conventional claims about decision making can be useless, or even dangerous.
Where we ought to look instead is out in the field, in practice, with all the messiness that that entails.
In a way, Streetlights and Shadows can be read as a defense of human expertise, a contrast to proponents of big-data computation, machine control, or The Wisdom of Crowds. In some domains, tacit knowledge gained over a lifetime of experience outperforms most of the rigorous, analytical methods that have been developed so far.
And on some level, we recognize that. Though Gary Klein found that most people agreed with the ten or eleven claims set forth in the book, few of his respondents regularly practiced them. They did not routinely make risk management plans, correct for human biases, or wait until all the evidence had been collected. We may use these analytical tools to justify our decisions, but more often than not we make those decisions from the gut.