At one level, we all know that other people are crazy/make-no-sense/full-of-idiosyncrasies. At another level, we expect them to be reasonable/sensible. Kahneman lays out, in a pretty accessible set of stories, the ways in which we are unreasonable. The book draws on years of research on how people make decisions, which led to his Nobel Prize in Economics in 2002. He’s also co-written a new book, Noise (which I haven’t read).
Just to take a few examples, he shows how we:
* are more averse to the risk of loss than logically makes sense,
* make decisions based on what we did or read just previously (i.e., what’s ‘in mind’),
* construct stories that seem to make sense, even without evidence or contrary to the evidence we have, and
* project rosier outcomes than are likely because we would all like to live in a best-case-scenario world.
The issue of story construction is of particular concern to historians since we are all about constructing stories. We (as a species) are generally pretty desperate to live in a sensible world, and coherent stories help us do that. Kahneman points out that this drive causes us to ignore facts that don’t fit the pattern, or to create stories and then find the “facts” that make the story neat-and-tidy.
It’s sort of like looking at constellations in the night sky. We can find Orion’s belt or the “Big Dipper” easily enough, but extrapolating from a few points of light into a warrior with a sword or the Great Bear (aka “Ursa Major”) is an act of human projection, based on a particular culture and vantage point on Earth. One historical analogue (to which I have previously referred) cites Western/European/American “progress”/power over the past 200 years as the baseline of a story of moral superiority (rather than an anomaly in a broader global pattern). At a more personal level, historians also construct stories in which historical actors make sensible decisions when, in fact (like us), they often act based on subconscious prejudices (cultural or genetic).
How many war plans were implemented because of over-optimistic assessments? Ask Kaiser Wilhelm or Robert McNamara! Historical analysis that starts with the (usual and implicit) assumption of rational behavior is pretty likely to be off the mark. This model of rational intentionality is also popular because it facilitates the “blame game.” Even though at some level we know that almost all bad stuff is caused by error rather than malevolence, finding a villain is much more satisfying (and a convenient distraction from looking at our own comparable shortcomings).
This kind of decision-making (Kahneman calls it “fast thinking”) is so common in our everyday lives that we have to consciously try to pay attention to it. This is, of course, much easier to do with other people’s behaviors (spouses are especially useful in this regard). Do we avoid a particular driving route because—once—there was a big backup? Do we avoid foods that—once—caused a stomach upset or headache? Without knowing whether the traffic was the result of a particular accident or construction issue, or whether our gut might have reacted to another food/germ/external tension, we can’t make much sense out of a single data point. And few of us have the concentration to study such situations to really understand systemic commuting patterns or gastric dynamics.
In fact, it’s pretty easy to see other people’s nonsensical thinking in all sorts of ways. But the more important lesson from Kahneman’s book is that I do it, too. My unique combination of predilections, biota, and culture is no more exempt from the common human brain-wiring patterns than anyone else’s. By their very unconscious nature, these patterns are hard to see, but my own mishegas (to use a technical term) is no less risible.
An important lesson in humility, to be sure.
At a social level, even without the overlay of all the current political angst, these ordinary human mental patterns help to explain why vaccination rates are lower than one might hope/expect, why people focus on closed-network/self-reinforcing media circuses, and why epistemologies are so difficult to shift, among a host of other phenomena. Fear and inertia play a much larger role in people’s thinking than we like to acknowledge (oh, and in my own, too, as I have to repeatedly remind myself!). The problem of over-optimistic projections leads to our apparent surprise (for the thousandth time) at budget overruns or delays when it comes to construction projects (whether a new kitchen or a multi-tier bridge).
Kahneman won the Nobel Prize for the work that he and Amos Tversky did to help economists begin to shift away from their founding/core assumption that people make economic decisions rationally. His book is worth reading, even if parts of it are a bit abstruse. Don’t feel you need to read it cover-to-cover. Find the stories (often of experiments they ran) that highlight how people actually think/act/decide in particular situations. You’ll likely find yourself in some of these situations and, if you can remember that, then it will be time well spent.