Why We Need Answers

The human mind is incredibly averse to uncertainty and ambiguity; from an early age, we respond to uncertainty or lack of clarity by spontaneously generating plausible explanations. What’s more, we treat these invented explanations as if they had intrinsic value of their own. Once we have them, we don’t like to let them go.

In 1972, the psychologist Jerome Kagan posited that uncertainty resolution was one of the foremost determinants of our behavior. When we can’t immediately gratify our desire to know, we become highly motivated to reach a concrete explanation. That motivation, in Kagan’s conception, lies at the heart of most other common motives: achievement, affiliation, power, and the like. We want to eliminate the distress of the unknown. We want, in other words, to achieve “cognitive closure.” The term was coined by the social psychologist Arie Kruglanski, who eventually defined it as “individuals’ desire for a firm answer to a question and an aversion toward ambiguity,” a drive for certainty in the face of a less-than-certain world. When faced with heightened ambiguity and a lack of clear-cut answers, we need to know—and as quickly as possible.

In 1994, Kruglanski and Donna Webster introduced a standard way to measure the need for closure, or N.F.C.: a forty-two-item scale that looked at the five separate motivational facets underlying our drive for clarity and resolution—namely, a preference for order, a preference for predictability, decisiveness, discomfort with ambiguity, and closed-mindedness. Taken together, these elements tell us how high our need for closure is at any given point. A heightened need for cognitive closure can bias our choices, change our preferences, and influence our mood. In our rush for definition, we tend to produce fewer hypotheses and search less thoroughly for information. We become more likely to form judgments based on early cues (something known as impressional primacy), and as a result become more prone to anchoring and correspondence biases (using first impressions as anchors for our decisions and not accounting enough for situational variables). And, perversely, we may not even realize how much we are biasing our own judgments.
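To make the scale’s arithmetic concrete, here is a minimal sketch, in Python, of how responses to such a questionnaire might be aggregated into facet subscores and an overall score. The facet names follow Webster and Kruglanski’s five facets, but the item-to-facet assignments, the reverse-keyed items, and the one-to-six response scale below are hypothetical placeholders, not the published instrument.

```python
# Hypothetical scoring sketch for a need-for-closure questionnaire.
# The five facet names follow Webster and Kruglanski; the item
# assignments, reverse-keyed items, and 1-6 scale are illustrative.

FACETS = {
    "order": [1, 6, 11, 20],          # hypothetical item numbers
    "predictability": [5, 7, 18],
    "decisiveness": [12, 13, 22],
    "discomfort_with_ambiguity": [3, 8, 15],
    "closed_mindedness": [10, 17, 28],
}
REVERSE_KEYED = {13, 22}              # hypothetical reverse-scored items
SCALE_MAX = 6                         # assuming a 1-6 agreement scale


def score_nfc(responses):
    """Return per-facet means and an overall need-for-closure score."""
    scores = {}
    for facet, items in FACETS.items():
        values = [
            SCALE_MAX + 1 - responses[i] if i in REVERSE_KEYED else responses[i]
            for i in items
        ]
        scores[facet] = sum(values) / len(values)
    # Overall N.F.C. here is taken as the mean of the five facet means.
    scores["overall"] = sum(scores[f] for f in FACETS) / len(FACETS)
    return scores


if __name__ == "__main__":
    # A respondent who answers every one of the 42 items with a 4.
    answers = {item: 4 for item in range(1, 43)}
    print(score_nfc(answers))
```

In practice, researchers compare such scores across people or across conditions (say, before and after a stressful prompt) rather than reading any single number as meaningful on its own.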

While the need for closure does vary from person to person—some people are higher in baseline N.F.C. than others—it is, to a large extent, situationally determined: the more in flux and indeterminate our environment, the more we want to reach some sort of resolution. N.F.C. is heightened under time pressure, with fatigue, with excess environmental noise—when a lot of information that is difficult to make sense of is coming at us at the same time—and when we feel that we need to give an opinion. It’s also directly related to stress. In short, its influence peaks in circumstances of emergency or crisis.

In 2010, Kruglanski and colleagues looked specifically at the need for cognitive closure as part of the response to terrorism. In a series of five studies, they found that reminders of terrorist attacks elevate N.F.C., increasing the need “to develop strong beliefs, form clear-cut impressions, and classify objects and events into sharply defined categories in order to experience certainty and avoid ambiguity.” In the central study, American students watched a seven-minute slide show that either discussed the 9/11 attacks or extolled the advantages of working at Google. They then completed a filler task and had their N.F.C. measured. Participants shown the 9/11 presentation scored significantly higher on the N.F.C. scale; in short, merely watching a reminder of a terrorist attack—not even being in an actual crisis environment—was enough to trigger a heightened need for cognitive certainty and resolution.

The researchers also had an opportunity to test their findings in a natural setting. In the two weeks that immediately followed the July, 2005, London transit bombings, in which four explosions killed fifty-six people and injured more than seven hundred, they recruited two groups of just over a hundred participants each and had them complete a series of questionnaires. Not only did they find elevated N.F.C. levels; that need, in turn, predicted support for counterterrorism policies. The relationship makes a lot of sense. Kruglanski conceptualizes our need for cognitive closure as consisting of two major stages: seizing and freezing. In the first stage, we are driven by urgency, or the need to reach closure quickly: we “seize” whatever information we can, without necessarily taking the time to verify it as we otherwise would. In the second stage, we are driven by permanence, or the need to preserve that closure for as long as possible: we “freeze” our knowledge and do what we can to safeguard it. (So, for instance, we support policies or arguments that validate our initial view.) And once we’ve frozen? Our confidence increases apace.

It’s a self-reinforcing loop: we search energetically, but once we’ve seized on an idea we remain crystallized at that point. And if we’ve externally committed ourselves to our position by tweeting or posting or speaking? We crystallize our judgment all the more, so as not to appear inconsistent. It’s why false rumors start—and why they die such hard deaths. It’s a dynamic that can have consequences far nastier than a minor media snafu. Kruglanski and the political scientist Uri Bar-Joseph hypothesize that heightened N.F.C. and its concomitant cognitive “freezing” were in large part responsible for the Israeli intelligence failure at the start of the Yom Kippur War, on October 6, 1973, when Israel was caught unprepared by a surprise attack from Egypt and Syria. The warning signs were abundant, they argue, and the evidence ample. But high-placed Israeli intelligence officials exhibited heightened N.F.C.: they froze on the early conventional wisdom—that the chances of an attack were quite low—and failed to incorporate new signals adequately, blocking out conflicting information about the attack’s imminence.

So are we all doomed to make uncomfortable errors in reporting—or fatal errors in intelligence analysis—when the stakes are high? Not necessarily. A number of interventions have been shown to lower the N.F.C. imperative, even at those moments when it should be at its highest. Central among them is the fear of invalidity—that is, the fear that a mistake will prove personally costly. If we are afraid that what we say or think will come with a severe penalty, we suddenly become much more cautious in our judgments. The more salient that possibility, the more circumspect our thinking.

The reporting that followed the Boston Marathon bombings was rife with error and rumors run amok. For each story (they robbed a 7-Eleven!), a counter-story followed close on its heels (they weren’t even in the 7-Eleven). The misinformation plagued professional news outlets just as much as it did the amateur reporting efforts of Reddit and Twitter—understandable, if you consider that the circumstances were ideal for heightened need for cognitive closure to kick in. But in the midst of it all, a few calm voices managed to maintain their cool. On NBC, Pete Williams maintained his usual measured composure, ensuring that his stories were verified many times over before they ever appeared on air. On Twitter, Seth Mnookin meticulously reported developments and corrected misinformation.

Maintaining that kind of cool levelheadedness is no easy feat, especially in the face of circumstances that urge us all toward some—any—resolution just to regain a measure of sanity amid ever-increasing uncertainty. But it’s not impossible, either. The next time we want to run the race toward closure, to be the first to tweet or post or report, to follow the first thing we hear because it seems so believable, we’d do well to consider the lessons of Boston—not just the moments when the media world fell to its lowest points but those rare instances when it was able to show what the value of measured reporting really is. The need for cognitive closure is a powerful force. But a need is neither a mandate nor an excuse.

Maria Konnikova is the author of the New York Times best-seller “Mastermind: How to Think Like Sherlock Holmes,” and she recently received her Ph.D. in psychology from Columbia University.
