What You Know That You Don’t Know Can Hurt You

The Brussels terrorist attack reveals a great deal about organizational myopia.

On the morning of March 22, 2016, at the beginning of rush hour, two bombs ripped through the busy departure terminal at Brussels Airport. Little more than an hour later, a suicide bomber detonated a device in a subway car at Maelbeek Station, a stop used by many who work in nearby European government buildings. Thirty-five people died and dozens were injured in the terrorist attack.

Although it was lost in the fog of the Belgian assault, the incident and its aftermath reveal a fundamental shortcoming in former U.S. Secretary of Defense Donald Rumsfeld’s classic framework for assessing situational risk. Describing the considerations for whether or not to invade Iraq in 2003, Rumsfeld said that there were known knowns, known unknowns, and unknown unknowns. Missing from this neat construct, however, was one crucial bucket: unknown knowns.

It would seem odd for something to be known and yet unknown, but the attack in Brussels perfectly captures this concept. The city had already been identified as a hotbed of Muslim extremist activity, and people living there were linked to the recent terrorist killings in Paris. Moreover, Belgian police had apparently been warned months earlier that at least one of the suicide bombers was a jihadist. Yet these facts might as well have been unknown, as local and global authorities were caught flat-footed.

It is convenient — and certainly tempting — to blame law enforcement and officials for failing to do their jobs and anticipate dangers in their midst. Such recriminations have been commonplace after catastrophes, including Pearl Harbor, 9/11, the Boston Marathon bombing, plane crashes, and natural disasters. And although there may be some merit to this criticism, there is also something else at work in these incidents: In large enterprises, unknown knowns are legion — and, by definition, invisible. Yet ultimately they paralyze the ability to make good strategic decisions.

The most comprehensive research probing the ubiquity and dangers of unknown knowns in organizations comes from Dartmouth professor Sydney Finkelstein. In the book Why Smart Executives Fail (Portfolio Penguin, 2003), Finkelstein argued that businesses went under for a surprisingly small number of reasons, all of them linked to indifference or myopia toward facts staring executives in the face. The clues to the impending danger were there not some of the time or most of the time — they were there in each and every case. Finkelstein cites Motorola’s attempt a decade ago to keep pushing its Razr cellphone on an increasingly uninterested marketplace, ignoring the imminence of the iPhone and BlackBerry, as a classic illustration of a company crumbling under unknown knowns.

Although organizations are enfeebled by unknown knowns, individuals draw upon them all the time, usually quite successfully. We call it intuition. Our gut tells us that we know something without our conscious minds being able to pinpoint exactly why we have strong confidence in a particular assessment or choice. It just feels right. Nobel Prize–winning psychologist Daniel Kahneman and other researchers have written about the human brain’s propensity for ongoing, often subconscious, data collection and analysis. The brain’s primary job is to keep us alive, so it constantly assesses the environment for risks as well as potential rewards.

This massive internal information bank can be extraordinarily useful when facing situations you have encountered multiple times before. Without the need for extensive calculation, your brain builds a shortcut — a heuristic — that enables you to decide and act quickly. You use this ability every time you drive in traffic, for example. Fast-moving vehicles are all around, yet you aren’t constantly slamming on the brakes or taking evasive maneuvers — that is, until the brain senses a shift in the pattern indicating heightened danger. Then you react instantly. If you were asked to explain exactly what triggered your reaction, you likely would be unable to do so. This is the unknown known as a positive.

But what happens in a large organization to so pervert the potential benefits of unknown knowns? Chiefly, the cognitive biases that individuals often overcome on their own are magnified by dysfunctional organizational behavior. These include availability bias, which causes us to place undue weight on the most accessible data rather than dig deeper for true probabilities; confirmation bias, which leads us to more readily accept ostensible facts that conform to our existing worldview rather than objectively considering all of the evidence; and optimism bias, which makes us underestimate the likelihood of adverse outcomes.

In organizations, the fear of risk-taking and of making an unintuitive decision that backfires compels many individuals to seek comfort in the safety of cognitive biases. Meanwhile, managers frequently fail to seek out the people with hidden pockets of experience and expertise who are better able to overcome those biases and see the knowns clearly. Moreover, in some organizations, power is derived from knowing something that others don’t and waiting for the right moment to spring it. Consequently, people may hoard rather than broadcast their more unorthodox insights. And departmental silos and walled-off communications channels may make it difficult for employees to find an avenue for warning their supervisors about a problem they believe they have uncovered. Indeed, one need only look at the financial meltdown of 2008 to see the best and brightest at major investment firms and central banks completely unwound by unknown knowns: disbelieving the data, discounting its significance, or ignoring people who tried to raise red flags.

The good news for organizational managers is that shortsightedness from unknown knowns can be mitigated and counteracted by adopting these three leadership principles:

Always ask “What am I missing?” This is a favorite question of Admiral (Ret.) Thad Allen, the former Commandant of the United States Coast Guard who salvaged the Katrina response and led the cleanup of the Deepwater Horizon oil spill in the Gulf of Mexico. Leaders who are afraid to admit knowledge gaps create unnecessary blind spots. Articulating that there is more to learn will spur active inquiry for you and your team.

Never say “never.” Just because you don’t want the worst to happen doesn’t mean that it won’t. When you say that something will never happen, you set the stage for confirmation bias. Instead, work to calculate credible probabilities. Openly discuss the consequences if something goes wrong. When you engage in this exercise with a group, you are more likely to offset each other’s cognitive biases and avoid groupthink.

Take time to sit with a challenge. Sherlock Holmes pondered a “three-pipe problem” in the story “The Red-Headed League.” That is, he sat and let his mind work for the time it took him to smoke three pipefuls of tobacco (these became “three-patch” problems in the BBC series starring Benedict Cumberbatch, in which smoking was frowned upon). When you let the brain work uninterrupted by email, social media, or other distractions, it has time to sift through its stored data looking for patterns and connections. This is also why great ideas seem to pop up when you are in the shower.

Brussels was a surprise that should not have been. And risks — not just terrorist attacks but public health concerns such as the Zika virus moving toward North America, global economic weaknesses, or product market downturns — are fast evolving around us. By gamely acknowledging the existence of unknown knowns, you may find out that you know more than you thought you did.

Eric McNulty

Eric J. McNulty is the associate director of the National Preparedness Leadership Initiative. He is the coauthor of You’re It: Crisis, Change, and How to Lead When It Matters Most (PublicAffairs, 2019). He writes frequently about leadership, change, and organizational culture.

 