Willful blindness: When a leader turns a blind eye

Leaders inhabit a bubble of power, mentally and physically cut off from the reality most people would recognize. That reality is the obligation to tell the truth: the imperative, if they witness improper or unlawful behaviour, to tell the truth, the whole truth and nothing but the truth. This author explains why leaders resist the imperative and how they – and we – can avoid the temptation.


When the British Member of Parliament, Adrian Sanders, asked Rupert and James Murdoch if they were familiar with the term “willful blindness,” their silence said it all. The MP defined it for them [reading from my HuffPo blog post]: “If there is knowledge that you could have had, should have had but chose not to have, you are still responsible.” Then and now, willful blindness is a concept that should send shivers down the spine of any executive.

The legal concept of willful blindness originated in the nineteenth century. The judge in Regina v Sleep ruled that an accused could not be convicted for possession of government property unless the jury found that he either knew the goods came from government stores or had “willfully shut his eyes to the fact.” Nowadays, the law is most commonly applied in money laundering and drug trafficking cases – but the behaviour it describes is all around us: in banks, in the Catholic Church, at BP, at Abu Ghraib, in most industrial accidents. These narratives always follow the same trajectory: years of abuse involving a large number of participants, plenty of warning signs and, when the problem finally explodes, howls of pain: How could we have been so blind?

Cases of willful blindness aren’t about hindsight. They feature contemporaneous information that was available but ignored. While it’s tempting to pillory individual villains, the causes are more often systemic and cultural. There are many reasons – psychological, social and structural – why we don’t see what we most need to notice. None of them provides an alibi or an excuse. What each does is shed light on how these organizational car crashes happen – and how they might be prevented.

Chief among culprits is power. When Richard Fuld was CEO of Lehman Brothers, he perfected the seamless commute: a limo drove him to a helicopter that flew him to Manhattan, where another limo whisked him to the bank’s offices. The front and lift doors were timed so that Fuld could ascend to his office without encountering a single employee.

Leaders of organizations inhabit a bubble of power, of which Fuld’s commute is a magnificent physical representation. They’re either isolated or surrounded by those desperate to please. The powerful also communicate differently. Academic analysis of their language shows that, confronted by risky situations, the powerful think in more abstract terms, are more optimistic and more certain that they are right. They’re both mentally and physically cut off from the reality most people would recognize.

Power is a problem, not a perk, and it is exacerbated by money. In 2007, a series of experiments got students to play Monopoly; those who made the most money proved least helpful to others. Subsequent research reinforced the basic finding: thinking about money undermines our sense of social connectedness. (That, more than envy, is why bankers’ bonuses are so dangerous.) Extremely high pay adds social isolation to the psychological solipsism of power. Moreover, because business decisions are normally framed as purely economic choices, the focus on money crowds out ethical considerations. If the numbers work, then the decision works – doesn’t it? The use of money as the primary, often the only, measure of success puts enormous pressure on ostensibly independent professionals, like accountants, lawyers and consultants, to toe the line. The ghost of Arthur Andersen may haunt large firms but it’s a distant memory compared to the live scoreboard of billable hours.

And everyone wants to be big. Rupert Murdoch made much of the vast scale of News Corporation, in which News of the World represented less than one percent of the whole. But any corporation might do well to ask whether it has become too complex to manage. Enron declared bankruptcy before it needed to because its balance sheet was so complicated that no one knew how much the company owned.  Similarly, the banks had no accurate way of measuring systemic risk. Never mind too big to fail; we have a large number of companies now that may be too big to run.

Blindness: Here, there and everywhere

Outsourcing renders oversight more difficult because it atomizes processes until no one can see how they connect. It’s like Asimov’s I, Robot series, in which the law prohibits the killing of humans; the robots resolve this by breaking their plans into so many steps that no single one is illegal. After BP’s Deepwater Horizon exploded, CEO Tony Hayward was quick to point out that the rig was built by Hyundai in Korea to the design of a Texas firm, R&B Falcon, which was bought by the Swiss operator, Transocean, which leased it to BP. Most of the victims were not BP employees. “This wasn’t,” Hayward said, “our accident.” Even Apple, more diligent than most in its scrutiny of partner companies, was blindsided by the high suicide rate at the Foxconn factory manufacturing iPhones. Meanwhile, the Catholic Church is trying to retrofit outsourcing by claiming priests aren’t employees.

Outsourcing has become so endemic in Western economies that there are no areas in which it isn’t considered, including wars and policing: in the U.S. and the UK, the number of private guards is now more than twice that of public police officers. Once you’re sub-contracting and outsourcing at this level, you stand a high likelihood of being blind to how work gets done; the cynical will argue that that is what it’s for.

Hayek wrote that “without a theory, the facts are silent” – but with a theory, or ideology, inconvenient facts can become invisible.  With disarming frankness, economist Paul Krugman acknowledged: “I think there’s a pretty good case to be made that the stuff that I stressed in the models is a less important story than the things I left out because I couldn’t model them.”

Alan Greenspan’s fervent belief in free markets blinded him to repeated failures in unregulated derivatives trading. Between 1994 and 2008, billions of dollars were lost in derivatives bets that others (George Soros, Muriel Siebert, Frank Partnoy) interpreted at the time as early warning signs that the market wasn’t working. But Greenspan could not see what he would not see. Even after the banks failed, he would not relinquish his ideology, acknowledging merely that he had “found a flaw.”

Scientists can be just as myopic. In 1956, the Oxford-based scientist Alice Stewart demonstrated, with startling data, that the chances of childhood cancer were dramatically increased by x-raying pregnant mothers. Yet it took 25 years before the practice was abandoned by the British and American medical establishments. Why? Because Stewart’s data flew in the face of the prevailing epidemiological theory – threshold theory – which maintained that, while a large dose of something like radiation would be dangerous, there was always a threshold below which it was safe. It wasn’t until 1997 that the king of epidemiology, Richard Doll, quietly retired the theory with the most modest of mea culpas.

Big ideas create tunnel vision, blinding the believer to disconfirming data. This cognitive dissonance is resolved in favour of the faith. Rupert Murdoch has always believed in the business value of political power and the importance of scale. Those beliefs blinded him to the growing disgust with political elites together with the popular discomfort with large foreign corporate takeovers.

Do as you’re told: Are we all blind?

It’s easy to deride such beliefs and ideologies but most people, governments and organizations have them. As Greenspan testified, “ideology is the way people deal with reality. Everyone has one.” Whether it is the belief that military intervention saves lives, that big government is bad or that the only successful company is a global one, ideologies are what psychologist Anthony Greenwald called the ‘totalitarian ego,’ locking up incompatible ideas, suppressing evidence and rewriting history.

Once enlisted, our totalitarian egos are strikingly submissive. Ever since Stanley Milgram’s 1961 experiments into obedience, we’ve known that, without reward for compliance or punishment for refusal, most people (around 65 percent) will commit unethical acts when asked to do so by someone in authority. Repeated around the world ever since with unchanging outcomes, the experiments showed, Milgram wrote, “the capacity for man to abandon his humanity — indeed the inevitability that he does so — as he merges his unique personality into larger institutional structures. … It would not be true to say he loses his moral sense. Instead it acquires a radically different focus. His moral concern now shifts to how well he lives up to the expectation the authority has of him.” Authority is a power whose dangers few CEOs recognize. Ambitious employees will work hard to intuit what’s wanted, to infer what will make them successful in the eyes of the organization they have joined. And their moral focus will change.

In 1998, when BP bought Amoco, then CEO John Browne ordered 25 percent cuts across all refineries, regardless of their condition. Everything was cut, down (it was said) to the number of pencils. Personnel reductions stressed people and plant until it was clear that the company’s Texas City refinery had become a dangerous place to work. In 2004 alone, major incidents caused three fatalities. Yet that same year, prior to a meeting to discuss cuts, one manager sent this email to a colleague: “Which bit of 25 percent don’t you understand? We are going to be wasting our time on Monday unless you come prepared to commit to a 25 percent cut. I have more interesting things to do than getting up at 3 a.m. for a non-productive meeting!” These zealous executives did as they were told, and in 2005 the refinery experienced its worst accident, in which fifteen people died.

The plague of Groupthink

Milgram’s teacher, Solomon Asch, had conducted earlier experiments into conformity demonstrating that, when asked to match two lines of similar length, most people would rather give an obviously wrong answer that keeps them in a group than a correct answer that would make them outsiders. Fifty years later, fMRI versions of this experiment revealed startling detail: at the moment of conforming to the wrong answer, the brain wasn’t making a conscious decision. Instead, the brain’s activity centred on areas responsible for perception. In other words, knowing what the group saw changed what the participants saw. What we see depends on what others see.

In successful, glamorous companies, these pressures become more extreme, even cult-like. Deborah Layton, one of the few survivors of Jim Jones’s Peoples Temple tragedy in Guyana, sees cultish qualities in many corporations, an impression echoed by employees working at News International’s Wapping headquarters. If everyone is drinking the same Kool-Aid, you can be pretty sure no one will speak up when something is wrong. They’re too eager for inclusion, too afraid of exclusion. For the most part, we would rather be wrong than alone.

An American academic study into organizational silence found that 85 percent of executives had issues or concerns at work that they had never articulated. The chief reason was fear of retribution. When I mirrored the study with U.K. executives, the numbers were the same but the cause was different: here, silence was provoked by a sense of futility. American executives were afraid while British ones were without hope. That’s how millions of pounds of payment protection insurance (PPI) got mis-sold.

While ambition, competitiveness and hierarchy may exacerbate these behaviours, high moral purpose won’t protect against them; silence is implicated in many NHS and care home scandals. The abuse and mistreatment of patients at the Royal Sussex County Hospital or at Winterbourne View wasn’t hidden; it took place on public wards where everyone could see. At the Bristol Royal Infirmary in the 1990s, that babies were dying in heart operations was public knowledge. Cardiff even built its own paediatric cardiac unit to avoid sending its infants to Bristol. For years, the medical establishment, notoriously conservative, said nothing; indeed, research shows that medical students are more conforming after their training than when they begin, an effect called the “hidden curriculum.”

In his bromidic 2009 memoirs, John Browne acknowledged, “I wish someone had challenged me and been brave enough to say, ‘We need to ask more disagreeable questions’.” But he didn’t seem to understand why they hadn’t. When managers say that they want to hear the bad news, that they won’t shoot the messenger, most employees simply do not believe them. I’d argue this is the biggest challenge faced by any organization today and few CEOs even see it.

Discussing willful blindness with me, the cognitive psychologist, Albert Bandura, argued: “People are highly driven to do things that build self-worth; you can’t transgress and think of yourself as bad. So people transform harmful practices into worthy ones, coming up with social justification, distancing themselves with euphemisms and numbers, ignoring the long-term consequences of their actions.” His examples included TV producers, gun manufacturers, and climate change deniers.

But the greatest single cause of willful blindness may also be the most basic. To build that sense of self-worth, we surround ourselves with people and information that confirm it. Overwhelmingly, we prefer people like ourselves – and there’s a solid physiological reason why. The brain can’t handle all the information it is presented with, so it prioritizes. What gets a head start is information that is already familiar – and what is most familiar to us is us. So we feel most comfortable with people and ideas we already know. Just like Amazon’s recommendation engine or eHarmony’s online dating programmes, our brain searches for matches because building on the known makes for highly efficient processing. On a trivial level, this preference shows up in consumer fondness for products whose names share their initials: Carol likes Coke but Peter prefers Pepsi. More seriously, over time our neural networks, just like our opinions and ideologies, become deeper but also narrower. That is as true for us when we choose media we agree with as it is for party leaders who prioritize editors who agree with them.

Everyone is biased in favour of themselves; it may be one reason why, despite decades of diversity programmes, women and minorities have made so little progress inside corporations. Looking at the composition of News Corporation’s board, you’ll see the bias played out in full: the directors either are Murdochs or friends and employees of the Murdochs. Strikingly absent are individuals sufficiently different and independent to provide any challenge. This isn’t unusual. At the time that it was collapsing, Dennis Stevenson remarked that the HBOS board was ‘as one’; it did not seem to occur to him that this might have been the problem. (HBOS, the British banking group, collapsed in 2008 and was rescued by Lloyds TSB.)

News Corporation isn’t the first organization to fall into these traps and it won’t be the last. The central irony of willful blindness is that it makes us feel safe even as it puts us in danger. As Colm O’Gorman, one of the first people to uncover abuse in Ireland’s Catholic Church, told me, “We make ourselves powerless when we pretend we don’t know.”

But just because willful blindness is endemic doesn’t mean it’s irresistible. Roy Spence, a Texan advertising executive, refused to work with Enron even as the rest of the world beat a path to its door. How did he see what others missed? He thought a lifetime of seeing through the eyes of the powerless gave him different perspectives. “My sister had cystic fibrosis and I used to push her wheelchair to school every morning. I could see people pitying us, oblivious to the richness of our relationship. It made me ask, then as now: If they’re missing so much about us, what am I missing about them?” That internal dialogue is what Hannah Arendt called thinking.