Enron, Elizabeth Holmes, and Evil

Lessons from corporate crime on how to be a better person

Matt Serna
The Startup
15 min read · Mar 16, 2020


Corporate crime provides an illuminating glimpse into the darkest and most unsettling aspects of human nature. It shows how easy it is for even the most intelligent among us to be deceived, duped, and defrauded by bad actors. More interestingly, it reveals the forces that lead otherwise decent people down a slippery slope toward taking illegal or unethical action, or standing by while others do.

Take, for example, the South Sea scandal of 1720, one of the western world’s earliest financial crises.

Driven by the promise of investing in trade with South America, thousands of Britons poured their money into the “South Sea Company,” an organization granted a monopoly by the Crown on trade with Spain’s colonies in exchange for assuming government debt. Given the substantial upside of such trade, the company’s stock skyrocketed, and fortunes were made. In a pattern that has since grown all too familiar, the higher the stock rose, the more people rushed in to invest. Even the legendary Isaac Newton invested £7,000 in the company, and intelligently sold his shares after doubling his investment.

Unable to realize its financial promise, the South Sea Company collapsed and the bubble burst. In retrospect, the fraudulent nature of the company should have been obvious. England was at war with Spain, which owned the colonies in question. And Spain had never let any of its allies, let alone its enemies, trade with its colonies.

Fortunes were dashed. Great families went into bankruptcy. Even Isaac Newton, who later reinvested over £20,000 after cashing out, may have lost a sizable portion of his net worth.

“I can calculate the movement of the stars,” he later remarked, “but not the madness of men.”

Stock chart of the South Sea Company

Three hundred years later, we are still plagued by corporate fraud. Like the South Sea Company, company after company has blinded investors with the promise of spectacular financial upside, only to go belly up, brought down by problems that should have been obvious from the start.

And just like with the South Sea Company, the most intelligent among us seem equally likely to be seduced by the promise of riches these companies afford.

Just look at Enron, named by Fortune magazine as “most innovative company” six years in a row, and celebrated by analysts as “the industry standard for excellence.” Few understood just how the company kept beating its earnings targets (by cooking the books); fewer cared, because its executives and investors were making a killing.

More recently, Theranos founder Elizabeth Holmes spun the promise of a new way to run blood tests into over $500 million in funding. At its peak, the company was valued at $10 billion.

Both of these companies cratered into bankruptcy, costing their executives, investors and employees billions.

One might look at these failures and infer that we are unable to learn the lessons of history.

While that may be true, I think the most important lessons to learn from the rise and fall of Enron and Theranos are not historical, but psychological.

What is it about human nature that leads us to be misled so easily? And, perhaps more interestingly, what is it about human nature that leads people working inside companies like Theranos and Enron to start engaging in blatantly illegal and harmful behaviors?

It’s one thing to be an unwitting victim. It’s quite another to be an unwitting villain.

As it turns out, a small set of cognitive biases is responsible for the creation of intellectual and moral blind spots that lead to the manipulation of our beliefs and values.

By understanding these biases, we can not only avoid being led astray, but also ensure that we never become complicit in evil ourselves.

The corrupting influence of groups

March 13th, 1964: 28-year-old Kitty Genovese is brutally stabbed to death in the courtyard of an apartment complex in Queens.

After the murder, the New York Times published an article claiming that even though over 30 people saw or heard the attack, none called the police or tried to help.

This inspired a wave of academic research into what motivates people to help others in need — and why people so often fail to do so. Through this research, psychologists uncovered what is now known as the “bystander effect”: the more people who witness something bad happening, the less likely any one of them is to act.

And it helps explain just how so many otherwise good people at companies like Enron and Theranos became complicit in unethical behavior.

50 years later, the Genovese murder still sparks dialogue on complacency in the face of evil

In The Smartest Guys in the Room, considered to be the definitive account of the Enron scandal, authors Bethany McLean and Peter Elkind explain how forces like the bystander effect led to the company’s downfall.

Enron, perceived to be one of the most successful companies in the 1990s, collapsed in the early 2000s due to financial mismanagement and accounting fraud.

Some of their accounting practices were completely above board; a small number were explicitly illegal. But the vast majority of Enron’s wrongdoing fell into a moral gray zone, upholding the letter of the law while violating its spirit.

Enron CFO Andy Fastow was responsible for signing off on each of the company’s fraudulent transactions. McLean and Elkind’s interviews with him suggest that he may have fallen victim to groupthink. Even to this day, Fastow defends his actions by asserting that he got approval from all of the company’s lawyers, accountants, and management. The very fact that the approval of others could be used as a defense of one’s own actions suggests the power that others’ opinions have on what Fastow, or any of us, could consider to be right or wrong.

In one of the Enron story’s biggest ironies, the peerless Arthur Andersen, Enron’s accounting and audit firm, fell victim to the same corrupting forces of groupthink and diffusion of responsibility as its client.

Since their office was just a quick walk from Enron’s headquarters, most of Andersen’s accountants informally set up shop on-site, working side by side with the company’s accountants. Over time, McLean and Elkind assert, it became “hard to tell who was the auditor and who was the client.”

When faced with a decision on whether or not to veto the setup of a CFO-managed fund to facilitate corporate transactions — an illegal conflict of interest — Arthur Andersen voiced concerns among its own auditors, but said nothing to the client, under the assumption that the CEO and president of the company would kill it.

A decade prior to Kitty Genovese’s murder, a study performed by Dr. Solomon Asch showed that, faced with sufficient peer pressure, people will conform to beliefs that directly contradict what they see with their own eyes.

Volunteers would be shown the below image and asked to say aloud which line on the right matched the length of the line on the left.

When put in a group where other “volunteers” (in reality, experiment confederates) incorrectly answered the question, one-third of respondents “agreed” with the group in spite of what their own eyes told them was the truth.

Arthur Andersen succumbed to the same forces, and its implicit consent led to Enron’s executives and board of directors approving the CFO-managed fund, which eventually facilitated the financial transactions that brought the company down. Just like in the murder of Kitty Genovese, the presence of so many witnesses to a blatantly criminal act, from the CFO and his team, to the auditors, to the lawyers, to the board itself, provided an implicit justification for inaction.

“If this transaction was actually as fraudulent as I think it might be,” each party may have reasoned, “somebody else would have said something.”

Thus, the very checks set in place to prevent fraudulent accounting practices ultimately provided Fastow and his team with the moral cover to systematically cook the books.

Whereas Enron’s executives could be seen as unwitting victims of the dark forces of social influence, Theranos’s executive team made intentional use of those forces to build a moat of credibility around its fraudulent technology.

Early on, Elizabeth Holmes identified the importance of attracting marquee, globally-recognized leaders to her board — even if they lacked business experience or biotech domain expertise. In the parlance of Silicon Valley, the absence of those qualities was seen not as a bug, but as a feature. They provided all the credibility that was needed, without any of the accompanying scrutiny.

Securing prestigious leaders like George Shultz, the highly regarded former Secretary of State during the Reagan administration, inspired confidence in Theranos’s prospective customers. It helped drive Walgreens to sign an exclusive contract with Theranos to install lab-testing devices in all of its in-store clinics — without any independent validation of the technology.

With Walgreens as a customer, Theranos was able to attract an even more prestigious board, ranging from Jim Mattis to Henry Kissinger. This, in turn, drove over $500 million in additional fundraising over the course of the next five years.

All the result of a deliberate effort to manufacture credibility by establishing ties to credible names.

It’s a variation of “trading up the chain,” a common media strategy exposed by author Ryan Holiday in Trust Me, I’m Lying, an exposé of the dark arts of PR. If a company wants to plant a story or narrative in a national newspaper, it starts by pitching smaller media outlets, which may have a lower bar for what constitutes “newsworthiness” or “accuracy.” Once the story is published, the company sends that coverage to larger outlets, which see it as validation of the story and justification to publish it themselves. And so the process repeats, helping take a story from a community blog to The New York Times. Or, in Theranos’s case, to the cover of Fortune magazine.

Out for blood — in more ways than one

Trading up the chain works because of the human bias towards trusting the judgment of the crowd over independent investigation. The more voices we hear telling us a certain truth, the less likely we are to question it. All the more so when those voices are highly influential in our lives.

To manufacture the perception of success, Theranos practiced its own version of trading up the chain. The fact that doing so blinded the world to the company’s blatantly faulty science is a testament to its power.

As in the South Sea scandal, or indeed any bubble, there’s a vicious cycle at play: the more people invest in something (whether it be a company, an asset, or an idea), the more social validation others feel when making that same investment. This inspires more confidence in the investment, leading to even more investment. And so the cycle repeats itself.

The root of all evil

Groupthink explains how bad ideas and false beliefs spread. But what makes these ideas bad in the first place? And why do we seem to fall more easily for “bad” ideas than “good” ones?

An uncomfortable aspect of human nature is that we’re biased to believe things that make us feel comfortable and to reject things that make us feel uncomfortable. It helps explain why most people believe in an afterlife. Nobody wants to believe it’s all dirt and worms and darkness in the end.

Other times, our beliefs spring from our identity. Consider the about-face made by many in the religious right in America who decided to support Donald Trump in spite of his well-documented moral failings. Or the Republicans who spent their careers fighting for free trade and now advocate for tariffs backed by Trump. Their beliefs have changed because what it means to be a Republican has changed.

And what could be more unpleasant than a truth that requires us to compromise our identity?

The bias towards beliefs that make us feel good helps explain how Theranos and Enron, two companies built on lies, were able to last as long as they did in the first place. For everybody involved, from founders to employees to investors and customers, it was simply easier to believe that all was well.

Theranos was the quintessential feel-good story. In a male-dominated startup world, Elizabeth Holmes — billed as a “female Zuckerberg” — cut an inspirational figure. Instead of trying to build yet another app to monetize our attention, she built a company aspiring to transform how healthcare was practiced.

It was easier to believe that needles could be replaced by a finger prick than to accept the limitations of the laws of chemistry. And it was nearly impossible to believe that a female entrepreneur lauded as an example to a generation of girls could be the mastermind of the largest Silicon Valley fraud of the 21st century.

This same bias led Enron and the community of investors that propped it up to be willingly misled about the company’s performance, so long as believing in the company’s success contributed to their own bottom line.

Analysts could see plain as day that earnings far exceeded cash coming in. The bankers who underwrote Enron’s suspect financial transactions had to have known what was going on. But in the red-hot market of the late 1990s, where analysts grew rich and famous in proportion to their bullishness, few beliefs could be more painful to hold than that Enron was a house of cards waiting to be toppled.

“Enron is literally unbeatable at what they do,” raved David Fleischer, a securities analyst at Goldman Sachs.

“The industry standard for excellence,” chimed in Deutsche Bank’s Edward Tirello.

“Enron is the one to emulate,” wrote the Financial Times.

The “best of the best” — at pioneering new ways to cook the books

Because of the market’s continued vote of confidence, Enron’s executives escalated their reckless, fraudulent behavior.

They capitalized on financial loopholes afforded by “mark-to-market” accounting, which enabled them to book all of a deal’s projected revenue immediately as earnings. This created a massive short-term incentive to close deals, with no incentive to ensure that the cash flow came through on the back end. Because this strategy helped juice Enron’s short-term earnings, increasing the price of the stock, executive compensation came to be tied to the same short-term thinking: bonuses were paid on the closing of a contract, not on how it worked out over time. Enron’s executives, auditors, and bankers lined their pockets by virtue of financial policies that defied common sense.

The financial upside of their decisions bent their moral compass and prevented them from seeing that what they were doing was wrong. To this day, Andy Fastow asserts that while he intentionally engaged in fraud as CFO of Enron, he never technically broke the law. It’s a belief that defies all logic. But accepting the truth would force Fastow to compromise his self-professed identity as a law-abiding citizen, as a well-intentioned person who let things get out of hand.

What’s the key takeaway from Enron and Theranos? Our thoughts, feelings, and even moral values can be corrupted by invisible social forces.

It’s a lesson that’s been tragically reinforced time and time again throughout history.

In Eichmann in Jerusalem, a report on the trial of Holocaust organizer Adolf Eichmann, author Hannah Arendt famously described how otherwise normal Germans were enlisted to participate in the Einsatzgruppen — Nazi paramilitary death squads. Over 1,500,000 people were killed by them.

Horror of horrors, committed by otherwise ordinary people

“The murderers were not sadists or killers by nature,” explained Arendt. “On the contrary, a systematic effort was made to weed out all those who derived physical pleasure from what they did.”

One might imagine that the only way average German soldiers could be made to engage in such brutality would be if refusing to do so put their own lives at risk. Yet testimony at the Nuremberg trials, held after the war to prosecute Nazi war criminals, revealed that this was not the case. In fact, documents from the trials do not record a single instance of an SS member suffering the death penalty for refusing to participate in an execution.

Such horrific scenes have played out across the 20th century, from Bosnia, to Kosovo, to Rwanda. It would be erroneous to assume ourselves exempt from the same potential for moral corruption.

Resisting Evil

Dive deep enough into the atrocities of war and genocide and it’s easy to lose faith in humanity.

But taking a longer view of history shows how far we’ve come.

We look at the Islamic State with horror — how it condones slavery, murders those of different faiths, treats women only marginally better than animals — and forget that less than 350 years ago, fewer than 20 generations into our past, the majority of so-called “civilized” states had more in common with it than with the societies we live in today.

The fact that we’ve been able to come so far in such a short period of time should be a source of encouragement.

Of course, there’s much more progress to be made. To do so, we must recognize and confront the uncomfortable truth that the person we see in the mirror is highly influenced by authority, by peers, and by a desire to preserve identity.

Evil isn’t caused by bad apples; it stems from the bad barrel that rots them. This doesn’t apply just to the shifts in moral judgment that enable major evils like genocide, or lesser evils like corporate fraud. It applies to every act where we sacrifice our own or others’ wellbeing in exchange for short-term satisfaction. And naturally, it applies to shifts in intellectual judgment that may not be morally harmful per se, but can be personally catastrophic — like Newton losing much of his net worth in the South Sea scandal.

It brings to mind the first step of Alcoholics Anonymous’s 12-step program: admit powerlessness over one’s situation. By accepting our vulnerability to social forces and cognitive biases, we become able to consciously identify their influence and implement strategies to overcome it.

We must think twice when we find ourselves rationalizing behavior or beliefs that we would otherwise find abhorrent under “normal” conditions. Quite often, these “exceptions” are nothing more than excuses that help us resolve cognitive dissonance in morally or ethically ambiguous situations.

With the knowledge that we are shaped by our environment, we must proactively shape that environment so that it improves us instead of corrupting us.

This can not only help us resist corruption ourselves, but also make other people better. Recall the Asch test, where correct answers to basic questions about the length of different lines plummeted in the presence of people who answered incorrectly. The same phenomenon can work in reverse: openly virtuous behavior contributes to the establishment of social norms that influence others to be good.

Counterintuitively, an understanding of the social forces that unconsciously shape our behavior helps us reclaim individual agency over that behavior, and empowers us to transform the environments shaping those around us.

And even when we can’t, the Theranos story shows how it’s still possible to do the right thing.

Tyler Shultz, a grandson of Theranos board member George Shultz, joined the company inspired by Elizabeth Holmes, only to discover that it was lying to doctors, patients, and regulators about the validity of its tests. After voicing his concerns and leaving the company — against the guidance of his family — he served as a source for the Wall Street Journal, whose reporting exposed the company’s fraud.

As Theranos began to suspect Tyler was collaborating with the Journal, Elizabeth Holmes personally threatened his career prospects. His grandfather chose Holmes over his own grandson in the matter. Theranos sued Tyler for violating his non-disclosure agreement, and his parents had to take out a mortgage on their house to cover the $400,000 in legal bills that they racked up in his defense.

The journalist who reported on the fraud later shared that without Tyler, he might never have been able to get the article published. Who knows how much damage would have been done to Theranos’s patients had he not spoken up when he did.

He provides a compelling example of how, in spite of the monolithic social and cultural forces that can push us towards fraud, evil, or merely misinformed decision-making, all it takes is one individual to wake the world from its delusion.

It’s proof that while we can be influenced by our environments, we don’t have to be victims of them. That we can choose to call fraud fraud when we see it. And in doing so, we can fight back against the social forces and cognitive biases that can make otherwise good people do bad things.

Stressed out by this crazy world?

Sign up for our free 1 week email course on applying Stoic philosophy to reduce stress, get more done, and find a career you love.
