
Rethinking Trust

DBR | Issue 1 (January 2008)
 
Despite deceit, greed, and incompetence on a previously unimaginable scale, people are still trusting too much.
 
For the past two decades, trust has been touted as the all-powerful lubricant that keeps the economic wheels turning and greases the right connections—all to our collective benefit. Popular business books proclaim the power and virtue of trust. Academics have enthusiastically piled up study after study showing the varied benefits of trust, especially when it is based on a clear track record, credible expertise, and prominence in the right networks.
 
Then along came Bernie. There was “something about this person, pedigree, and reputation that inspired trust,” mused one broker taken in by Bernard Madoff, who confessed to a $65 billion Ponzi scheme—one of the largest and most successful in history. On the surface, Madoff possessed all the bona fides—the record, the résumé, the expertise, and the social connections. But the fact that so many people, including some sophisticated financial experts and business leaders, were lulled into a false sense of security when dealing with Madoff should give us pause. Why are we so prone to trusting?
 
Madoff is hardly the first to pull the wool over so many eyes. What about Enron, WorldCom, Tyco, and all the other corporate scandals of the past decade? Is there perhaps a problem with how we trust?
 
Highlights—and lowlights—in the public’s trust of business (Located at the end of this article)
 
I have been grappling with this question for most of my 30 years as a social psychologist, exploring both the strengths and the weaknesses of trust. In the wake of the recent massive and pervasive abuses—and with evidence of more scandals surfacing each day—I think it’s worth taking another look at why we trust so readily, why we sometimes trust poorly, and what we can do about it. In the following pages, I present the thesis that human beings are naturally predisposed to trust—it’s in our genes and our childhood learning—and by and large it’s a survival mechanism that has served our species well. That said, our willingness to trust often gets us into trouble. Moreover, we sometimes have difficulty distinguishing trustworthy people from untrustworthy ones. At a species level, that doesn’t matter very much so long as more people are trustworthy than not. At the individual level, though, it can be a real problem. To survive as individuals, we’ll have to learn to trust wisely and well. That kind of trust—I call it tempered trust—doesn’t come easily, but if you diligently ask yourself the right questions, you can develop it.
 
Let’s begin by looking at why we’re so prone to trust.
 
To Trust Is Human
It all starts with the brain. Thanks to our large brains, humans are born physically premature and highly dependent on caretakers. Because of this need, we enter the world “hardwired” to make social connections. The evidence is impressive: Within one hour of birth, a human infant will draw her head back to look into the eyes and face of the person gazing at her. Within a few more hours, the infant will orient her head in the direction of her mother’s voice. And, unbelievable as it may seem, it’s only a matter of hours before the infant can actually mimic a caretaker’s expressions. A baby’s mother, in turn, responds and mimics her child’s expression and emotions within seconds.
 
In short, we’re social beings from the get-go: We’re born to be engaged and to engage others, which is what trust is largely about. That has been an advantage in our struggle for survival. As social psychologist Shelley Taylor noted in her summary of the scientific evidence, “Scientists now consider the nurturant qualities of life—the parent-child bond, cooperation, and other benign social ties—to be critical attributes that drove brain development...accounting for our success as a species.” The tendency to trust made sense in our evolutionary history.
 
Research has shown that the brain chemistry governing our emotions also plays a role in trust. Paul Zak, a researcher on the cutting edge of the new field of neuroeconomics, has demonstrated, for instance, that oxytocin, a powerful natural chemical found in our bodies (which plays a role in a mother’s labor and milk production) can boost both trust and trustworthiness between people playing experimental trust games. (Even a squirt of oxytocin-laden nasal spray is enough to do it.) Other research has also shown how intimately oxytocin is connected with positive emotional states and the creation of social connections. It’s well documented that animals become calmer, more sedate, and less anxious when injected with oxytocin.
 
Trust kicks in on remarkably simple cues. We’re far more likely, for example, to trust people who are similar to us in some dimension. Perhaps the most compelling evidence of this comes from a study by researcher Lisa DeBruine. She developed a clever technique for creating an image of another person that could be morphed to look more and more (or less and less) like a study participant’s face. The greater the similarity, DeBruine found, the more the participant trusted the person in the image. This tendency to trust people who resemble us may be rooted in the possibility that such people might be related to us. Other studies have shown that we like and trust people who are members of our own social group more than we like outsiders or strangers. This in-group effect is so powerful that even random assignment into small groups is sufficient to create a sense of solidarity.
 
As psychologist Dacher Keltner and others have shown, physical touch also has a strong connection to the experience of trust. In one experiment involving a game widely used to study decisions to trust, an experimenter made it a point, while describing the task, to ever so lightly touch the backs of individuals as they were about to play the game. People who received a quick and unobtrusive touch were more likely to cooperate with, rather than compete against, their partner. It’s no coincidence, Keltner noted, that greeting rituals throughout the world involve touching—witness the firm, all-American handshake.
 
So what does all this research add up to? It shows that it often doesn’t take much to tip us toward trust. People may say they don’t have a lot of trust in others, but their behavior tells a very different story. In fact, in many ways, trust is our default position; we trust routinely, reflexively, and somewhat mindlessly across a broad range of social situations. As clinical psychologist Doris Brothers succinctly put it, “Trust rarely occupies the foreground of conscious awareness. We are no more likely to ask ourselves how trusting we are at any given moment than to inquire if gravity is still keeping the planets in orbit.” I call this tendency presumptive trust to capture the idea that we approach many situations without any suspicion. Much of the time this predisposition serves us well. Unless we’ve been unfortunate enough to be victims of a major violation of trust, most of us have had years of experiences that affirm the basic trustworthiness of the people and institutions around us by the time we become adults. Things seldom go catastrophically wrong when we trust, so it’s not entirely irrational that we have a bias toward trust.
 
But Our Judgment Is Sometimes Poor
If it’s human to trust, perhaps it’s just as human to err. Indeed, a lot of research confirms it. Our exquisitely adapted, cue-driven brains may help us forge trust connections in the first place, but they also make us vulnerable to exploitation. In particular, our tendency to judge trustworthiness on the basis of physical similarities and other surface cues can prove disastrous when combined with the way we process information.
 
One tendency that skews our judgment is our proclivity to see what we want to see. Psychologists call this the confirmation bias. Because of it we pay more attention to, and overweight in importance, evidence supporting our hypotheses about the world, while downplaying or discounting discrepancies or evidence to the contrary. In one laboratory game I conducted, individuals who were primed to expect a possible abuse of trust looked more carefully for signs of untrustworthy behavior from prospective partners. In contrast, those primed with more positive social expectations paid more attention to evidence of others’ trustworthiness. Most important, individuals’ subsequent decisions about how much to trust the prospective partners were swayed by those expectations.
 
A confirmation bias wouldn’t be so bad if we weren’t heavily influenced by the social stereotypes that most of us carry around in our heads. These stereotypes reflect (often false) beliefs that correlate observable cues (facial characteristics, age, gender, race, and so on) with underlying psychological traits (honesty, reliability, likability, or trustworthiness). Psychologists call these beliefs implicit theories, and the evidence is overwhelming that we aren’t conscious of how they affect our judgment. Most of the time our implicit personality theories are pretty harmless; they simply help us categorize people more quickly and render social judgments more swiftly. But they can cause us to overestimate someone’s trustworthiness in situations where a lot is at stake (for instance, our physical safety or financial security).
 
To make matters worse, people tend to think their own judgment is better than average—including their judgment about whom to trust. In a negotiation class I teach, I routinely find that about 95% of MBA students place themselves in the upper half of the distribution when it comes to their ability to “size up” other people accurately, including how trustworthy, reliable, honest, and fair their classmates are. In fact, more than 77% of my students put themselves in the top 25% of their class, and about 20% put themselves in the top 10%. This inflated sense of our own judgment makes us vulnerable to people who can fake outward signs of trustworthiness.
 
It’s not just biases inside our heads that skew our judgment. We often rely on trusted third parties to verify the character or reliability of other people. These third parties, in effect, help us “roll over” our positive expectations from one known and trusted party to another who is less known and trusted. In such situations, trust becomes, quite literally, transitive. Unfortunately, as the Bernie Madoff case illustrates, transitive trust can lull people into a false sense of security. The evidence suggests that Madoff was a master at cultivating and exploiting social connections. One of his hunting grounds was the Orthodox Jewish community, a tight-knit social group.
 
The biases described thus far contribute to errors in deciding whom to trust. Unfortunately, the wiring in our brains can also hinder our ability to make good decisions about how much risk to assume in our relationships. In particular, researchers have identified two cognitive illusions that increase our propensity to trust too readily, too much, and for too long.
 
The first illusion causes us to underestimate the likelihood that bad things will happen to us. Research on this illusion of personal invulnerability has demonstrated that we think we’re not very likely to experience some of life’s misfortunes, even though we realize objectively that such risk exists. Thus, although we know intellectually that street crime is a major problem in most cities, we underestimate the chances that we will become victims of it. One reason for this illusion, it’s been argued, is the ease with which we engage in a kind of compensatory calculus and call up from memory all the steps we’ve taken to mitigate such risks (for instance, avoiding dark alleys or making it a habit to cross the street when we see an ominous stranger approaching). The second and closely related illusion is unrealistic optimism. Numerous studies have shown that people often overestimate the likelihood that good things will happen to them—that they will marry well, have a successful career, live a long life, and so on. Even when people are given accurate information regarding the true odds of such outcomes, they still tend to think they will do better than average.
 
As if all these biases and illusions weren’t enough, we also have to contend with the fact that the very simplicity of our trust cues leaves us vulnerable to abuse. Unfortunately for us, virtually any indicator of trustworthiness can be manipulated or faked. A number of studies indicate that detecting the cheaters among us is not as easy as one might think. I have been studying deceptive behavior in my lab experiments—and teach about it in my business school courses on power and negotiation. In one exercise, I instruct some participants to do everything they can to “fake” trustworthiness during an upcoming negotiation exercise. I tell them to draw freely on all their intuitive theories regarding behaviors that signal trustworthiness. So what do these short-term sociopaths say and do? Usually, they make it a point to smile a lot; to maintain strong eye contact; to occasionally touch the other person’s hand or arm gently. (Women mention touching as a strategy more than men do and, in their post-exercise debriefs, also report using it more than men do.) They engage in cheery banter to relax the other person, and they feign openness during their actual negotiation by saying things like “Let’s agree to be honest and we can probably do better at this exercise” and “I always like to put all my cards on the table.”
 
Their efforts turn out to be pretty successful. Most find it fairly easy to get the other person to think they are behaving in a trustworthy, open, cooperative fashion (according to their negotiation partners’ ratings of these traits). Additionally, even when students on the other side of the bargaining table were (secretly) forewarned that half the students they might encounter had been instructed to try to fool them and take advantage of them, their ability to detect fakers did not improve: They didn’t identify fakers any more accurately than a coin flip would have. Perhaps most interesting, those who had been forewarned actually felt they’d done a better job of detecting fakery than did the other students.
 
We’ve seen why we trust and also why we sometimes trust poorly. Now it’s time to consider how to get trust back on track. If we are to harvest its genuine benefits, we need to trust more prudently.
 
Temper Your Trust
We can never be certain of another’s motivations, intentions, character, or future actions. We simply have to choose between trust (opening ourselves to the prospect of abuse if we’re dealing with an exploiter) and distrust (which means missing out on all the benefits if the other person happens to be honest). The shadow of doubt lingers over every decision to trust. That said, there is much that you can do to reduce the doubt—in particular, by adjusting your mind-set and behavioral habits. Here are some preliminary rules for tempering trust.
 
Rule 1 | Know yourself.
People generally fall into one of two buckets when it comes to their disposition toward trust. Some trust too much and too readily. They tend to take an overly rosy view, assuming that most people are decent and would never harm them. Thus they disclose personal secrets too early in relationships or share sensitive information in the workplace too indiscriminately, before prudent, incremental foundations of trust have been laid. They talk too freely about their beliefs and impressions of others, without determining whether the person they’re conversing with is a friend or a foe. Their overly trusting behavior sets them up for potential grief. In the other bucket are people who are too mistrustful when venturing into relationships. They assume the worst about other people’s motivations, intentions, and future actions and thus hold back, avoiding disclosing anything about themselves that might help create a social connection. They’re reluctant to reciprocate fully because they fear they’ll trust the wrong people. They may make fewer mistakes than their more trusting counterparts do, but they have fewer positive experiences because they keep others at a distance.
 
The first rule, therefore, is to figure out which of the buckets you fall into, because that will determine what you need to work on. If you’re good at trusting but are prone to trust the wrong people, you must get better at interpreting the cues that you receive. If you’re good at recognizing cues but have difficulty forging trusting relationships, then you’ll have to expand your repertoire of behaviors.
 
Rule 2 | Start small.
Trust entails risk. There’s no way to avoid that. But you can keep the risks sensible—and sensible means small, especially in the early phases of a relationship. Social psychologist David Messick and I coined the term shallow trust to describe the kinds of small but productive behaviors through which we can communicate our own willingness to trust.
 
A good example of this is a gesture made by Hewlett-Packard in the 1980s. HP’s management allowed engineers to take equipment home whenever they needed to, including weekends, without having to go through a lot of formal paperwork or red tape. That sent a strong message that the employees taking it off-site could be trusted. The fact that the equipment was subsequently returned validated that trust and, over time, cemented it. Imaginative acts of trust of this sort breed trustworthiness in return. They don’t involve much risk, but they broadcast that you’re willing to meet people halfway.
 
Salting your world with lots of small trusting acts sends a signal to others who are themselves interested in building good relationships, and decades of research by social psychologist Svenn Lindskold and others have proved that it leads to more positive interactions. It works because it’s incremental (and thus manages the risks intelligently) and contingent (that is, tied to reciprocity). By taking turns with gradually increasing risks, you build a strong and tempered trust with the other person.
 
Rule 3 | Write an escape clause.
In our study of trust dynamics in high-stakes situations, Debra Meyerson, Karl Weick, and I found that if people have a clearly articulated plan for disengagement, they can engage more fully and with more commitment. Hedging one’s bets in this way may seem as if it would undermine rather than reinforce trust. (After all, how can you expect me to trust you completely if I know you don’t trust me completely?) Yet, paradoxically, hedges allow everyone in an organization to trust more easily and comfortably—and even to take larger risks. Because I know your dependence on me is hedged a bit (you have a good backup plan), I have more breathing room as well. All of us know the system will survive the occasional, unavoidable mistakes that permeate any complex organization or social system.
 
A study I did of novice screenwriters trying to break into the entertainment industry, a domain where betrayals of trust are commonplace, provides a good example of how this works. To get a chance to develop their original ideas for movies or television shows, screenwriters first have to pitch them to agents, independent producers, and studio executives. Once they’ve done so, however, their ideas are out there—and always at risk of being stolen. (And it’s a real prospect: No less a writer than Art Buchwald had this experience when pitching an idea for a movie about an African prince visiting America—an idea that suddenly showed up on the screen a few years later as Coming to America, with Eddie Murphy in the starring role. In 1988, Buchwald sued Paramount, claiming the idea was his, and won.) One way to hedge the risk is to write up the treatment and register it first with the Writers Guild of America, which prevents others from claiming it as their own. A second important hedge in Hollywood is to have an agent who can pitch the idea so widely that its authorship becomes well known. Hollywood is a small world, and making something common knowledge in a small world is a good hedging strategy.
 
Rule 4 | Send strong signals.
To ensure that trust builds from small initial acts to deeper and broader commitments, it’s important to send loud, clear, and consistent signals. Some of the social signals we send are too subtle, though we don’t realize it. In one study I did exploring perceptions of reciprocal trust, I found that both managers and subordinates overestimated how much they were trusted by the people in the other category. This discrepancy in self-other perception—a trust gap—has an important implication: Most of us tend to underinvest in communicating our trustworthiness to others, because we take it for granted that they know or can readily discern our wonderful qualities of fairness, honesty, and integrity.
 
Sending strong and clear signals not only attracts other tempered trusters but also deters potential predators, who are on the lookout for easy victims sending weak and inconsistent cues. That’s why having a reputation for toughness is critical; reputation is among the most powerful ways we communicate who we are and what kinds of relationships we seek. Robert Axelrod, a pioneer in this stream of research, used the colorful term provocability to capture this idea: In order to keep your trust relations on an even keel, and the playing field level, you have to be willing not only to take chances by initially trusting a bit (signaling the willingness to cooperate) but also to retaliate strongly, quickly, and proportionately (signaling that you will strike back when your trust is abused). His research showed that you can be nice and not finish last—but only if you are firm and consistent with respect to punishing offenses.
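 
Axelrod’s conclusions came from computer tournaments of the iterated prisoner’s dilemma, in which the winning strategy, tit for tat, opens by cooperating and then simply mirrors whatever its partner did last. The short sketch below is illustrative only: the payoff values are the conventional textbook ones, and the strategy and function names are assumptions of mine, not anything from Axelrod’s tournament code.
 
```python
# A minimal sketch of the iterated prisoner's dilemma underlying Axelrod's
# tournaments (illustrative payoffs and names, not his original code).

PAYOFFS = {  # (my move, partner's move) -> my points
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    """Cooperate first; afterwards, repeat whatever the partner did last round."""
    return "C" if not history else history[-1][1]

def always_defect(history):
    """A stand-in for an untrustworthy partner: never cooperates."""
    return "D"

def play(strategy_a, strategy_b, rounds=10):
    """Play repeated rounds and return each side's total score."""
    history_a, history_b = [], []  # each entry: (own move, partner's move)
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strategy_a(history_a), strategy_b(history_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        history_a.append((move_a, move_b))
        history_b.append((move_b, move_a))
    return score_a, score_b

if __name__ == "__main__":
    print(play(tit_for_tat, always_defect))  # (9, 14): loses only the first round, then retaliates
    print(play(tit_for_tat, tit_for_tat))    # (30, 30): sustained mutual cooperation
```
 
Against an exploiter, tit for tat absorbs only the opening loss and then strikes back every round; against a fellow cooperator, both sides prosper. That is the “nice but provocable” pattern Axelrod describes.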
 
Rule 5 | Recognize the other person’s dilemma.
It’s easy for our self-absorbed brains to fall into the trap of thinking only from our own point of view: After all, it’s our own trust dilemmas that we find so anxiety provoking and attention getting. (Whom should I invest my money with? Whom should I allow to operate on me?) We often forget that the people we’re dealing with confront their own trust dilemmas and need reassurance when wondering whether (or how much) they should trust us. Some of the best trust builders I’ve studied display great attention to, and empathy for, the perspective of the other party. They are good mind readers, know what steps to take to reassure people, and proactively allay the anxiety and concerns of others.
 
A good example is President John F. Kennedy in his famous commencement address at American University in 1963, in which he praised the admirable qualities of the Soviet people and declared his willingness to work toward mutual nuclear disarmament with Soviet leaders. We know from Soviet memoirs that Premier Nikita Khrushchev was impressed, believing that Kennedy was sincere in trying to break from the past and could be trusted to work on this issue.
 
Rule 6 | Look at roles as well as people.
Many studies highlight the central importance of personal connections in the trust-building process—and appropriately so. This finding does not necessarily mean, however, that your trust in leaders or persons of power must be based on a history of sustained personal contact. Research that Debra Meyerson, Karl Weick, and I did on what we call swift trust showed that high levels of trust often come from very depersonalized interactions; in fact, personal relations sometimes get in the way of trust.
 
An important element of swift trust is the presence of clear and compelling roles. Deep trust in a role, we found, can be a substitute for personal experience with an individual. Role-based trust is trust in the system that selects and trains the individual. Robyn Dawes, a psychologist who specializes in human judgment, once observed, “We trust engineers because we trust engineering and that engineers [as individuals] have been taught to apply valid principles of engineering.” Thus, the role is a proxy for personal experience and guarantees expertise and motivation—in short, trustworthiness.
 
Of course, role-based trust isn’t foolproof. People on Main Street trusted people on Wall Street for a long time precisely because the U.S. financial system seemed to be producing reliable results that were the envy of the rest of the world. But flawed or not, in deciding whom to trust we still need to take the roles people play into account.
 
Rule 7 | Remain vigilant and always question.
When we’re hungry, we think about food until we’ve satisfied our hunger; then our minds move on to the next task confronting us. Human beings seek closure—and that’s true of our decisions in trust dilemmas as well. We worry about the trustworthiness of a prospective financial adviser, so we do our due diligence. Once we’ve made a decision, however, we tend not to revisit it so long as nothing seems to have changed. That’s dangerous.
 
In analyzing accounts of formative trust experiences, I’ve found that people whose trust was abused were often in situations where they discovered—too late—that the landscape had changed, but they failed to notice because they thought they had already long ago figured out the situation. Despite the fact that a boss’s attitude toward them had shifted or someone in the organization was poisoning their reputation, they were living with a false sense of security. They let their vigilance lapse.
 
The Madoff scandal is a good example. Many people who invested their life savings with Bernie Madoff initially did their due diligence. But once they’d made their decision, their attention turned elsewhere. They were too busy making their money to manage it—which they often didn’t feel comfortable doing anyway, because they didn’t think of themselves as financial experts. As Holocaust survivor and Nobel Peace Prize winner Elie Wiesel, one of Madoff’s many victims, stated, “We checked the people who have business with him, and they were among the best minds on Wall Street, the geniuses of finance. I teach philosophy and literature—and so it happened.”
 
The challenge in revisiting trust is that it requires questioning the people we trust, which is psychologically uncomfortable. But when it comes to situations in which our physical, mental, or financial security is on the line, our trust must be tempered by a sustained, disciplined ambivalence.
 
Our predisposition to trust has been an important survival skill for young children and, indeed, for us as a species. Recent evidence, moreover, shows that trust plays a critical role in the economic and social vitality of nations, further affirming its fundamental value. But what helps humanity survive doesn’t always help the human, and our propensity to trust makes us vulnerable as individuals. To safely reap the full benefits of trust, therefore, we must learn to temper it.
 
The seven rules I offer here by no means represent a complete primer on how to trust judiciously. The science of trust is also much less complete than we would like, although it is growing rapidly as neuroeconomists, behavioral economists, and psychologists use powerful new techniques such as brain imaging and agent modeling to discover more about how we make judgments about whom to trust and when. But for all their shortcomings, these rules will help you make a good start on what will be a lifelong process of learning how to trust wisely and well.
 
Highlights—and lowlights—in the public’s trust of business
People’s trust in business takes a hard hit during scandals and financial crises; nevertheless, trust hasn’t always been low. Government agencies, consumer groups, and businesses themselves have helped build confidence over time by acting as watchdogs and establishing safeguards. Still, the recent round of abuses reminds us that the system is far from foolproof and raises the question: Are we trusting business too much?
—The HBR Editors
 
1907
A scheme to corner the market in the stock of United Copper causes the collapse of Knickerbocker Trust and a financial panic. At one point J.P. Morgan locks leading bankers in a room until they agree to bail out weaker institutions.
 
1909
Moody’s publishes an analysis of the stocks and bonds of U.S. railroads, becoming the first to rate public-market securities. The growth of credit-rating agencies fosters trust by helping investors assess the riskiness of various assets.
 
1912
After the U.S. Attorney takes Coca-Cola to court for false advertising, the ad industry falls into public disfavor. A group of U.S. executives forms the National Vigilance Committee to police truth in advertising. Its subsidiaries, which resolve cases at the local level, become the Better Business Bureaus.
 
1913
The U.S. Congress founds the Federal Reserve System, as the fallout from the Panic of 1907 finally breaks the political resistance to creating a strong central bank to avert monetary shortages.
 
1922–1929
As confidence in the prospects of big industrial companies rises, ordinary investors start purchasing stocks, not just bonds. The U.S. stock market soars. In October 1929, it crashes to earth.
 
1930s
During the Great Depression, the Pecora Commission investigates the causes of the crash, uncovering a wide range of misdeeds in banking. The U.S. government helps rebuild trust in business by establishing regulatory bodies such as the FDIC and the SEC.
 
1941
Unprecedented government spending for World War II leads to abuses by contractors, especially in the United States. Harry Truman forms a special Senate committee to investigate.
 
1950s
Mutual funds, developed in the 1920s, take off as investors cautiously begin to give money to large intermediaries in order to distribute and manage their risks.
 
1960s
Ralph Nader’s Unsafe at Any Speed heightens awareness that business and consumer interests often clash. Congress passes a flurry of consumer safety and environmental protection laws.
 
1970s
The U.S. government creates several regulatory agencies to ensure that businesses act in the public interest. Securitization of loans begins, allowing home buyers to borrow from far-off lenders.
 
1978
Drexel Burnham Lambert uses risk-analysis tools to build a market for junk bonds that finance entrepreneurial companies and corporate takeovers. Junk bonds’ popularity dips after a trading scandal but resurges in the 1990s. By 2000, the use of junk bonds will become pervasive in corporate finance.
 
1981
Western governments start a far-reaching program of deregulation under Ronald Reagan and other leaders as people start trusting business more than government.
 
1983
The open-book management movement is born when Jack Stack, the new CEO of Springfield Remanufacturing Corporation, begins sharing financial information with all 119 employees and teaching them how to interpret it.
 
1984
A Union Carbide chemical gas leak in Bhopal, India, the worst industrial disaster in history, leads to greater skepticism about multinationals in developing countries.
 
1990s
Executive pay soars as U.S. companies experience a resurgence in competitiveness. The cult of the CEO grows, and global companies increasingly imitate the American approach to business.
 
1995
Excitement about the internet kicks off a period of “irrational exuberance,” in which investors bid up the stock prices of dot-com companies that have little or no profit.
 
1997
eBay institutes its feedback stars rating system, allowing buyers to rate the trustworthiness of sellers. The following year, its registered user base rises from 341,000 to 2.1 million.
 
2000
The technology-heavy NASDAQ Composite Index reaches a peak of 5048.62 in March—and only a few weeks later falls 25%. The internet bubble bursts.
 
2001
Enron collapses into bankruptcy, followed by WorldCom and other companies rife with fraud.
 
2006
Grameen Bank and its founder, Muhammad Yunus, jointly receive the Nobel Peace Prize, making Grameen Bank the first business awarded this honor.
 
2008
Excessive leveraging from securitization, combined with the bursting of the housing bubble, leads to a severe credit crunch, where banks stop trusting companies with loans, and investors stop trusting banks. The world plunges into a severe recession.
 
2009
After suffering a historic loss, AIG uses its government bailout—more than $170 billion—to pay employees millions in bonuses. President Obama calls it an “outrage” and asks the Treasury Department to “pursue every single legal avenue” to recoup the bonuses.