
Discussion 1, 2, 3, 4
Kim, attached are the chapter readings; please read them before doing the four discussions. Each discussion draws on different chapters, as listed below. Each discussion must be 300 words and include 2 references, one of which is the attached book.

Attached book:
Bolman, L. G., & Deal, T. E. (2017). Reframing Organizations (6th ed.). San Francisco, CA: Jossey-Bass.

Don't use plagiarized sources. Get Your Custom Assignment on
Discussion 1,2,3,4 Kim attached is the chapter readings please read before doing four discussions. each discussion shows different chapters where inf
From as Little as $13/Page

Read Chapters 1-2
1. Discussion Topic: What do the authors mean by reframing organizations, and why is it important?
Introduction to Human Services Leadership
Bolman & Deal, Chapters 1-2

2. Discussion Topic: Are leaders born or made? List the top ten traits of an effective leader.
The Role of the Leader in the Structural Frame
Bolman & Deal, Chapters 3-5

3. Discussion Topic: How do leadership and management differ?
The Human Resource Frame
Bolman & Deal, Chapters 6 and 17

4. Discussion Topic: How do you motivate employees?
Management of Human Resources: Interpersonal & Group Dynamics
Bolman & Deal, Chapters 7-8

CHAPTER 1
INTRODUCTION
THE POWER OF REFRAMING
By the second decade of the twenty-first century, the German carmaker Volkswagen and
the U.S. bank Wells Fargo were among the world’s largest, most successful, and most
admired firms. Then both trashed their own brand by following the same script. It’s a
drama in three acts:

1. Act I: Set daunting standards for employees to improve performance.

2. Act II: Look the other way when employees cheat because they think it’s the only way to meet the targets.

3. Act III: When the cheating leads to a media firestorm and public outrage, blame the workers and paint top managers as blameless.

In Wells Fargo’s case, the bank fired more than 5,000 lower-level employees but offered an
exit bonus of $125 million to the executive who oversaw them (Sorkin, 2016).
Volkswagen CEO Martin Winterkorn was known as an eagle-eyed micromanager but pleaded ignorance when his company admitted in 2015 that it had been cheating for years on emissions tests of its clean diesels. He was quickly replaced by Matthias Müller, who claimed that he didn’t know anything about VW’s cheating either. Müller also explained why VW wasn’t exactly guilty: “It was a technical problem. We had not the interpretation of the American law. We didn’t lie. We didn’t understand the question first” (Smith and Parloff, 2016). Apparently VW was smart enough to design clever software to fudge emissions tests but not smart enough to know that cheating might be illegal.
The smokescreen worked for years: VW sold a lot of diesels to consumers who wanted just what Volkswagen claimed to offer, a car at the sweet spot of low emissions, high
performance, and great fuel economy. The cheating apparently began around 2008, seven
years before it became public, when Volkswagen engineers realized they could not make
good on the company’s public, clean-diesel promises (Ewing, 2015). Bob Lutz, an industry
insider, described VW’s management system as “a reign of terror” and a culture where “performance was driven by fear and intimidation” (Lutz, 2015). VW engineers faced a
tough choice. Should they tell the truth and lose their jobs now or cheat and maybe lose
their jobs later? The engineers chose option B. The story did not end happily. In January 2017, VW pleaded guilty to cheating on emissions tests and agreed to pay a fine of $4.3 billion. In the same week, six VW executives were indicted for conspiring to defraud the United States.1 In the spring of 2017, VW’s legal troubles appeared to be winding down in the
United States, at a total cost of more than $20 billion, but were still ramping up in Germany,
where authorities had launched criminal investigations (Ewing, 2017).

The story at Wells Fargo was similar. For years, it had successfully billed itself as the “friendly, community bank.” It ran warm and fuzzy ads around themes of working together and caring about people. The ads did not mention that in 2010 a federal judge ruled that the bank had cheated customers by deliberately manipulating customer transactions to increase overdraft fees (Randall, 2010), nor that in August 2016, the bank agreed to pay a $4.1 million penalty for cheating student borrowers. But no amount of advertising would have helped in September 2016, when the news broke that employees in Wells Fargo branches, under pressure from their bosses to sell more “solutions,” had opened some two million accounts that customers didn’t want and usually didn’t know about, at least not until they received an unexpected credit card in the mail or got hit with fees on an account they didn’t know they had.
None of it should have been news to Wells Fargo’s leadership. Back in 2005, employees
began to call the firm’s human resources department and ethics hotline to report that some
of their coworkers were cheating (Cowley, 2016). The bank sometimes solved that problem
by firing the whistleblowers. Take the case of a branch manager in Arizona. While covering
for a colleague at another branch, he found that employees were opening accounts for fake
businesses. He called HR, which told him to call the ethics hotline. Ethics asked him for
specific data to support the allegations. He pulled data from the system and reported it. A
month later, he was fired for improperly looking up account information.
In 2013, the Los Angeles Times ran a story about phony accounts in some local branches.
Wells Fargo’s solution was not to lower the flame under the pot but to try and screw down
the lid even tighter. They kept up the intense push for cross-selling but sent employees to
ethics seminars where they were instructed not to open accounts customers didn’t want.
CEO John Stumpf achieved plausible deniability by proclaiming that he didn’t “want anyone ever offering a product to someone when they don’t know what the benefit is, or the customer doesn’t understand it, or doesn’t want it, or doesn’t need it” (Sorkin, 2016, p.
B1). But despite his public assurances, the incentives up and down the line still rewarded
sales rather than ethical squeamishness. Many employees felt they were in a bind: they’d
been told not to cheat, but that was the best way to keep their jobs (Corkery and Cowley,
2016). Like the VW engineers, many decided to cheat now and hope that later never came.

Maybe leaders at Volkswagen and Wells Fargo knew about the cheating and hoped it would never come to light. Maybe they were just out of touch. Either way, they were clueless, failing to see that their companies were headed for costly public-relations nightmares. But they are far from alone. Cluelessness is a pervasive affliction for leaders, even the best and brightest. Often it leads to personal and institutional disaster. But sometimes there are second chances.
Consider Steve Jobs. He had to fail before he could succeed. Fail he did. He was fired from
Apple Computer, the company he founded, and then spent 11 years in the wilderness
(Schlender, 2004). During this time of reflection he discovered capacities as a leader, and as a human being, that set the stage for his triumphant second act at Apple.
He failed initially for the same reason that countless managers stumble: like the executives
at VW and Wells Fargo, Jobs was operating on a limited understanding of leadership and
organizations. He was always a brilliant and charismatic product visionary. That enabled
him to take Apple from startup to major computer vendor, but didn’t equip him to lead
Apple to its next phase. Being fired was painful, but Jobs later concluded that it was the best thing that ever happened to him: “It freed me to enter one of the most creative periods of my life. I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful-tasting medicine, but I guess the patient needed it.”

During his period of self-reflection, Jobs kept busy. He focused on Pixar, a computer
graphics company he bought for $10 million, and on NeXT, a new computer company that
he founded. One succeeded and the other didn’t, but he learned from both. Pixar became so
successful it made Jobs a billionaire. NeXT never made money, but it developed technology
that proved vital when Jobs was recalled from the wilderness to save Apple from a death
spiral.
His experiences at NeXT and Pixar provided two vital lessons. One was the importance of
aligning an organization with its strategy and mission. He understood more clearly that he
needed a great company to build great products. Lesson two was about people. Jobs had
always understood the importance of talent, but now he had a better appreciation for the
importance of relationships and teamwork.
Jobs’s basic character did not change during his wilderness years. The Steve Jobs who
returned to Apple in 1997 was much like the human paradox fired 12 years earlier: demanding and charismatic, charming and infuriating, erratic and focused, opinionated and
curious. The difference was in how he interpreted what was going on around him and how
he led. To his long-time gifts as a magician and warrior, he had added newfound capacities
as an organizational architect and team builder.
Shortly after his return, he radically simplified Apple’s product line, built a loyal and
talented leadership team, and turned his old company into a hit-making machine as reliable
as Pixar. The iMac, iPod, iPhone, and iPad made Jobs the world’s most admired chief
executive, and Apple passed ExxonMobil to become the world’s most valuable company.
His success in building an organization and a leadership team was validated as Apple’s
business results continued to impress after his death in October 2011. Like many other
executives, Steve Jobs seemed to have it all until he lost it, but most never get it back.
Martin Winterkorn had seemed to be on track to make Volkswagen the world’s biggest car
company, and Wells Fargo CEO John Stumpf was one of America’s most admired bankers.
But both became so cocooned in imperfect worldviews that they misread their
circumstances and couldn’t see other options. That’s what it means to be clueless. You
don’t know what’s going on, but you think you do, and you don’t see better choices. So you
do more of what you know, even though it’s not working. You hope in vain that “steady on course” will get you where you want to go.
How do leaders become clueless? That is what we explore next. Then we
introduce reframing, the conceptual core of the book and our basic prescription for sizing things up. Reframing requires an ability to think about situations from more than one angle, which lets you develop alternative diagnoses and strategies. We introduce four distinct frames (structural, human resource, political, and symbolic), each logical and
powerful in capturing a detailed snapshot. Together, they help to paint a more
comprehensive picture of what’s going on and what to do.

Virtues and Drawbacks of Organized Activity
There was little need for professional managers when individuals mostly managed their own affairs, drawing goods and services from family farms and small local businesses. Since the dawn of the industrial revolution some 200 years ago, explosive technological and social changes have produced a world that is far more interconnected, frantic, and complicated. Humans struggle to avoid drowning in complexity that continually threatens to pull them in over their heads (Kegan, 1998). Forms of management and organization effective a few years ago are now obsolete. Sérieyx (1993) calls it the organizational big bang: “The information revolution, the globalization of economies, the proliferation of events that undermine all our certainties, the collapse of the grand ideologies, the arrival of the CNN society which transforms us into an immense, planetary village: all these shocks have overturned the rules of the game and suddenly turned yesterday’s organizations into antiques” (pp. 14-15).
Benner and Tushman (2015) argue that the twenty-first century is making managers’
challenges ever more vexing:
The paradoxical challenges facing organizations have become more numerous and
strategic (Besharov & Smith, 2014; Smith & Lewis, 2011). Beyond the innovation
challenges of exploration and exploitation, organizations are now challenged to be local
and global (e.g., Marquis & Battilana, 2009), doing well and doing good (e.g., Battilana &
Lee, 2014; Margolis & Walsh, 2003), social and commercial (e.g., Battilana & Dorado, 2010),
artistic or scientific and profitable (e.g., Glynn, 2000), high commitment and high
performance (e.g., Beer & Eisenstadt, 2009), and profitable and sustainable (e.g., Eccles,
Ioannou, & Serafeim, 2014; Henderson, Gulati, & Tushman, 2015; Jay, 2013). These
contradictions are more prevalent, persistent, and consequential. Further, these
contradictions can be sustained and managed, but not resolved (Smith, 2014).

The demands on managers’ wisdom, imagination, and agility have never been greater, and
the impact of organizations on people’s well-being and happiness has never been more
consequential. The proliferation of complex organizations has made most human
activities more formalized than they once were. We grow up in families and then start our
own. We work for business, government, or nonprofits. We learn in schools and
universities. We worship in churches, mosques, and synagogues. We play sports in teams,
franchises, and leagues. We join clubs and associations. Many of us will grow old and die in
hospitals or nursing homes. We build these enterprises because of what they can do for us.
They offer goods, entertainment, social services, health care, and almost everything else
that we use or consume.

All too often, however, we experience a darker side of these enterprises. Organizations can
frustrate and exploit people. Too often, products are flawed, families are dysfunctional,
students fail to learn, patients get worse, and policies backfire. Work often has so little
meaning that jobs offer nothing beyond a paycheck. If we believe mission statements and
public pronouncements, almost every organization these days aims to nurture its
employees and delight its customers. But many miss the mark. Schools are blamed for miseducating, universities are said to close more minds than they open, and government is
criticized for corruption, red tape, and rigidity.

The private sector has its own problems. Manufacturers recall faulty cars or inflammable
cellphones. Producers of food and pharmaceuticals make people sick with tainted products.
Software companies deliver bugs and “vaporware.” Industrial accidents dump chemicals, oil, toxic gas, and radioactive materials into the air and water. Too often, corporate greed, incompetence, and insensitivity create havoc for communities and individuals. The bottom line: We seem hard-pressed to manage organizations so that their virtues exceed their vices. The big question: Why?

Management’s Track Record
Year after year, the best and brightest managers maneuver or meander their way to the
apex of enterprises great and small. Then they do really dumb things. How do bright people
turn out so dim? One theory is that they’re too smart for their own good. Feinberg and
Tarrant (1995) label it the “self-destructive intelligence syndrome.” They argue that smart people act stupid because of personality flaws: things like pride, arrogance, and an
unconscious desire to fail. It’s true that psychological flaws have been apparent in brilliant,
self-destructive individuals such as Adolf Hitler, Richard Nixon, and Bill Clinton. But on the
whole, the best and brightest have no more psychological problems than everyone else.
The primary source of cluelessness is not personality or IQ but a failure to make sense of
complex situations. If we misread a situation, we’ll do the wrong thing. But if we don’t
know we’re seeing things inaccurately, we won’t understand why we’re not getting the
results we want. So we insist we’re right even when we’re off track.
Vaughan (1995), in trying to unravel the causes of the 1986 disaster that destroyed
the Challenger space shuttle and its crew, underscored how hard it is for people to
surrender their entrenched conceptions of reality:
They puzzle over contradictory evidence, but usually succeed in pushing it aside, until they come across a piece of evidence too fascinating to ignore, too clear to misperceive, too painful to deny, which makes vivid still other signals they do not want to see, forcing them to alter and surrender the world-view they have so meticulously constructed (p. 235).
So when we don’t know what to do, we do more of what we know. We construct our own
psychic prisons and then lock ourselves in and throw away the key. This helps explain a
number of unsettling reports from the managerial front lines:

Hogan, Curphy, and Hogan (1994) estimate that the skills of one half to three
quarters of American managers are inadequate for the demands of their jobs. Gallup
(2015) puts the number even higher, estimating that more than 80 percent of
American managers lack the talent they need. But most probably don’t realize it:
Kruger and Dunning (1999) found that the less competent people are, the more they
overestimate their performance, partly because they don’t know good performance
when they see it.

About half of the high-profile senior executives that companies hire fail within two
years, according to a 2006 study (Burns and Kiley, 2007).

The annual value of corporate mergers has grown more than a hundredfold since
1980, yet evidence suggests that 70 to 90 percent are unsuccessful in producing
any business benefit as regards shareholder value (KPMG, 2000; Christensen,
Alton, Rising, and Waldeck, 2011). Mergers typically benefit shareholders of the
acquired firm but hurt almost everyone else: customers, employees, and, ironically,
the buyers who initiated the deal (King et al., 2004). Stockholders in the acquiring
firm typically suffer a 10 percent loss on their investment (Agrawal, Jaffe, and
Mandelker, 1992), while consumers feel that they’re paying more and getting less.

Despite this dismal record, the vast majority of the managers who engineered
mergers insisted they were successful (KPMG, 2000; Graffin, Haleblian, and Kiley,
2016).

Year after year, management miscues cause once highly successful companies to
skid into bankruptcy. In just the first quarter of 2015, for example, 26 companies
went under, including six with claimed assets of more than $1 billion. (Among the
biggest were the casino giant, Caesars Entertainment, and the venerable electronics
retailer, RadioShack.)

Small wonder that so many organizational veterans nod in assent to Scott Adams’s
admittedly unscientific Dilbert principle: “the most ineffective workers are systematically moved to the place where they can do the least damage: management” (1996, p. 14).

Strategies for Improving Organizations
We have certainly made a noble effort to improve organizations despite our limited ability
to understand them. Legions of managers report to work each day with hope for a better
future in mind. Authors and consultants spin out a torrent of new answers and promising
solutions. Policymakers develop laws and regulations to guide or shove organizations on
the right path.
The most universal improvement strategy is upgrading management talent. Modern
mythology promises that organizations will work splendidly if well managed. Managers are
supposed to see the big picture and look out for their organization’s overall well-being.
They have not always been equal to the task, even when armed with the full array of
modern tools and techniques. They go forth with this rational arsenal to try to tame our
wild and primitive workplaces. Yet in the end, irrational forces too often prevail.
When managers find problems too hard to solve, they hire consultants. The number and
variety of advice givers keeps growing. Most have a specialty: strategy, technology, quality,
finance, marketing, mergers, human resource management, executive search,
outplacement, coaching, organization development, and many more. For every managerial
challenge, there is a consultant willing to offer assistance, at a price.
For all their sage advice and remarkable fees, consultants often make little dent in
persistent problems plaguing organizations, though they may blame the clients for failing
to implement their profound insights. McKinsey & Co., “the high priest of high-level consulting” (Byrne, 2002a, p. 66), worked so closely with Enron that its managing partner
(Rajat Gupta, who eventually went to jail for insider trading) sent his chief lawyer to
Houston after Enron’s collapse to see if his firm might be in legal trouble.2 The lawyer
reported that McKinsey was safe, and a relieved Gupta insisted bravely, “We stand by all the work we did. Beyond that, we can only empathize with the trouble they are going through. It’s a sad thing to see” (p. 68).
When managers and consultants fail, government recurrently responds with legislation,
policies, and regulations. Constituents badger elected officials to do something about a
variety of ills: pollution, dangerous products, hazardous working conditions,
discrimination, and low-performing schools, to name a few. Governing bodies respond by making policy. But policymakers don’t always understand the problem well enough to get the solution right, and a sizable body of research records a continuing saga of perverse
ways in which the implementation process undermines even good solutions (Bardach,
1977; Elmore, 1978; Freudenberg and Gramling, 1994; Gottfried and Conchas, 2016;
Peters, 1999; Pressman and Wildavsky, 1973). Policymakers, for example, have been trying
for decades to reform U.S. public schools. Billions of taxpayer dollars have been spent. The
result? About as successful as America’s switch to the metric system. In the 1950s Congress
passed legislation mandating adoption of metric standards and measures. More than six
decades later, if you know what a hectare is or can visualize the size of a 300-gram package
of crackers, you’re ahead of most Americans. Legislators did not factor into their solution
what it would take to get their decision implemented against longstanding custom and
tradition.
In short, the difficulties surrounding improvement strategies are well documented.
Exemplary intentions produce more costs than benefits. Problems outlast solutions. Still,
there are reasons for optimism. Organizations have changed about as much in recent
decades as in the preceding century. To survive, they had to. Revolutionary changes in
technology, the rise of the global economy, and shortened product life cycles have spawned
a flurry of efforts to design faster, more flexible organizational forms. New organizational
models flourish in companies such as Pret A Manger (the socially conscious U.K. sandwich shops), Google (the global search giant), Airbnb (a new concept of lodging), and Novo Nordisk (a Danish pharmaceutical company that includes environmental and social metrics in its bottom line). The dispersed collection of enthusiasts and volunteers who provide
content for Wikipedia and the far-flung network of software engineers who have developed
the Linux operating system provide dramatic examples of possibilities in the digital world.
But despite such successes, failures are still too common. The nagging question: How can
leaders and managers improve the odds for themselves as well as for their organizations?

Framing
Goran Carstedt, the talented executive who led the turnaround of Volvo’s French division in
the 1980s, got to the heart of a challenge managers face every day: “The world simply can’t be made sense of, facts can’t be organized, unless you have a mental model to begin with. That theory does not have to be the right one, because you can alter it along the way as information comes in. But you can’t begin to learn without some concept that gives you expectations or hypotheses” (Hampden-Turner, 1992, p. 167). Such mental models have many labels: maps, mind-sets, schema, paradigms, heuristics, and cognitive lenses, to name a few.3 Following the work of Goffman, Dewey, and others, we have chosen the
label frames, a term that has received increasing attention in organizational research as
scholars give greater attention to how managers make sense of a complicated and
turbulent world (see, e.g., Foss and Webber, 2016; Gray, Purdy, and Ansari, 2015;
Cornelissen and Werner, 2014; Hahn et al., 2014; Maitlis and Christianson, 2014). In
describing frames, we deliberately mix metaphors, referring to them as windows, maps,
tools, lenses, orientations, prisms, and perspectives, because all these images capture part
of the idea we want to convey.

A frame is a mental model, a set of ideas and assumptions, that you carry in your head to help you understand and negotiate a particular territory. A good frame makes it easier to know what you are up against and, ultimately, what you can do about it. Frames are vital
because organizations don’t come with computerized navigation systems to guide you
turn-by-turn to your destination. Instead, managers need to develop and carry accurate
maps in their heads.
Such maps make it possible to register and assemble key bits of perceptual data into a
coherent pattern: an image of what’s happening. When it works fluidly, the process takes the form of rapid cognition, the process that Gladwell (2005) examines in his best seller Blink. He describes it as a gift that makes it possible to read deeply into the narrowest slivers of experience. In basketball, the player who can take in and comprehend all that is happening in the moment is said to have “court sense” (p. 44). The military stresses “situational awareness” to describe the same capacity.

Dane and Pratt (2007) describe four key characteristics of this intuitive blink process:
It is nonconscious: you can do it without thinking about it and without knowing how you did it.

It is very fast: the process often occurs almost instantly.

It is holistic: you see a coherent, meaningful pattern.

It results in affective judgments: thought and feeling work together so you feel confident that you know what is going on and what needs to be done.

The essence of this process is matching situational cues with a well-learned mental
framework, a deeply held, nonconscious category or pattern (Dane and Pratt, 2007, p. 37). This is the key skill that Simon and Chase (1973) found in chess masters: they could
instantly recognize more than 50,000 configurations of a chessboard. This ability enables
grand masters to play 25 lesser opponents simultaneously, beating all of them while
spending only seconds on each move.
The same process of rapid cognition is at work in the diagnostic categories physicians rely
on to evaluate patients’ symptoms. The Hippocratic Oath to “do no harm”
requires physicians to be confident that they know what they’re up against before
prescribing a remedy. Their skilled judgment draws on a repertoire of categories and clues,
honed by training and experience. But sometimes they get it wrong. One source of error is
anchoring: doctors, like leaders, sometimes lock on to the first answer that seems right,
even if a few messy facts don’t quite fit. Your mind plays tricks on you because you see
only the landmarks you expect to see and neglect those that should tell you that in fact
you’re still at sea (Groopman, 2007, p. 65).

That problem tripped up leaders at Volkswagen, Wells Fargo, and countless other
organizations. Organizations are at least as complex as the human body, and the diagnostic
categories less well defined. That means that the quality of your judgments depends on the
information you have at hand, your mental maps, and how well you have learned to use
them. Good maps align with the terrain and provide enough detail to keep you on course. If
you’re trying to find your way around Beijing, a map of Chicago won’t help. In the same
way, different circumstances require different approaches.

Even with the right map, getting around will be slow and awkward if you have to stop and
study at every intersection. The ultimate goal is fluid expertise, the sort of know-how that
lets you think on the fly and navigate organizations as easily as you drive home on a
familiar route. You can make decisions quickly and automatically because you know at a
glance where you are and what you need to do next.

There is no shortcut to developing this kind of expertise. It takes effort, time, practice, and
feedback. Some of the effort has to go into learning frames and the ideas behind them.
Equally important is putting the ideas to use. Experience, one often hears, is the best
teacher, but that is true only if one learns from it. McCall, Lombardo, and Morrison (1988, p.
122) found that a key quality among successful executives was that they were great learners, displaying an extraordinary tenacity in extracting something worthwhile from their experience and in seeking experiences rich in opportunities for growth.

Reframing
Frames define the questions we ask and solutions we consider (Berger, 2014). John Dewey defined freedom as the power to choose among known alternatives. When managers’ options are limited, they make mistakes but too often fail to understand the source. Take a simple example: What is the sum of 5 plus 5? The only right answer is 10. Asked a different way, “What two numbers add up to ten?”, the number of solutions is infinite (once you include fractions and negative numbers). The two questions differ in how they are framed. Albert Einstein once observed: If I had a problem to solve and my whole life depended on the solution, I would spend the first fifty-five minutes determining the question to ask, for

CHAPTER 2
SIMPLE IDEAS, COMPLEX ORGANIZATIONS
Precisely one of the most gratifying results of intellectual evolution is the continuous opening
up of new and greater prospects.

Nikola Tesla1
September 11, 2001 brought a crisp and sunny late-summer morning to America’s east
coast. Perfect weather offered prospects of on-time departures and smooth flights for
airline passengers in the Boston-Washington corridor. That promise was shattered for
four flights bound for California when terrorists commandeered the aircraft. Two of the
hijacked aircraft attacked and destroyed the Twin Towers of New York’s World Trade
Center. Another slammed into the Pentagon. The fourth was deterred from its mission
by the heroic efforts of passengers. It crashed in a vacant field, killing all aboard. Like
Pearl Harbor in December 1941, 9/11 was a day that will live in infamy, a tragedy that
changed forever America’s sense of itself and the world.
Why did no one foresee such a catastrophe? In fact, some had. As far back as 1993,
security experts had envisioned an attempt to destroy the World Trade Center using
airplanes as weapons. Such fears were reinforced when a suicidal pilot crashed a
small private plane onto the White House lawn in 1994. But the mind-set of principals in the national security network was riveted on prior hijackings, which had almost always
ended in negotiations. The idea of a suicide mission, using commercial aircraft as
missiles, was never incorporated into homeland defense procedures.
In the end, 19 highly motivated young men armed only with box cutters were able to
outwit thousands of America’s best minds and dozens of organizations that make up the
country’s homeland defense system. Part of their success came from fanatical
determination, meticulous planning, and painstaking preparation. We also find a
dramatic version of an old story: human error leading to tragedy. But even the human-error explanation is too simple. In organizational life, there are almost always systemic
causes upstream of human failures, and the events of 9/11 are no exception.
The United States had a web of procedures and agencies aimed at detecting and
monitoring potential terrorists. Had those systems worked flawlessly, the terrorists
would not have made it onto commercial flights. But the procedures failed, as did those
designed to respond to aviation crises. Similar failures have marked many other well-publicized disasters: nuclear accidents at Chernobyl and Three Mile Island, the botched
response to Hurricane Katrina on the Gulf Coast in 2005, and the deliberate downing of
a German