Saturday, September 6, 2008
In order to study decision-making in complex environments, Dietrich Dorner and his associates created a computer simulation in which participants took on the role of a small town "mayor." The fictional town controlled a municipally owned watch factory, and interacted with a number of constituents, including a bank, retail stores, medical practices and restaurants. The "mayors" were allowed to control spending and make important strategic decisions over a simulated ten year period.
Many of the 48 mayors ran their small towns into economic ruin. In these cases, unemployment ran rampant, home ownership dropped, businesses failed, and the citizenry was, as a result, quite unhappy with the leadership displayed by their mayor.
Other mayors, though, presided over towns displaying remarkable economic growth, characterized by housing growth, financial success of business institutions, and a delighted citizenry.
As they examined the differences between the "good" and the "bad" mayors, the Dorner group found that success was quite predictable, varying according to the decision behaviors of the mayors. They found:
The good mayors made more decisions than did the bad ones over the ten-year period. Moreover, good mayors tended to make more and more decisions over time, while the bad mayors made a number of decisions early on and then tended to stick with the plans they'd made early in the process. As the game rolled out, though, the good mayors continued to find more possibilities for influencing the fate of their town than did the bad mayors.
As the authors note, "a town is a complex system of interlocking economic, ecological, and political components." It is impossible to make a decision about one aspect of municipal or economic life without impacting some other part of the system. The good mayors were able to see the town as a complex system, and were better at recognizing the cause-and-effect relationships among variables. Moreover, they anticipated possible unintended consequences of their decisions. When decisions did not produce the anticipated effect, the good mayors were able to "tweak" their initial decisions and nudge the town back onto the hoped-for economic track, rather than "staying the course" toward economic ruin.
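To make the idea of interlocking variables concrete, here is a minimal toy sketch in Python. It is not Dorner's actual simulation; the variables, coefficients, and decision rule are all invented for illustration. It shows how a single spending decision ripples through coupled variables, and why a mayor who re-decides each year can correct course while one who fixes a plan early cannot.

```python
# A toy model, not Dorner's actual simulation: three interlocked town
# variables, where a spending decision on one ripples into the others.
# All variable names and coefficients are invented for illustration.

def step(jobs, housing, revenue, spending):
    """Advance the toy town one year; each variable feeds the next."""
    jobs = jobs + 0.3 * spending - 0.02 * jobs        # spending creates jobs
    housing = housing + 0.03 * jobs - 0.01 * housing  # employed people buy homes
    revenue = revenue + 0.04 * housing - spending     # homes yield taxes; spending costs
    return jobs, housing, revenue

jobs, housing, revenue = 100.0, 100.0, 100.0
for year in range(1, 11):
    # A "good mayor" re-decides every year, tweaking spending against results;
    # a "bad mayor" would fix spending once and hold it for all ten years.
    spending = 12.0 if jobs < 100.0 else 8.0
    jobs, housing, revenue = step(jobs, housing, revenue, spending)
    print(f"year {year}: jobs={jobs:.1f} housing={housing:.1f} revenue={revenue:.1f}")
```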
Two essential elements emerged as researchers studied the difference between the good and bad mayors. First, the good mayors seemed to understand that they were dealing with a complex system, and sought to balance complex relationships among a variety of variables. They sought to understand the dynamic underpinnings of the town's economy, and the dynamic relationships among variables.
Second, the good mayors tended to ask more "why" questions. The good mayors tested their hypotheses more often than did the bad mayors. Bad mayors tended to take events at face value, and were less introspective about the ways their own decisions and behaviors had affected the fate of their town. For the bad mayors, "to propose a hypothesis was to understand reality; testing the hypothesis was unnecessary. Instead of generating hypotheses, they generated 'truths.'"
* * * * * * *
The Chernobyl disaster provides an excellent example of decision-making about complex systems gone awry. Most remember that a nuclear reactor exploded at the Chernobyl facility in the Ukraine, then part of the USSR, on April 26, 1986. Less widely recalled is the fact that the disaster was caused entirely by the human error of highly experienced and well-educated technicians.
A nuclear reactor is designed to generate intense heat in order to create steam that is then converted to electrical power. The nuclear reactivity must be contained within certain limits so that the reactor does not overheat. The balance between the degree of reactivity necessary to create power and the degree of reactivity that can send a reactor out of control is regulated by what are called control rods. To mitigate reactivity, rods are inserted. To increase reactivity, rods are withdrawn. At Chernobyl, there were 211 rods available for use in Reactor 4. A widely understood rule-of-thumb held that there should never be fewer than 15 rods inserted into the core.
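The rod-count rule amounts to a simple invariant. Here is a toy sketch in Python of what enforcing it might look like. Only the rod counts come from the account above; the linear "reactivity" formula is an invented stand-in, not real reactor physics.

```python
# A toy sketch of the rod-count rule described above. Only the rod counts
# come from the account; the linear "reactivity" formula is an invented
# stand-in, not real reactor physics.

TOTAL_RODS = 211
MIN_RODS_INSERTED = 15  # the rule-of-thumb minimum

def reactivity(rods_inserted):
    """More rods inserted means less reactivity (crude linear stand-in)."""
    return 1.0 - rods_inserted / TOTAL_RODS

def set_rods(rods_inserted):
    """Refuse any configuration that violates the minimum-rod invariant."""
    if rods_inserted < MIN_RODS_INSERTED:
        raise ValueError(f"unsafe: only {rods_inserted} rods inserted; "
                         f"minimum is {MIN_RODS_INSERTED}")
    return reactivity(rods_inserted)

print(set_rods(30))        # a safe configuration
try:
    print(set_rods(7))     # roughly the state the engineers created
except ValueError as err:
    print(err)
```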
On that day in April, 1986, a series of experiments was planned for the purpose of identifying improvements to existing safety systems. To conduct the experiments, engineers wanted to reduce the reactor's output to 25% of capacity. As it happened, though, the reactor was cut down to 1% capacity because of a miscalculation by an engineer. Running at such low capacity was known to create instability in the reactor, and the engineers were anxious to bring the process back up to the safe 25% level.
The group was anxious to get started with their study. To allow the reactor to "heat up," the engineers began a series of over-corrections, removing all but 6 to 8 of the rods, ignoring well-established safety rules and putting the nuclear process into a state well below safety standards.
With the benefit of hindsight, we can see that the engineers did something we have all experienced in one way or another: they over-steered their vehicle. We all know that you must not over-correct when a wheel slips while driving over ice, because the over-steering can send a car on an even more dangerous course than the one that spurred us to make the adjustment. Likewise, a ship's captain must not over-steer in adjusting his course, because once set in motion, the forces of physics cannot easily be undone.
Getting the reactor heated back up seemed to take an inordinately long time and, perhaps due to simple impatience, the engineers decided to perform their experiments even though the reactor had only returned to a 7% level. They knew that they were working with a system at dangerously low stability, but they were an award-winning group of well-qualified experts. Relying on their intuition and experience, the group continued, perhaps because they believed they were collectively too smart for anything bad to happen. To begin their experiment, the engineers shut off a steam pipe to observe the effects on other elements of the system.
Quite suddenly, the reactor began to react to the series of adjustments the engineers had made. With so many rods removed, the reactor was getting very hot very fast. Imagine the panic experienced by these men as they watched the reactor go out of control. Quickly, they attempted to reinsert the ever-critical rods. Unfortunately, the pipes that received the rods had bent from the heat and pressure, and the rods could not be pushed in. The reactor exploded within two minutes of the beginning of the experiment.
The immediate explosion took two lives.
On 26 April 1986 at 01:23:44 a.m. (UTC+3) reactor number four at the Chernobyl plant, near Pripyat in the Ukrainian SSR, exploded. Further explosions and the resulting fire sent a plume of highly radioactive fallout into the atmosphere and over an extensive geographical area. Four hundred times more fallout was released than had been by the atomic bombing of Hiroshima.[1]
The plume drifted over extensive parts of the western Soviet Union, Eastern Europe, Western Europe, Northern Europe, and eastern North America. Large areas in Ukraine, Belarus, and Russia were badly contaminated, resulting in the evacuation and resettlement of over 336,000 people. According to official post-Soviet data,[2] about 60% of the radioactive fallout landed in Belarus.
Sunday, August 10, 2008
Why We Don't Learn from Experience
Cognitive dissonance - When there is a discrepancy between reality as observed and one's worldview -- the perception of reality as constructed or conceived over one's lifetime -- people experience a sense of discomfort. Social psychologists call this uncomfortable feeling "cognitive dissonance." Theorists hypothesize that people will tend to reconstruct their perceptions of reality in ways that reduce the dissonance. Cognitive dissonance is a state of tension that occurs whenever a person holds two psychologically inconsistent cognitions in mind.
Leon Festinger coined the term after observing a cult whose members believed they were recipients of a "revelation" from God about the end times. The date of the apocalypse had been revealed to them through a prophet in exact terms: December 21, 19XX. Festinger stayed close to the group as the fateful day approached. When the end of times did not arrive, members displayed a surprising conviction that their belief system had not really been disproven. Rather, they tended to change their memories of what the prophet had said, and to find ways in which the revelation had indeed come true. Their belief system remained intact. Festinger noted that members appeared to find that admitting they'd been mistaken was too great a cost to bear in light of the emotional and life investments each had made in the group. In a nutshell, cognitive dissonance appears to be such a powerful motivator that people will distort their perception of reality in order to lessen or mitigate the discomfort.
Confirmation Bias -- Aronson notes that CD often leads to a "confirmation bias," characterized by both selective perception and selective memory.
Selective Perception - First, we see that individuals faced with an array of disparate information will "select" and attend to the tidbits of data that support their worldview, or set of expectations, and will ignore information that does not fit their expectations.
Selective and Distorted Memory - Second, studies of memory have shown that, over time, people's memories tend to change in ways that best "fit" their self-concept and general worldview.
T&A quote: p. 6 - "self-serving distortions of memory kick in and we forget or distort past events." We gradually come to believe our own lies the more we tell them. In fact, obvious misstatements of truth don't begin as full-blown lies. First a little, then a little more, and then more still. Quicksand is a good metaphor for this: you get sucked in very gradually.
They cite LBJ, p. 7 - "a president who has justified his actions to himself, believing that he has the truth, becomes impervious to self-correction." LBJ "had a fantastic capacity to persuade himself that the 'truth' which was convenient for the present was the truth..."
Self-Justification -- CD can also be employed to explain the strong tendency for people to engage in "self-justification." Tavris & Aronson: p. 29: "Dissonance is bothersome under any circumstance, but it is most painful to people when an important element of their self-concept is threatened -- typically when they do something that is inconsistent with their view of themselves."
T&A note (p. 20) that the confirmation bias "sees to it that no evidence -- the absence of evidence -- is evidence for what we believe." On p. 2 they note that "throughout his presidency GWB was the epitome of a man for whom even irrefutable evidence could not pierce his mental armor of self-justification." They note he was wrong in his assertion of WMD, a Saddam-Al Qaeda link, his estimate of the cost of the war, and his expectation that Iraqis would welcome the arrival of American soldiers with a joyful reception.
Lord Molson, British politician: (p. 17) --- "I will look at any additional evidence to confirm the opinion to which I have already come."
Many polls showed that many Americans believed WMD had been found in Iraq for months and even years after the military had concluded that there was nothing there to find. (Find examples.)
Hypocrisy: Aldous Huxley said "there is no such thing as a conscious hypocrite" (p. 5). They note Ted Haggard did not dwell on the hypocrisy of railing against homosexuality while participating in a sexual relationship with a male prostitute.
We now find that the brain itself appears to shut down in the presence of unexpected or dissonant information. See the Drew Westen article, "The Neural Basis of Motivated Reasoning." It appears that biases in perception are built into the way the brain works.
Once a decision is made, there are multiple mechanisms available for people to use to justify the rightness of the choice.
T&A point poignantly to the end of Casablanca, where Rick admonishes Ilsa that if she makes the wrong choice and stays in Casablanca with him, she will regret it: "maybe not today, maybe not tomorrow, but soon, and for the rest of your life..." A noble ending, but not a correct observation of human nature. T&A postulate that Ilsa would come to believe strongly in the rightness of her choice, no matter which direction she'd chosen -- the drive for self-justification is just that strong.
Look through the lens of dissonance theory -- Several studies have shown that the judgments of "experts" in many fields are no more sound than those of randomly chosen people. The difference is that the experts are supremely confident in their view, while others admit to doubts.
Moreover, the experts are far more susceptible to distortions of their perceptions due to CD because their professional and personal reputation is at stake, unlike the normal person. This suggests, of course, that experts may be less likely to admit mistakes and learn from experience than non-experts.
T&A - p. 36 - The story of Jeb Magruder, Watergate culprit, shows he was a good and decent person entering into his relationship with Liddy and the White House. Slowly, a little at a time, Magruder went along with dishonest and illegal actions, justifying each one as he did.
Milgram's subjects followed the same pattern. Most would have refused to deliver the maximum jolt when they entered the situation, but as the voltage slowly built up, they seemed to justify each increment because it was not much more than the last. The Milgram experiment is widely cited as showing that ordinary people will do vile and despicable things when convinced that they are doing something for the greater good.
pg. 43 - "Democrats will endorse an extremely restrictive welfare proposal, one usually associated with Republicans, if they think it has been proposed by the Democratic Party." page 43 p 43 - we are as unaware of our blind spots as fish are unaware of the water they swim in. these biases lead to wrong decisions because of the confirmation bias, and to continue to justify the decision in order to avoid cognitive dissonance.
Merck funded the research on its own drug, Vioxx. Its scientists did not uncover the dangers, despite available data. Only when independent scientists, neither funded by Merck nor having their own reputations at stake, examined the data was evidence of the risks associated with the drug exposed. Likely, Merck's investigators did not lie, nor did they "knowingly" distort their findings. Rather, they fell prey to cognitive biases that beset us all.
T&A p. 51 - Similarly, a group of scientists paid by a group of parents of autistic children produced a study showing a positive correlation between childhood autism and childhood vaccines. Six years later, 10 of the 13 researchers retracted some of the results, citing a conflict of interest by the lead author. Since then, five studies have found no causal relationship between vaccines and autism.
Us versus not-us -- p. 59 - "When things are going well, people feel pretty tolerant of other cultures and religions ... but when they are angry, anxious, or threatened, the default position is to activate their blind spots..." in a manner known as ethnocentrism: the belief that our own culture, nation, or religion is superior to all others. Stereotypes are bolstered by the self-justification bias. Prejudice, once accepted to even a small degree, is difficult to dislodge: cherry-picked pieces of data justify the initial belief, and prejudice grows a little bit at a time.
Hitler's henchman Albert Speer wrote in his memoirs: "people who turn their backs on reality are soon set straight by the mockery and criticism of those around them, which makes them aware they have lost credibility. In the Third Reich there were no such correctives, especially for those who belonged to the upper stratum. To the contrary, every self-deception was multiplied as in a hall of distorting mirrors, becoming a repeatedly confirmed picture of a fantastical dream world which no longer bore any relationship to the grim outside world. In those mirrors I could see nothing but my own face reproduced many times over." T&A refer to memory as "the self-justifying historian." p. 69
p. 69 - Most of us neither intend to lie nor intentionally deceive. Rather, we are self-justifying. "All of us, when we tell our stories, add details and omit inconvenient facts; we give the tale a small, self-enhancing spin." Reinforced for the story, we embellish it even more upon the next telling. "At the simplest level, memory smoothes out the wrinkles of dissonance by enabling the confirmation bias to hum along, selectively causing us to forget the discrepant information about beliefs we hold dear."
"if mistakes were made, memory helps us remember that they were made by someone else." pg 70 Anthony Greenwald refers to "the totalitarian ego" that ruthlessly destroys information it doesn;'t want to hear. [Find this XXX]
pg 71: great quote: "Confabulation, distortion, and plain forgetting are the foot soldiers of memory, and they are summoned to the front lines when the totalitarian ego wants to protect us from the pain and embarrassment of actions we took that are dissonant with our core self-images."
Nietzsche: "'I have done that, says my memory. 'I cannot have done that,' says my pride, and remains inexorable. Eventually--memory yields."
T&A page 71 - Memory is reconstructive: pieces of experience are rebuilt from different parts of the brain, much as a message on the internet is reconstructed from packets that traveled along disparate pathways. As we rebuild the core memory, we are subject to the bias of our own theories. Upon repeated rebuilding, our story begins to look more and more as we'd have liked it to be. This relates to the "source confusion" phenomenon.
When people learn that their memories are wrong, they are stunned.
We shape memories to fit our life story, rather than vice versa. See the Barbara Tversky and Elizabeth Marsh study -- they show we "spin the stories of our lives." Memories change to fit the story; this happens gradually, over time. Generally, memories change in the direction of self-enhancement.
See the story of the book Fragments, by Binjamin Wilkomirski. He created a false biography, but seemed to believe in the truth of his story. The book tells of his experiences in the Nazi death camps. There was a major problem with the story, though: as far as historians know, there were no "orphanages" in the Nazi concentration camps.
From Nassim Nicholas Taleb's The Black Swan: The Impact of the Highly Improbable. Memory is "a self-serving dynamic revision machine: you remember the last time you remembered the event and, without realizing it, change the story at every subsequent remembrance" (italics by original author). "We pull memories along causative lines, revising them involuntarily and unconsciously. We continuously re-narrate past events in the light of what appears to make what we think of as logical sense..."
He calls this "reverberation." memory corresponds to the strengthening of connections from an increase in brain activity in a given sector of the brain-- the more activity, the stonger the memory. The brain works this way: it creates narratives ;; and when a memory does not fit the narrative - we fix it so that it does. dreams are forms of narrative --
Taleb says that the story of the Maginot Line is suggestive. The French did learn from their experiences of WW1; they just "learned too specifically." In the run-up to WW2, they readied themselves to protect their country from a threat similar to the one it had faced two decades before.
p. 50 - We have a tendency to "tunnel": we focus on a few well-defined sources of uncertainty, on too specific a list of [possibilities], at the expense of unimagined possibilities. This is why we did not expect a 9/11-style attack.
p. 55 - Taleb describes what he calls "naive empiricism": "we have a tendency to look for instances that confirm our story and our vision of the world -- these instances are always easy to find... You take past instances that corroborate your theories and you treat them as evidence."
p. 62 - The narrative fallacy: "our vulnerability to overinterpretation and our predilection for compact stories over raw truth." Note how truths are carried across generations in the form of mythology.
p. 84 - The way to avoid the ills of the narrative fallacy is to favor experimentation over storytelling, experience over history, and clinical knowledge over theories.
pg 119 - "... we are explanation-seeking animals who tend to think everything has an identifiable cause and grab the most apparent one as the explanation. Yet there may not be a visible because; to the contrary, frequently there is nothing, not even a spectrum of possible explanations." Back to T&A: many of these false memories are not the result of calculated self-interest, but of self-persuasion. "The weakness of the relationship between accuracy and confidence is one of the best-documented phenomena in the 100-year history of eyewitness memory research." pg 108 The first interpretation of events is hard to break from. much evidence of this in the interrogation literature. IN this day of DNA evidence showing past cases to be mistakenly decided, s surprising number of prosecutors in these case refuse to admit that verdicts were wrong. "We impulsively decide we know what happened and then fit the evidence to support our conclusion, ignoring and discounting evidence that contradicts it." page 135 T&A When the evidence does not fit, the we tend to simply omit Many court-room based studies have shown that jurors will often make up their minds early in the process, and then selectively accept or reject the validity of evidence as it fits or contradicts their initial inclination. that is, when the evidence does not fit the story, they mitigate the importance of the evidence. In studies of police investigations: The confirmation bias sees to it that the prime suspect beacomes the only suspect. pg 137
"Once we have placed out bets, we don't want to entertain any information that casts doubt on that decision."
Example of spiraling self-justification: the hostage crisis in Iran, 1979. Americans viewed their country as being attacked without provocation. The Iranians were mad at the shah; Americans reacted, "What does that have to do with us?"
What started the hostage crisis? Each side blames the other. To Iranians, the Americans started the process in 1953, when the US aided a coup that deposed a democratically elected leader, Mohammed Mossadegh, and installed the Shah. Iranians blamed America as the shah accumulated great wealth and used his secret police, the SAVAK, to put down dissent in a reportedly brutal manner.
The engine of the back-and-forth downward spiral of blame and condemnation: self-justification.
General Westmoreland said during the Vietnam War: "The Oriental doesn't put the same high price on life as does the Westerner. Life is plentiful. Life is cheap in the Orient." More self-justification and stereotyping -- see Boston Globe, 7-20-05.
Two ways to reduce cognitive dissonance about torture: The first is to say that if we do it, it must not be torture. "We do not torture," said GWB; "we use an alternative set of procedures." The second is to state that the victims of torture simply got what they deserved.
"When George bush announced that he was launching a "crusade" against terrorism, most Americans welcomed the metaphor. In the West, crusade has positive connotations, associated with the good guys" -- think of the Billy Graham crusades... Batman and Robin, the Caped Crusaders.
Not so for most Muslims. The First Crusade of 1095 is still remembered. At that time, an army of Christians slaughtered the inhabitants of Jerusalem. To Muslims, it "might just as well have occurred last month, it's that vivid in the collective memory." p. 206
In the second presidential debate with John Kerry, October 8, 2004, Bush was asked for "three instances in which you came to realize you had made a wrong decision, and what you did to correct it." His response: "[When people ask about mistakes] they're trying to say 'Did you make a mistake in going to Iraq?' And the answer is 'Absolutely not.' It was the right decision.... Now, you asked what mistakes. I made some mistakes in appointing people, but I'm not going to name them. I don't want to hurt their feelings on national TV."
Lao Tzu:
A great nation is like a great man:
When he makes a mistake, he realizes it.
Having realized it, he admits it.
Having admitted it, he corrects it.
He considers those who point out his faults
as his most benevolent teachers.
Saturday, August 9, 2008
Quotes about Strategy and Decision
"To lack inteloligence is to be in the ring blinfolded."
General David M. Shoup, former Commandant of the US Marine Corps.
Three basic considerations in the threat-evaluation process:
Capabilities - What can the enemy do?
Intentions - What will the enemy do?
Vulnerabilities - What are the enemy's salient weaknesses?
In Rear Admiral J. C. Wylie's book, Military Strategy, he identifies two "elemental, perhaps irreducible" strategies, which he entitled "sequential" and "cumulative."
Sequential strategies constitute successive steps, each contingent on the one preceding it, that lead to the final objective.
Cumulative strategies constitute a collection of individual, random actions which collectively and eventually provide an overwhelming or crushing result.
General Beaufre, quoted in Collins, p. 16: "the game of strategy can, like music, be played in two 'keys.' The major key is direct strategy, in which force is the essential factor. The minor key is indirect strategy, in which force recedes into the background and its place is taken by psychology and planning. Naturally, any strategy may make use of both these keys in varying degree and the result is a large number of 'patterns...'"
Like Sun Tzu: "to subuethe enemy without fighting is the acme of skill."
See Beaufre for another key strategist of history. Here is a clip from Wikipedia:
In his book 1940: The Fall of France, Beaufre writes: The collapse of the French Army is the most important event of the twentieth century. This may sound strange to American ears, but in a certain point of view this Uchronie is pretty close to correct. Had the French Army held, the Hitler regime would have almost certainly fallen. There would have been no Nazi conquest of Western Europe, no Nazi assault on the Soviet Union, no Holocaust, most likely no Communist takeover of Eastern Europe. He later gave his views on France's fall during interviews for the now famous production by Thames Television, The World at War.
To understand the roots of this catastrophic defeat, one must study social history, political history, and military history. While the proximate causes are to be found in military factors (dispersion rather than concentration of armored forces, in particular), the root causes lie in social and political factors. Anyone reading about France in the 1930s will be struck by the deep divisions in its society, and the extraordinarily vitriolic nature of its politics. Consider, for example, the matter of Léon Blum. In the late 1930s, the following phrase was popular among French elites: "Better Hitler than Blum"
From John M. Collins, Grand Strategy, pg. 2
Collins says: "strategy occupies two distinctive but inter-related planes, one abstract, the other concrete. The former is peopled with strategic philosophers and theoriticians, the latter with practical planners." pg. 14
"Grand strategy, which embraces such niceties as bluff, negotiation, economic skulduggery, and psychological warfare..." this not like Clausweitz but more like Liddell Hart who said "the true aim is not so much to seek battle but to seek a strategic situatuion so advantageous that if it does not of itself produce the decision, its continuation by a battle is sure to achieve this."
me: Two stories should suffice to demonstrate the importance of strategic thinking. In WW2, strategy was not exemplified by planning for the invasion of Normandy, but by the grand strategy of which Normandy was just a piece.
A second example is Stonewall Jackson's maneuvers in Virginia, focusing not so much on the immediate enemy in front of him as on pulling Union forces away from Lee's primary forces positioned to the east.
Collins says "only a handful of strategic pioneers, like Alexander, Machiavelli, Lenin, Liddell Hart, and Mao, have devised innovative ways to substitute subtleties for brute force." pg. 16
"In sum, strategy is the are and science of options." Collins, pg 19
"the principle of Flexibility recognizes the inevitability of change in purposes, policies, plans and procedures." p. 25
Wylie (in Collins, p. 25) says "no one can predict with certainty the pattern war will take."
Saturday, July 26, 2008
Framing
- Establishing the question to be answered
- Articulating Assumptions
- Identifying criteria for decision-making
- Identifying appropriate and sufficient options
N&M (1986) suggest that decision making groups should make a standard practice of listing in three separate columns key elements of the immediate situation, namely those Known, Unclear, and Presumed.
They suggest keeping deliberation of "what to do" at bay until the situation surrounding the decision is characterized in this manner.
In corporate settings, we try to leave key assumptions in clear view on a white board or flipchart to remind decision-makers that their deliberations are built on a foundation of beliefs, which may or may not ultimately stand as facts. As the intelligence gathering process continues, the listing of assumptions can be changed with the ease of a white-board erasure, signaling to all that decisions should be tested against the latest set of assumptions.
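As an illustration, here is a minimal sketch in Python of such a three-column board. The class and method names are my own invention; only the Known / Unclear / Presumed labels come from N&M. The point is that items migrate between columns as intelligence comes in, just as an entry on the white board is erased and rewritten.

```python
# A minimal sketch of N&M's three-column framing practice. The class and
# method names are invented for illustration; only the column labels
# (Known, Unclear, Presumed) come from the practice described above.
from dataclasses import dataclass, field

@dataclass
class FramingBoard:
    known: list = field(default_factory=list)
    unclear: list = field(default_factory=list)
    presumed: list = field(default_factory=list)

    def promote(self, item: str) -> None:
        """An assumption confirmed by intelligence moves from Presumed to Known."""
        self.presumed.remove(item)
        self.known.append(item)

    def show(self) -> None:
        for label, column in [("Known", self.known),
                              ("Unclear", self.unclear),
                              ("Presumed", self.presumed)]:
            print(label + ":")
            for item in column:
                print("  - " + item)

board = FramingBoard()
board.known.append("Adversary invaded a neighbor in 1990")
board.unclear.append("Strength of internal factions")
board.presumed.append("Adversary has stockpiled weapons")
board.show()
```

The value of the practice lies less in the data structure than in keeping the Presumed column visible while deliberating, so that decisions are continually tested against the latest assumptions.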
Assumptions about key decisions are often evident as deliberants reference other decisions made in the past. Indeed, historical analogues can become so embedded in thinking that decision-makers are sometimes unaware of the extent to which the present decision process is flowing through a channel laid out, like a template, according to the way previous decisions and events unfolded.
The question of war with Iraq appears to have been influenced greatly by the World War II analogy. Rumsfeld's two key advisors, Paul Wolfowitz and Douglas Feith, shared a common background that was seminal in each man's choice of career and domain of interest, and appears to have shaped the way the two men viewed and understood the situation in Iraq.
Feith's father grew up in pre-WW2 Poland and Germany and became a sailor, adopting what Feith describes as an "uncommon occupation for a Jew." The senior Feith took heroic action, helping to smuggle Jews off the Continent and onto the British Isles. Captured by the Germans, held in solitary confinement and tortured, he escaped Germany, emigrated to America, and served in the U.S. merchant marine for the remainder of the war. Both of his parents, along with four of his sisters and all three of his brothers, were murdered in the holocaust. Partly as a result of his father's story, young Douglas Feith grew up with an abiding interest in history, and especially the circumstances of pre-WW2 Europe, in which British leaders sought to contain Hitler even as his father smuggled desperate Jews off the continent. Educating himself, Feith says he "read books on diplomacy, politics and government," and concluded that "nothing short of war could have stopped, let alone reversed, the Nazi aggression." Feith says that his study affected his views during the Vietnam War, as he began to question the prevailing view among his peers that war is never necessary. Indeed, he says, "the failures of appeasement in the 1930s made me skeptical about the promises of demonstrably bad actors -- tyrants, murderers, liars, terrorists and the like."
Paul Wolfowitz's family history is remarkably similar to that of Feith, though Feith does not note the similarities of the factors that shaped their thinking in his War & Decision. Wolfowitz's father, too, was a holocaust survivor.
Though he himself left Poland after WW1, the rest of his family perished in the holocaust.[12] Wikipedia notes that "As a boy, Wolfowitz devoured books about the Holocaust and Hiroshima—what he calls 'the polar horrors'".[3] Speaking of the influence of the Holocaust on his views, Wolfowitz said:
"That sense of what happened in Europe in World War II has shaped a lot of my views ... It's a very bad thing when people exterminate other people, and people persecute minorities. It doesn't mean you can prevent every such incident in the world, but it's also a mistake to dismiss that sort of concern as merely humanitarian and not related to real interest."[12] He told the NY Times that
As Wolfowitz observed the post-Gulf War American policy of "containing" the Iraqi threat to peace, he saw similarity to the British efforts to contain and appease Hitler's threats to pre-WW2 Europe. He told the NY Times (Ricks, p. 16) that "that sense of what happened in Europe in World War 2 has shaped a lot of my views."
World War II as Analogy to Iraq
Strategic thinking is characterized by openness to new and different ideas. And one way to generate new and different perspectives on strategic situations is through the use of metaphor, or its close relative analogy, perhaps the most advanced form of human thinking. As Aristotle said in Poetics, “the greatest thing by far is to be a master of metaphor.” It is “a sign of genius, since a good metaphor implies an intuitive perception of the similarity in dissimilars.”
In their Harvard Business Review article entitled “How Strategists Really Think,” Giovanni Gavetti and Jan W. Rivkin show that reasoning by analogy plays a major role in the thinking of successful strategists. As an example, these writers point to Intel chairman Andy Grove’s story of how he came up with an important business strategy. Attending a management seminar, Grove heard the story of how fledgling “mini-mills” in the steel industry began in the 1970s to offer a low-end product—inexpensive concrete-reinforcing bars known as rebar. Establishing market share with the low-end products, these steel companies then began to migrate up the hierarchy of products toward the higher-end, more lucrative steel products. U.S. Steel, which had ceded the low-end products to the smaller and seemingly insignificant players, was caught unawares by the companies attacking the market for their core business and lost market share over a number of years.
An epiphany struck Andy Grove as he sat in that management seminar, thinking about the steel industry. Using what Gavetti and Rivkin call “analogical thinking,” Grove saw that Intel was sitting in a similar situation to that of U.S. Steel in the 1970s. Intel had theretofore leaned toward ceding low-end computer chips to niche players, a strategy that, Grove now realized, would put Intel in a dangerous situation. He began to see low-end computers as “digital rebar,” a metaphorical image that helped him in articulating his strategy to Intel management. “If we lose the low end today,” Grove said, “ we could lose the high end tomorrow.” As a result of this thinking, and the deliberations that followed, Intel redoubled its efforts to market the low-end “Celeron processor” for low-end personal computers.
Though a mental model—a hypothesis about cause and effect—provides a useful way of understanding the dynamics and working of the world around us, blind adherence to entrenched models can be dangerous. Once we close our eyes to disconfirming evidence, once we fail to see the weaknesses of our assumptions about cause and effect, we have failed as systems thinkers.
History, of course, is replete with examples of people adhering stubbornly to old paradigms despite overwhelming evidence that a new way of thinking has become necessary.
Mental models become the frames through which we view the world. We attend to what is inside our frame, oblivious sometimes to what occurs outside our frames, which can lead to dangerous blind spots. Frames can be useful insofar as they direct our attention toward the information we seek. But they can also constrict our peripheral vision, keeping us from noticing important information and, perhaps, opportunities. Once liberating, mental models can become shackles.
As an illustration of the way in which mental models and frames can get out of hand, consider Donald Schon’s concept of a generative metaphor. A generative metaphor is an “implicit metaphor that can cast a kind of spell on a community. All solutions are understood in terms of the implicit metaphor.” Some work cultures, for example, use a sports analogy as their generative metaphor, ubiquitously describing events in sports language and casting solutions as “game plans.” A generative metaphor like this can be healthy, but it can also restrict creativity and problem-solving, since the “team” may miss out on ideas and options not endemic to the metaphorical world at hand.
At times, an over-used generative metaphor can lead to a group dynamic known as groupthink. When cultural propensities like this become problematic, leaders can stimulate positive organizational change by introducing new and useful generative metaphors as they communicate with others. The new metaphor can provide people with a lens through which to see things anew and lead to positive change in the work atmosphere and business results.
Turning to history for guidance is the essence of wisdom. Thucydides, the Greek historian of (XXX BC), said that he "wrote for those who want to understand clearly the events which happened in the past and which (human nature being what it is) will at some time or other and in much the same ways be repeated in the future."
N & M suggest "boarding" the Likeness and Differences between the present situation and a given analogy as a way of finding useful ways of thinking while limiting over-use of and particluar guiding metaphor. Had DOD decision-makers used such a process, they may have produced a chart like the following:
Likenesses:
- Saddam, like Hitler, was a tyrannical leader who controlled his minions through intimidation.
- Saddam had once tried and succeeded at over-running a neighboring country (Kuwait) with the use of conventional armored force, much as Hitler's armed forces overwhelmed, say, the Netherlands in 1940.
- Saddam did not hesitate to use torture or maiming in controlling his own people.
- Henchmen like Saddam's sons (XX) displayed ruthlessness reminiscent of (XXX).
Differences:
- Unlike the liberation of WW2 France, an American-led victory did not free an otherwise united people.
- Unlike the vanquished WW2 Germany, surviving Iraqis were divided into multiple factions, some with an interest in continued strife in the country as battles heated up for control of the future.
- The liberated France of WW2 reestablished a country with a strong sense of national identity, culture and language. Iraq had been cobbled together (when XXX)
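For illustration only, here is how the same boarding practice might be captured in Python, with entries abbreviated from the lists above. The structure and names are my own invention, not N&M's.

```python
# The same boarding idea applied to a historical analogy; a sketch only,
# with the entries abbreviated from the Likenesses/Differences lists above.
analogy_board = {
    "analogy": "WW2 Europe",
    "likenesses": [
        "Tyrannical leader ruling through intimidation",
        "Prior conventional invasion of a neighbor (Kuwait / Netherlands)",
    ],
    "differences": [
        "Liberated population divided into rival factions, not united",
        "No strong pre-existing national identity to restore",
    ],
}

for key in ("likenesses", "differences"):
    print(key.capitalize() + ":")
    for entry in analogy_board[key]:
        print("  - " + entry)
```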
Assumption 1: There is a nexus between Saddam Hussein's Iraq and the acts committed by al Qaeda terrorists on September 11, 2001.
Years later, a memo written by Wolfowitz surfaced during congressional investigations. The memo appears to imply an assumption by DOD officials that a link between Iraq and al Qaeda exists, and simply needs to be found. From an LA Times article by Peter Spiegel, April 06, 2007:
Just four months after the Sept. 11 attacks, then-Deputy Defense Secretary Paul D. Wolfowitz dashed off a memo to a senior Pentagon colleague, demanding action to identify connections between Iraqi dictator Saddam Hussein’s regime and Al Qaeda.
“We don’t seem to be making much progress pulling together intelligence on links between Iraq and Al Qaeda,” Wolfowitz wrote in the Jan. 22, 2002, memo to Douglas J. Feith, the department’s No. 3 official.
Using Pentagon jargon for the secretary of Defense, Donald H. Rumsfeld, he added: “We owe SecDef some analysis of this subject. Please give me a recommendation on how best to proceed. Appreciate the short turn-around.”
Wolfowitz’s memo, released Thursday, is included in a recently declassified report by the Pentagon’s inspector general. The memo marked the beginnings of what would become a controversial yearlong Pentagon project supervised by Feith to convince the most senior members of the Bush administration that Hussein and Al Qaeda were linked – a conclusion that was hotly disputed by U.S. intelligence agencies at the time and has been discredited in the years since.
Hear Feith defend his role in pre-Iraq decision-making here:
http://www.npr.org/templates/player/mediaPlayer.html?action=1&t=1&islist=false&id=7309878&m=7309879
Eventually, the decision was framed from the perspective of the DOD "neo-cons." Bush would make a choice to go to war in the interest of spreading democracy. The world was told, though, that war was necessary to defend the region from weapons of mass destruction. Defending the decision process years later, Feith insists that war was necessary regardless of the presence of WMD. Rather, he states in War & Decision that XXXXX. The key error made by the Bush administration, he suggests, was not in assuming the presence of WMD, but in creating that impression on the global audience. The Administration, he says, should have been forthright in portraying the war as a matter of standing up for freedom.
Learning from Experience
Group Process
Thursday, July 3, 2008
Paul Wolfowitz and the World War II Frame
Nearly an entire generation of Wolfowitz's paternal family was lost in the holocaust of WWII. This background appears to have shaped Wolfowitz's politics and worldview.