Andrew Bacevich: Delta (Extravert) [IEE-Fi?, LSE-Si?; or EII?]
“The drones are an excuse to avoid thinking seriously about what the United States should be doing to address Al-Qaeda, or more broadly, the threat posed by violent Islamic radicalism.
And so what we need to do is demilitarize U.S. policy. Now, in Washington, that is sort of an unthinkable prospect because we’ve got a national security elite that is so much devoted to the assumption that military power provides our strong suit. But the evidence says otherwise.
The notion that somehow Al-Qaeda is the equivalent of Nazi Germany or Imperial Japan is foolish.”—Andrew Bacevich (Unmanned: America’s Drone Wars)
BREACH OF TRUST: How Americans Failed Their Soldiers and Their Country by Andrew J. Bacevich; pp. 15-27:
NATION AT WAR
How war, which once served to enhance American power and wealth, became a source of national rack and ruin.
[Chapter] 1. People’s War
War is an unvarnished evil. Yet as with other evils—fires that clear out forest undergrowth, floods that replenish soil nutrients—war’s legacy can include elements that may partially compensate (or at least appear to compensate) for the havoc inflicted and incurred.
For the United States, the Civil War offered one such occasion. To preserve the Union and destroy slavery, Americans served and sacrificed without stint. The citizen-soldiers who responded to the charge contained in the “Battle Hymn of the Republic”—“As He died to make men holy, let us die to make men free”—won a great victory. In doing so, they set the stage for the nation’s emergence in the latter part of the nineteenth century as the world’s preeminent economic power. Out of blood came muscle.
World War II proved to be a second such occasion for acquiring muscle, if not for other powers, then at least for the United States. Yet by 1941, in return for service and sacrifice, Americans expected rewards more tangible than the satisfaction of doing God’s will. Once again, citizen-soldiers would fight for freedom. Thanks to the New Deal, however, freedom meant something more than submission to market forces. It now implied some measure of reciprocity, with citizens guaranteed access to the minimum essentials of life.
In describing what was at stake in World War II, President Franklin D. Roosevelt called this “freedom from want.”* Making freedom thus defined available to the average American was by now becoming the job of political authorities in Washington. So in their approach to justifying war against the Axis, Roosevelt and his lieutenants shrewdly emphasized a shimmering consumer-oriented vision of democratic purpose.
* Franklin D. Roosevelt, “Four Freedoms Speech,” January 6, 1941, http://www.americanrhetoric.com/spee...urfreedoms.htm
In concrete terms, FDR explained, this translated into “economic understandings which will secure to every nation a healthy peacetime life for its inhabitants—everywhere in the world.” Yet as interpreted by the illustrator Norman Rockwell, “Freedom from Want” signified a happy American family gathered around a dining room table, piled high with all the bounty associated with the American tradition of Thanksgiving. By implication, freedom in this context meant not global free trade agreements but unfettered personal consumption.
To a greater extent than any prior conflict, mobilizing for World War II became an indisputably communal undertaking, involving quite literally everyone. So, too, did the war’s actual conduct. As a result, the historian William O’Neill writes, the United States fought World War II as a “people’s war.” Rather than “uphold[ing] personal gratification as the be all and end all of life,” Americans demonstrated a hitherto hidden capacity for government-prescribed collective action. The appetite for personal gratification did not disappear. Yet at least for the duration Americans proved willing to curb it.
In this regard, the cultural moment was propitious. For a short time, the distance separating elite, middlebrow, and popular artistic expression seemed to collapse. Proletarian impulses released by the Great Depression persisted into the war years, infused now with a sense of hope that the promise of American life might indeed find fulfillment—and soon. Yearning and expectation gradually displaced the anger and despair that had characterized the 1930s. On symphony stages, this popular mood found expression in works like Aaron Copland’s Fanfare for the Common Man (1942) and Appalachian Spring (1944). On Broadway, there was Oklahoma! (1943) by Richard Rodgers and Oscar Hammerstein. (“We know we belong to the land, and the land we belong to is grand!”) At the movies, Oscar-nominated films such as Mr. Smith Goes to Washington (1939), Our Town (1940), The Grapes of Wrath (1940), and Sergeant York (1941) all mined the rich veins of populism. In photography these tendencies suffused the social realism of Dorothea Lange and Walker Evans. In painting, American regionalists such as Thomas Hart Benton, Grant Wood, and John Steuart Curry paid homage to ordinary workers while expressing nostalgia for small-town and rural America. In a war-specific context, there was the memorable work of the cartoonist Bill Mauldin, creator of the “dogface” soldiers Willie and Joe. Elitism had not disappeared from the American scene, but for a time it was thrown on the defensive.
“In a democracy,” Undersecretary of War Robert Patterson declared in 1944, “all citizens have equal rights and equal obligations.” A graduate of Harvard Law School, Patterson was himself a combat veteran of World War I. “When the nation is in peril,” he continued, “the obligation of saving it should be shared by all, not foisted on a small percentage.” [Quoted in Keith Eiler, Mobilizing America: Robert P. Patterson and the War Effort, 1940-1945 (Ithaca, N.Y., 1997), p. 282. Patterson was testifying before the Senate Committee on Military Affairs.] With regard to obligations (if not rights), Patterson’s Axiom accurately described the Roosevelt administration’s approach to war. All would contribute to the cause. All would share in whatever burdens the war effort imposed. All (or mostly all) could expect to share in the benefits, the president himself promising “jobs for those who can work. Security for those who need it. The ending of special privilege for the few. The preservation of civil liberties for all.” [Roosevelt, “Four Freedoms Speech.”]
At least as important was this unspoken caveat: although achieving victory would require shared sacrifice, the president would seek to limit the pain and suffering that Americans would actually endure. The price of defeating the Axis promised to be high. Yet FDR intended, wherever possible, to offload that price onto others, while claiming for the United States the lion’s share of any benefits. For some (but not too much) pain, enormous gain — that describes the essence of U.S. grand strategy.
To an astonishing degree, Roosevelt and his lieutenants made good on both elements of this formula.
When it came to raising an army, therefore, inclusiveness became a defining precept. Rather than relying on volunteers, the United States implemented a system of conscription similar to the one devised for World War I. The draft took black and white, rich and poor, the famous and the obscure, Ivy Leaguers and high school dropouts. In order to field a force that peaked at twelve million serving members, the armed services inducted just about anyone meeting their mental and physical prerequisites. The sons of leading politicians like President Roosevelt served, as did the sons of multimillionaires like Joseph P. Kennedy. Hollywood idols Douglas Fairbanks Jr., Henry Fonda, Clark Gable, Tyrone Power, and James Stewart found themselves in uniform. So, too, did A-list movie directors Frank Capra, John Ford, John Huston, George Stevens, and William Wyler; baseball stars Ted Williams, Joe DiMaggio, and Hank Greenberg; and boxing greats Joe Louis and Gene Tunney.
In other words, the United States waged World War II with a citizen army that reflected the reigning precepts of American democracy (not least of all in its adherence to Jim Crow practices). Never again would U.S. forces reflect comparable diversity. Never again would they demonstrate comparable levels of overall effectiveness.
Service exacted sacrifice. Patterson’s Axiom applied across the board. Among the four hundred thousand American lives claimed by World War II were nineteen players from the National Football League. Glenn Miller, America’s most popular bandleader, was killed while serving with the U.S. Army Air Forces. Harvard University contributed its share. Inscribed on one wall of the university’s Memorial Church are the names of 453 Harvard men who died in World War II—just 35 fewer than the total number of West Pointers lost. Harvard’s dead included four members of the university faculty and the nation’s commander in chief (class of 1904).
The citizen-army’s strengths and limitations as a fighting force reflected—and affirmed—the civil-military contract forged for the duration, the essence of which was a widely shared determination “to get the goddam thing over and get home,” the sooner the better. According to the novelist James Gould Cozzens, a World War II veteran, the average soldier lost little sleep contemplating the question “why we fight.” Only a single definition of purpose “carried or ever could carry any weight with him.”
His war aim was to get out as soon as possible and go home. This didn’t mean that he wouldn’t fight—on the contrary. Brought within fighting distance of the enemy, he saw well enough that until those people over there were all killed or frightened into quitting, he would never get home. He did not need to know about their bad acts and wicked principles. Compared to the offense they now committed . . . by shooting at him and keeping him here, any alleged atrocities of theirs, any evil schemes of their commanders, were mere trifles. [James Gould Cozzens, Guard of Honor (New York, 1948), p. 275.]
Home signified homely satisfactions. “Your ordinary, plain, garden-variety GI Joe,” wrote Richard Lingeman in his popular history of the war, “was fighting for the smell of fried chicken, or a stack of Dinah Shore records on the phonograph, or the right to throw pop bottles at the umpire at Ebbets Field.” [Richard R. Lingeman, Don’t You Know There’s a War On? (New York, 1970), p. 208.] Or as the journalist James Wechsler put it, throughout World War II, “the American soldier—happily—always remained a civilian. His vision of the brave new world was hardly as luminous as that of editorial writers. He wanted merely security and peace and a chance to go back where he came from. . . . In a word, status quo ante, with trimmings.”
Such mundane aspirations did not imply a grant of authority allowing Roosevelt to expend American lives with abandon. Indeed, for FDR to assume otherwise would have placed his bargain with the American people at risk. Fortunately, circumstances did not require that the president do so. More fortunately still, he and his advisers understood that.
The outcome of World War II turned, above all, on two factors: in Europe, the prowess and durability of the Red Army; in the Pacific, the weakness and vulnerability of the Japanese economy. To hit the perfect strategic sweet spot—winning big without losing too much—required the United States to exploit both of these factors. This Roosevelt ably succeeded in doing.
Success entailed making the most of America’s comparative advantage in the production of war-essential matériel. Whatever the category—coal, oil, steel, foodstuffs, or finished goods like ships, tanks, and aircraft—no other belligerent could match the United States in productive capacity. Moreover, the American “arsenal of democracy”—difficult to attack and impossible to conquer—lay beyond the effective reach of Axis forces. Not long after Pearl Harbor, the army chief of staff, General George C. Marshall, announced, “We are determined before the sun sets on this terrible struggle that our flag will be recognized throughout the world as a symbol of freedom on the one hand and of overwhelming power on the other.” Tapping that arsenal for all it was worth held the key to fulfilling Marshall’s vision, which was also Roosevelt’s.
The essential task was to expedite the conversion of U.S. economic might into Allied killing capacity. On that score, in the eyes of America’s senior war managers, Soviet fighting power represented an asset of incalculable value. In Washington, Winston Churchill’s speeches about the common heritage of the “English-speaking peoples,” however inspiring, mattered less than did the Red Army’s manifest ability to absorb and inflict punishment. “A democracy,” Marshall later remarked, “cannot fight a Seven Years War.” When it came to waging total war, totalitarian dictatorships did not labor under comparable limitations. The people of the Soviet Union would fight as long as their supreme leader, Joseph Stalin, obliged them to do so.
With France defeated and the British empire short of will and wherewithal, the president looked to the Red Army to destroy the mighty Wehrmacht. “The whole question of whether we win or lose the war depends on the Russians,” he told Treasury Secretary Henry Morgenthau in June 1942. That same year Admiral Ernest King, chief of naval operations, assured reporters in an off-the-record briefing that “Russia will do nine-tenths of the job of defeating Germany.”
Getting the Russians to shoulder the burden of defeating America’s most dangerous adversary promised both to ensure support for the war effort on the home front and to position the United States to become victory’s principal beneficiary. “The American people will not countenance a long war of attrition,” the Pentagon’s Joint War Plans Committee had warned in 1943. A long war of attrition fought by the Soviet Union was altogether another matter, however. For Washington, providing Stalin with whatever the Soviet Union needed to stay in the fight (while easing any doubts the Soviet dictator might entertain about America’s commitment to the cause) constituted not only a strategic priority but also a domestic political imperative.
To appreciate the implications of this arrangement—the Soviets doing most of the fighting while drawing freely on the endless bounty of American farms and factories—consider casualty statistics. At just above four hundred thousand, U.S. military deaths for the period 1941-45 were hardly trivial. Yet compared to the losses suffered by the other major belligerents, the United States emerged from the war largely unscathed. Estimates of Soviet battle losses, for example, range between eleven and thirteen million.* Add civilian deaths—ten million or more in the Soviet Union, a mere handful in the United States—and the disparity becomes that much greater. To ascribe this to the fortunes of war is to deny Roosevelt credit that is rightly his.
* Williamson Murray and Allan R. Millett, A War to Be Won: Fighting the Second World War (Cambridge, Mass., 2000), p. 558.
The U.S. approach to waging war against the Japanese empire offered a variation on the same theme. With opportunities for outsourcing that war less available (and less desired), the United States shouldered the principal responsibility for defeating a Japan that was as resource poor as the United States was resource rich. When it came to industrial capacity, Japan was a comparative pygmy, its economy approximately one-tenth as large as the American leviathan. In 1941, Japan accounted for 3.5 percent of global manufacturing output, the United States 32.5 percent. At the outset of hostilities, Japan was producing 5.8 million tons of steel and 53.7 million tons of coal annually. For the United States, the comparable figures were 28.8 million and 354.5 million. As the war progressed, this gap only widened. The submarines that decimated Japan’s merchant fleet and the bombers incinerating its cities brought the economy to its knees.
“In any week of her war with Germany between June 1941 and May 1945,” writes the historian H. P. Willmott, succinctly expressing the genius of U.S. grand strategy, “the Soviet Union lost more dead than the total American fatalities in the Pacific war.”* Many factors account for that disproportion, but among them were calculated choices made by FDR and his principal advisers: give the Russians whatever they needed to kill and be killed fighting Germans; engage the Wehrmacht directly in large-scale ground combat only after it had been badly weakened; and fight the Japanese on terms that played to American advantages, expending matériel on a vast scale in order to husband lives.
* H. P. Willmott, The Second World War in the Far East (Washington, D.C., 1999), p. 128.
“Our standard of living in peace,” General Marshall had declared in September 1939, “is in reality the criterion of our ability to kill and destroy in war,” adding that “present-day warfare is simply mass killing and mass destruction by means of machines resulting from mass production.” The unspoken corollary was this: the mass production of machines to wage war could enhance the American standard of living in the peace to follow. A preference for expending machines rather than men could—and did—produce strikingly positive effects on the home front.
Even today, the numbers remain startling. While a conflict of unprecedented scope and ferocity was devastating most of Eurasia, the United States enjoyed a sustained economic boom. Between 1939 and 1944, the nation’s gross domestic product grew by 52 percent in constant dollars. Manufacturing output trebled. Despite rationing—inconvenience packaged as deprivation—consumer spending actually increased. [Paul A. C. Koistinen, Arsenal of World War II: The Political Economy of American Warfare, 1940-1945 (Lawrence, Kans., 2004), p. 498.]
More remarkable still, the benefits of this suddenly restored prosperity were broadly distributed. To be sure, the rich became richer, with the wartime pretax income of the top quintile of earners increasing by 55.7 percent. Yet the nonrich also benefited and disproportionately so. Families in the lowest quintile saw their incomes grow by 111.5 percent, in the second lowest by 116 percent.* Between 1939 and 1944, the share of wealth held by the richest 5 percent of Americans actually fell, from 23.7 percent to 16.8 percent. [Richard Polenberg, War and Society: The United States, 1941-1945 (Philadelphia, 1972), p. 94.] The war that exhausted other belligerents and left untold millions in want around the world found Americans becoming not only wealthier but also more equal.
Notably, all of this happened despite (or because of) increased taxation. Throughout the war, tax policy remained a contentious issue. Overall, however, Americans paid more, and more Americans paid. Between 1940 and 1942, the corporate tax rate went from 24 to 40 percent, with an additional proviso taxing “excess” profits at 95 percent. Tax rates on individual income became more progressive even as larger numbers of wage earners were included in the system. In 1941, approximately 7 percent of Americans paid federal income taxes; by 1944, that figure had mushroomed to 64 percent. No one proposed that wartime might offer a suitable occasion for cutting taxes.
None of this is to imply that World War II was a “good war,” either on the fighting fronts or at home. If anything, the war stoked deep-seated prejudices and provided an outlet for modern-day pathologies. Race riots rocked major American cities. Bitter strikes paralyzed critical industries. Prostitution flourished. Unwanted pregnancies and sexually transmitted diseases proliferated. Social dislocation produced increases in juvenile delinquency. To this day, the mass incarceration of Japanese Americans remains a deeply embarrassing stain on President Roosevelt’s record.
Yet if not good, Roosevelt’s war was surely successful. If the essential objective of statecraft is to increase relative power, thereby enhancing a nation-state’s ability to provide for the well-being of its citizens, then U.S. policy during World War II qualifies as nothing less than brilliant. Through cunning and foresight, Roosevelt and his lieutenants secured for the United States a position of global pre-eminence while insulating the American people from the worst consequences of the worst war in history. If World War II did not deliver something for nothing, it did produce abundant rewards for much less than might have been expected.
Furthermore, the collaboration forged between government and governed yielded more than victory abroad. At home, it dramatically enhanced the standing of the former while reinvigorating the latter. The Great Depression had undermined the legitimacy of the American political system, prompting doubts about the viability of democratic capitalism. World War II restored that lost legitimacy with interest. As a people, Americans emerged from the war reassured that prosperity was indeed their birthright and eager to cash in on all that a fully restored American dream promised. Thanks to FDR’s masterly handling of strategy, those gains came at a decidedly affordable price. War waged by the people had produced battlefield success and much more besides.
* Harold Vatter, The U.S. Economy in World War II (New York, 1985), p. 143.
pp. 28-35: After September 11, 2001, when George W. Bush inaugurated the Global War on Terrorism, he saw another such victory ahead, one that would again refurbish and restore the nation’s sense of purpose. “This time of adversity,” the president declared in his 2002 State of the Union Address, “offers a unique moment of opportunity, a moment we must seize to change our culture.” With the Afghan War seemingly all but won and an invasion of Iraq in the offing, Bush laid out his vision of renewal. “For too long,” he lamented, “our culture has said, ‘If it feels good, do it.’” No more, however. With the advent of global war, Americans were finding inspiration in heroic new role models, the president believed. The implications promised to be transformative. “Now America is embracing a new ethic and a new creed: ‘Let’s roll.’ In the sacrifice of soldiers, the fierce brotherhood of firefighters, and the bravery and generosity of ordinary citizens, we have glimpsed what a new culture of responsibility could look like . . . a Nation that serves goals larger than self.”
No such transformation ensued. Indeed, the way President Bush chose to wage his war ensured a contrary result. If anything, the war on terror, stretching across more than a decade, served to mask a preexisting cultural crisis while setting the stage for large-scale economic calamity. In stark contrast to the Civil War and World War II, it depleted the nation’s stores of moral capital, leaving in its wake cynicism and malaise along with chronic dysfunction. It impelled the country on a downward, not an upward, trajectory.
Embarking upon what he himself unfailingly described as an enterprise of vast historic significance, Bush wasted no time in excluding the American people from any real involvement. Choosing war, he governed as if there were no war.
“We have suffered great loss,” the president acknowledged in a nationally televised address shortly after 9/11. “And in our grief and anger we have found our mission and our moment . . . The advance of human freedom . . . now depends on us. Our nation, this generation, will lift the dark threat of violence from our people and our future. We will rally the world to this cause by our efforts, by our courage. We will not tire, we will not falter and we will not fail.”
But who exactly was this we? To whom was the president referring in his repeated and fervent use of the first-person plural?*
* Wendell Berry first posed this crucially important question in March 2003. See his “A Citizen’s Response to the National Security Strategy of the United States of America,” http://www.quietspaces.com/wendellberry.html
It soon became apparent that Bush’s understanding of we differed substantially from Abraham Lincoln’s “we here highly resolve” at Gettysburg. It differed more drastically still from FDR’s in the post-Pearl Harbor declaration: “We are now in this war. We are all in it—all the way.”
Bush did not intend his we to be taken literally. It was nothing more than a rhetorical device, a vehicle for posturing. Minimizing collective inconvenience rather than requiring collective commitment became the distinctive signature of his approach to war management.
From the very outset, Bush made it clear that he wanted members of the public to carry on as before. After all, to suspend the pursuit of individual happiness (defined in practice as frantic consumption) was to hand the terrorists a “victory.” So within three weeks of the 9/11 attacks, the president was urging his fellow citizens to “enjoy America’s great destination spots. Get down to Disney World in Florida. Take your families and enjoy life, the way we want it to be enjoyed.” To facilitate such excursions, the president persuaded Congress to cut taxes, a 2003 tax relief measure coming on top of one that he had already signed into law prior to 9/11.
In effect, George W. Bush inverted the stern inaugural charge issued by John F. Kennedy in 1961: “Ask not what your country can do for you.” After 9/11, citizens had no need to ask. The Bush administration sought to anticipate their desires. To purchase support for or acquiescence in his global war (and the invasion and occupation of two countries in the Greater Middle East), the administration, with congressional approval, distributed bonuses at home.
Americans had little difficulty interpreting the president’s prompts. In short order, the we called upon to advance the cause of human freedom took a backseat to the we called upon to enjoy life, whether in Disney World or elsewhere. Thus encouraged, Americans disengaged from Bush’s war, leaving to others the task of waging it.
THE THREE NO’S
Senior military and civilian officials who managed World War II had viewed public support for the war effort as both critical and finite, an essential asset to be carefully nurtured and no less carefully expended. Throughout the war years, concern that citizens might balk at marching orders not to their liking remained omnipresent. Hence the pervasive propaganda aimed at sustaining morale on the home front while painting a bright picture of all that peace promised to bring in its wake. Hence, too, the determination of Pentagon planners to avoid asking of Americans more than they were willing to give.
After 9/11, the Bush administration freed itself of any such concerns. It did so by reformulating the allotted wartime role of the public. “We’re at war,” President Bush told his vice president on the morning of the attacks, and “someone’s going to pay.” What soon became clear was that the president’s definition of someone did not include the citizens of the United States.
In the immediate aftermath of 9/11, “United We Stand” held sway as something akin to a national slogan, expressing shared hurt, anger, and determination. Not for long, however. Within a matter of months, although nominally “at war,” the nation began behaving as if it were “at peace.” Americans had by then settled on three first-person-plural axioms to describe the unofficial but inviolable parameters of their prospective wartime role.
• First, we will not change.
• Second, we will not pay.
• Third, we will not bleed.
According to the first postulate, Americans, heeding their president, refused to permit war to exact demands. Instead, they remained intent on pursuing their chosen conceptions of life, liberty, and happiness, unhindered and unencumbered. They would accept no reordering of national priorities intended to facilitate the war’s prosecution.
According to the second postulate, Americans had no responsibility to cover the financial costs entailed by war’s conduct. The books need not balance. Increases in military expenditures, therefore, required neither increased revenue nor a willingness to accept reduced services. Choosing between guns and butter was neither necessary nor acceptable. To fund war, the government simply borrowed.
According to the third postulate, actual participation in war became entirely a matter of personal choice. Service (and therefore sacrifice) was purely voluntary. War no longer imposed collective civic duty—other than the necessity of signaling appreciation for those choosing to serve.
As long as it abided by these proscriptions, Washington could pretty much make war whenever, wherever, and however it wanted, assured of at least tepid popular consent. In this decoupling of the people from war waged in their name lay the Bush administration’s most notable post-9/11 accomplishment. In place of a Lockean social contract based on the concept of reciprocal responsibility, a promissory note now provided the basis for waging war—and the people who so casually endorsed that note had no expectation of ever having to settle accounts.
As a consequence, war became exclusively the province of the state rather than the country as a whole. Invited to indulge in cheap grace, Americans willingly complied. Virtually from the outset, George W. Bush’s Global War on Terrorism was never America’s war in the sense that Lincoln’s war and FDR’s war had been. It was—and at least in some quarters was intended to be—Washington’s war.
To appreciate this distinction, one need only note the gap between the label Washington affixed to its war and the war’s actual conduct as it unfolded. To describe the conflict as a Global War on Terrorism obfuscated existing realities. Neither global in scope nor directed exclusively against terrorists, it was both far less and much more than its name implied. According to President Bush, the events of September 11, 2001, coming out of nowhere, inaugurated the conflict. More accurately, the 9/11 attacks intensified a struggle that had been ongoing for decades. At issue most immediately was the fate of a specific region: who would determine the future of the oil-rich, strategically critical Greater Middle East?
- pp. 7-14 (Introduction): When war claims a soldier’s life, what does that death signify? Almost reflexively, Americans want to believe that those making the supreme sacrifice thereby advance the cause of freedom. Since freedom by common consent qualifies as the ultimate American value, death ennobles the fallen soldier.
Yet sometimes nobility is difficult to discern and the significance of a particular death proves elusive. Consider the case of Captain William F. Reichert, shot and killed on January 27, 1971, at An Khe in the Republic of Vietnam. Captain Reichert did not fall in battle. He was assassinated. His assassin was an American soldier.
Age twenty-three, unmarried, and a graduate of West Point’s Class of 1968, Reichert was at the time commanding Troop C, First Squadron, Tenth Cavalry. As it happened, I was also stationed at An Khe then, serving as a platoon leader in Troop D.
Despite an impressive lineage, by the time I arrived, the First Squadron, Tenth Cavalry (“Buffalo Soldiers”) rated as something other than a “crack” outfit. By the winter of 1970-71, the dwindling American order of battle in Vietnam boasted few crack outfits. The U.S. Army was heading toward the exits, and those units that remained made for a motley collection.
Higher headquarters had assigned One-Ten Cav the mission of securing a long stretch of highway running west from the coastal city of Qui Nhon through the Central Highlands and on to Pleiku. The squadron’s area of operations included the Mang Yang Pass, where in 1954 the Vietminh had obliterated the French army’s Groupement Mobile 100, thereby ringing down the curtain on the First Indochina War.
No such replay of the Little Bighorn punctuated my own tour of duty. Indeed, the operative idea—widely understood even if unwritten—was to avoid apocalyptic encounters so that the ongoing drawdown could continue. As long as the withdrawal of U.S. forces proceeded on schedule, authorities in Washington could sustain the pretense that the Second Indochina War was ending in something other than failure.
One-Ten Cav had been allotted little more than a bit part in this elaborate production. Keeping that highway open allowed daily supply convoys to move food, fuel, ammunition, and other essentials to Pleiku and points beyond. To accomplish this mundane task, Buffalo Soldiers in armored vehicles guarded bridges or reacted to enemy ambushes. Others, in helicopters or on foot, conducted reconnaissance patrols, flying above or trudging through the jungle. The assignment offered little by way of glory or grandeur, both of which were then, in any case, in short supply throughout South Vietnam. That late in the war, navigating between honor and dishonor, foolhardy courage and craven cowardice, necessary subordination and mindless obedience posed challenges. It was not a happy time or place to be an American soldier.
Yet if the squadron did not literally share G. M. 100’s fate, it was succumbing incrementally to a defeat that was hardly less decisive. As any home owner will tell you, a leaky roof, if left unattended, can pose as much danger as a category five hurricane. Collapse is just a longer time coming. In the backwater that was An Khe, the roof was leaking like a sieve.
No one was likely to mistake the United States in 1971 for a land of concord and contentment. During the interval between the assassination of John F. Kennedy and the election of Richard M. Nixon, cleavages dividing left and right, black and white, flag burners and flag wavers, college kids and working stiffs had become particularly acute. Looming in the background was an even more fundamental cleavage between state and country. Depending on which camp you occupied, the government appeared either clueless or gutless. In any case, those exercising political authority no longer commanded the respect or deference they had enjoyed during the 1940s and 1950s. Sullen citizens eyed their government with cynicism and mistrust.
Comparable division and discord pervaded the ranks of those sent to serve in Vietnam. In the war zone, the animosity between the governing and the governed at home found its parallel in the relationship between leaders and led. In Vietnam, sullen enlisted soldiers—predominantly draftees—eyed their officers with cynicism and mistrust.
To vent their anger at policies not to their liking, outraged citizens engaged in acts of protest. To express their animus toward leaders not to their liking, alienated soldiers did likewise, their acts of protest ranging from disrespect to shirking to out-and-out insubordination. (The army’s unofficial motto had by then become “don’t mean nothin’,” usually muttered sotto voce at the back of some annoying superior.)
On January 27, 1971, Private First Class James D. Moyler, a twenty-year-old helicopter crewman from Chesapeake, Virginia, carried matters further. After exchanging words over allegations of barracks theft, the black soldier flipped the safety off his M16 and in broad daylight shot C Troop’s white commander at point-blank range. Captain Reichert bled to death in front of his own orderly room.
With the military justice system promptly cranking into high gear, Moyler was quickly arrested, jailed, charged, court-martialed, convicted, and sentenced to a long prison term. In the blink of an eye, he disappeared. From an institutional perspective, so too did the entire episode. In Saigon and Washington, those presiding over the war had no intention of allowing the death of Captain Reichert to affect their plans.
So in its sole report on the incident, the Pacific edition of Stars and Stripes offered the barest recitation of the facts—a masterful exercise in journalistic minimalism. As if in passing, however, the newspaper hinted at a larger context. Earlier that same month in Quang Tri Province, Stripes noted, one officer had been killed and another wounded “following a quarrel with enlisted men.” Meanwhile, at Tan Son Nhut Air Base outside Saigon, someone had rolled a fragmentation grenade into the quarters of a military police officer, wounding him as he slept. Again, enlisted soldiers were suspected of perpetrating the attack, “although no one ha[d] been charged.” [“G. I. Charged with CO Slaying,” Pacific Stars & Stripes, February 2, 1971.]
In other words, what had occurred at An Khe, however shocking, did not qualify as particularly unusual. Disgruntled soldiers obliged to fight a war in which they (along with most of the country) had ceased to believe were not without recourse. Among the options available was the one PFC Moyler had chosen, turning weapons intended for use against the enemy on those whose authority they no longer recognized.
The implications of Moyler’s actions were, in military terms, beyond alarming. To sustain a massively unpopular war, the state had resorted to coercive means: report for duty or go to jail. At home, clever young men had become adept at evading that choice and so the war itself. Those less clever or more compliant ended up in uniform and in Vietnam. There, the nominally willing—now armed—were having second thoughts. In increasing numbers, they not only refused to comply but were engaging in acts of resistance.
The problem was Vietnam, of course. But the war had become inextricably tied to conscription. To save itself, the army desperately needed to free itself of the war—and of those compelled to serve against their will. Allowed to spread unchecked, the poisons made manifest at An Khe posed an existential threat to the institution as a whole. Even to a subaltern as callow and obtuse as I was, that much was apparent.
That other, unforeseen consequences might also ensue, unfavorable to the army, to soldiers, and to the country, did not occur to me. All that mattered then was to escape from an unendurable predicament. If that meant putting some distance between the army and the American people, so be it.
In the years that followed, the army effected that escape, shedding the war, the draft, and the tradition of a citizen-based military. Henceforth, the nation would rely on an all-volunteer force, the basis for a military system designed to preclude the recurrence of anything remotely resembling Vietnam ever again. For a time, Americans persuaded themselves that this professional military was a genuine bargain. Providing fighting forces of unrivaled capabilities, it seemingly offered assured, affordable security. It imposed few burdens. It posed no dangers.
In relieving ordinary citizens of any obligation to contribute to the country’s defense, the arrangement also served, for a time at least, the interests of the military itself. In the eyes of their countrymen, those choosing to serve came to enjoy respect and high regard. Respect translated into generous support. Among the nation’s budgetary priorities, the troops came first. Whatever the Pentagon said they needed, Washington made sure they got.
As a consequence, the army that I left in the early 1990s bore no more resemblance to the one into which I had been commissioned than a late model Ferrari might to a rusted-out Model T. The soldiers wanted to soldier. NCOs knew how to lead, and smart officers allowed them to do so. Given such a plethora of talent, even a mediocre commander could look good. As for an unofficial motto, the members of this self-consciously professional army were inexplicably given to shouting “Hooah” in chorus, exuding a confidence that went beyond cockiness.
Here, it appeared, was a win-win proposition. That the all-volunteer force was good for the country and equally good for those charged with responsibility for the country’s defense seemed self-evident. Through the twilight years of the Cold War and in its immediate aftermath, I myself subscribed to that view.
Yet appearances deceived, or at least told only half the story. Arrangements that proved suitable as long as deterring the Soviet threat remained the U.S. military’s principal mission and memories of jungles and rice paddies stayed fresh proved much less so once the Soviet empire collapsed and the lessons of Operation Desert Storm displaced the lessons of Vietnam. With change came new ambitions and expectations.
For a democracy committed to being a great military power, its leaders professing to believe that war can serve transcendent purposes, the allocation of responsibility for war qualifies as a matter of profound importance. Properly directed — on this, President George W. Bush entertained not the least doubt — a great army enables a great democracy to fulfill its ultimate mission. “Every nation,” he declared in 2003, “has learned an important lesson,” one that events since 9/11 had driven home. “Freedom is worth fighting for, dying for, and standing for—and the advance of freedom leads to peace.” Yet the phrasing of Bush’s formulation, binding together war, peace, and freedom, might have left a careful listener wondering: Who fights? Who dies? Who stands? The answers to this triad of questions impart to democracy much of its substantive meaning. [At least two additional questions also figure in determining the content of democracy. What is the operative meaning of freedom? And to whom are the privileges of freedom permitted?]
In the wake of Vietnam, seeking to put that catastrophic war behind them, the American people had devised (or accepted) a single crisp answer for all three questions: not us. Except as spectators, Americans abrogated any further responsibility for war in all of its aspects. With the people opting out, war became the exclusive province of the state. Washington could do what it wanted—and it did.
In the wake of 9/11, as America’s self-described warriors embarked upon what U.S. leaders referred to as a Global War on Terrorism, the bills came due. A civil-military relationship founded on the principle that a few fight while the rest watch turned out to be a lose-lose proposition—bad for the country and worse yet for the military itself.
Rather than offering an antidote to problems, the military system centered on the all-volunteer force bred and exacerbated them. It underwrote recklessness in the formulation of policy and thereby resulted in needless, costly, and ill-managed wars. At home, the perpetuation of this system violated simple standards of fairness and undermined authentic democratic practice.
The way a nation wages war—the role allotted to the people in defending the country and the purposes for which it fights—testifies to the actual character of its political system. Designed to serve as an instrument of global interventionism (or imperial policing), America’s professional army has proven to be astonishingly durable, if also astonishingly expensive. Yet when dispatched to Iraq and Afghanistan, it has proven incapable of winning. With victory beyond reach, the ostensible imperatives of U.S. security have consigned the nation’s warrior elite to something akin to perpetual war.
Confronted with this fact, Americans shrug. Anyone teaching on a college campus today has experienced this firsthand: for the rising generation of citizens, war has become the new normal, a fact they accept as readily as they accept instruction in how to position themselves for admission to law school.
The approach this nation has taken to waging war since Vietnam (absolving the people from meaningful involvement), along with the way it organizes its army (relying on professionals), has altered the relationship between the military and society in ways that too few Americans seem willing to acknowledge. Since 9/11, that relationship has been heavy on symbolism and light on substance, with assurances of admiration for soldiers displacing serious consideration of what they are sent to do or what consequences ensue. In all the ways that actually matter, that relationship has almost ceased to exist.
From pulpit and podium, at concerts and sporting events, expressions of warmth and affection shower down on the troops. Yet when those wielding power in Washington subject soldiers to serial abuse, Americans acquiesce. When the state heedlessly and callously exploits those same troops, the people avert their gaze. Maintaining a pretense of caring about soldiers, state and society actually collaborate in betraying them.
This book subjects the present-day American military system to critical examination. It explains just how we got into the mess we’re in. It shows who benefits and who suffers as a consequence. By way of remedy, it proposes that defending the country once more become a collective responsibility, inherent in citizenship.
pp. 42-43: The crux of the problem lay with two symmetrical one-percents: the 1 percent whose members get sent to fight seemingly endless wars and that other 1 percent whose members demonstrate such a knack for enriching themselves in “wartime.” Needless to say, the two one-percents neither intersect nor overlap. Few of the very rich send their sons or daughters to fight. Few of those leaving the military’s ranks find their way into the ranks of the plutocracy . . .
. . . a people who permit war to be waged in their name while offloading onto a tiny minority responsibility for its actual conduct have no cause to complain about an equally small minority milking the system for all it’s worth. Crudely put, if the very rich are engaged in ruthlessly exploiting the 99 percent who are not, their actions are analogous to that of American society as a whole in its treatment of soldiers: the 99 percent who do not serve in uniform just as ruthlessly exploit the 1 percent who do.
BETWEEN WAR AND PEACE: How America Ends Its Wars (Edited by Col. Matthew Moten); pp. 302-322 [“The United States in Iraq: Terminating an Interminable War” by ANDREW J. BACEVICH]:
The senior U.S. field commander knew it was over before it was over. The forces under his command had accomplished what they had been sent to do. The imperative was now to wind down the war, promptly and neatly. Avoiding unnecessary bloodshed had become a priority. So too was upholding the warrior’s code of honor, demonstrating that the troops under his command were not only brave but also humane. Given all that had been accomplished on the battlefield, the task of bringing Operation Desert Storm to an end did not appear to be particularly difficult. The heavy lifting was done.
So when Gen. H. Norman Schwarzkopf, commanding U.S. Central Command (CENTCOM) and all the coalition forces assembled to liberate Iraqi-occupied Kuwait, appeared before the press on February 27, 1991, in Riyadh, Saudi Arabia, his mood was ebullient. In assessing the situation that evening (early afternoon Washington time) Schwarzkopf turned in a boffo performance. Quickly enshrined as “the mother of all briefings,” it completed the gruff general’s transformation into Stormin’ Norman and vaulted him, however briefly, into the uppermost ranks of global celebrity.
Yet Schwarzkopf’s purpose was less to offer a progress report than to seize the historical initiative. In effect he aimed to do in the course of an hour-long televised presentation what Winston Churchill had accomplished over the course of his six-volume memoir of the Second World War: to lay down an authoritative firsthand account with the intention of establishing an interpretive framework to which others thereafter would adhere.
The overarching theme of Churchill’s work had been one of tragedy: out of a war that foresight and resolve might have averted came not enduring peace but continuing and potentially even more dangerous conflict. The overarching theme of Schwarzkopf’s hastily prepared presentation differed; simply put, he used the occasion to declare victory.
Against tall odds, the outnumbered U.S. forces, assisted by loyal allies, had executed an epic feat of arms. Outgeneraled and outfought, the Iraqi army had all but ceased to exist. What Schwarzkopf described as a “classic tank battle” had left 3,700 of the enemy’s 4,000 tanks hors de combat. Over the course of a mere four days the troops under his command had “almost completely destroyed the offensive capability of the Iraqi forces in the Kuwait theater of operations.” The surviving remnant, pursued on the ground and pummeled from the air, was doomed. There was no escape. “The gates are closed.” For the coalition, the road to Baghdad lay open. Yet Schwarzkopf evinced neither the intention nor the desire to march on the Iraqi capital. Although fighting continued, he considered his task complete. “We’ve accomplished our mission,” he concluded, “and when the decision makers come to the decision that there should be a cease-fire, nobody will be happier than me.”
The views expressed by the field commander in Riyadh meshed with and reinforced those gaining currency back in Washington. In the desert, things had gone much better than expected. Senior U.S. officials, civilian and military alike, felt little inclination to press their luck. Better to stop now and cash in their winnings, which promised to be considerable. Besides, to continue clobbering an already beaten foe might give the wrong impression. Concern for appearances and reputations was eclipsing serious strategic analysis.
A call to CENTCOM headquarters earlier that same day from Gen. Colin Powell, chairman of the Joint Chiefs of Staff and an officer of acute political sensitivity, signaled which way the winds were blowing at home. “The doves are starting to complain about all the damage you’re doing,” Powell told Schwarzkopf. Images of mangled Iraqi trucks and burning armored vehicles piled up on the main road leading from Kuwait City back toward Iraq, dubbed by the press “the Highway of Death,” were causing unease. “The reports make it look like wanton killing.” [H. Norman Schwarzkopf, It Doesn’t Take a Hero (New York, 1992), 468.] Perhaps, Powell suggested, it was time to bring things to a halt.
Schwarzkopf’s initial inclination was to continue the pursuit for another day. “[I want to] drive to the sea, and totally destroy everything in our path. That’s the way I wrote the plan,” he told the JCS chairman, “and in one more day we’ll be done.”
Yet when pressed by Powell, Schwarzkopf quickly gave way. With Kuwait liberated and the Iraqi army no longer an effective fighting force, the time had come to think about the history books. When it came to winning a decisive victory, the Israeli Six-Day War of 1967 represented the reigning gold standard. By ending the fight on February 28, U.S. forces could outdo the Israelis by a day, Operation Desert Storm becoming the Five-Day War. Both Powell and Schwarzkopf (ignoring the several weeks of bombing that had preceded the launch of ground operations) found this an appealing prospect.
A short time later in the Oval Office the JCS chairman duly relayed Schwarzkopf’s assessment to the commander in chief. “Mr. President, it’s going much better than we expected. The Iraqi army is broken. All they’re trying to do now is get out,” he said, adding, “By sometime tomorrow the job will be done.”
“If that’s the case,” President George H. W. Bush wondered aloud, “why not end it today?” Caught off guard by the president’s suggestion, Powell said he needed once more to consult the field commander. Ducking into the president’s study, he placed another call to Riyadh. Schwarzkopf needed little persuading. “I don’t have any problem,” he replied when briefed on the president’s inclination to end the fighting forthwith. “Our objective was to drive ‘em out and we’ve done it.” Although Schwarzkopf wanted to check with his own chief subordinates, he fully expected them to concur.
Schwarzkopf’s views proved decisive. At 6:00 p.m. Washington time, after a final round of discussion with his advisors, Bush rendered his decision. In a 9:00 p.m. televised address from the Oval Office he would announce a cessation of hostilities. There was just one wrinkle: rather than declaring an immediate termination of combat operations, the president would designate 12:01 a.m. as the endpoint of Operation Desert Storm. White House Chief of Staff John Sununu had suggested that “the One Hundred Hour War” had a nice ring to it. Midnight marked exactly 100 hours since ground operations had commenced. Once again playing to the history books nudged other considerations aside.
“Kuwait is liberated,” President Bush told the nation and the world that evening in his televised presentation. “Iraq’s army is defeated. Our military objectives are met.” The president and his administration were ready to move on. “This war is now behind us.”
That judgment proved premature. Even as Operation Desert Storm wound down, complications were beginning to emerge. Expectations of overwhelmingly superior U.S. military power producing a decisive outcome soon proved to be illusory. President Bush himself was among the first to suspect that something might be amiss. “Still no feeling of euphoria,” his diary entry for that night reads. “It hasn’t been a clean end—there is no battleship Missouri surrender. This is what’s missing to make this akin to World War II, to separate Kuwait from Korea and Vietnam.” [George Bush and Brent Scowcroft, A World Transformed (New York, 1998), 486-87.]
At the time, few Americans shared Bush’s sense of unease. Most had bought into his administration’s depiction of the Persian Gulf crisis as a morality tale, replaying the events of Europe from 1939 to 1945. The crisis had ostensibly sprung out of the blue on August 1, 1990, when the Hitler-like figure of Saddam Hussein had invaded an innocent, unassuming neighbor. In defeating Saddam’s legions and liberating Kuwait, U.S. troops (with a bit of allied assistance) had now put things right. End of story.
This reassuring narrative was deeply misleading, however. It failed in at least three respects. First, rather than embodying the problem facing the United States, Saddam merely represented a prominent symptom. Second, rather than suddenly appearing in August 1990, the actual problem—pent-up resentment throughout much of the Islamic world finding expression in anti-Western violence—had been festering for decades. Third, rather than offering an antidote to that problem, the employment of U.S. military might would only serve to make matters worse.
A World War II-style outcome was not in the cards if only because the enterprise in which the United States was engaged in no way resembled World War II. Since the promulgation of the Carter Doctrine in 1980, its declaratory purpose being to prevent a hostile nation from gaining control of the Persian Gulf, Washington had sought to achieve a commanding position in the Gulf and its environs. Power and presence promised to ensure access and stability; whatever the author of the Carter Doctrine may have intended, within a decade this had emerged as the doctrine’s underlying rationale.
At a superficial level Operation Desert Storm seemed to validate this strategy while creating fresh opportunities to exert U.S. influence across the region. Beneath the surface, however, the American-led intervention and its aftermath only served to affirm suspicions that the United States had become simply the latest in a long list of Western powers seeking to impose its will on the Islamic world. Viewed from this perspective, American power and presence, which the Persian Gulf War of 1991 had vividly displayed, served as a rallying cry for jihad.
President Bush’s declaration of an end to hostilities on February 28, 1991, terminated one small war while paving the way for a much larger one. Operation Desert Storm settled nothing of importance. Instead, the One Hundred Hour War served as a precursor and catalyst for what a decade later became known as the Long War.
Why did victory over Saddam yield such perverse results? Why did Operation Desert Storm, briefly celebrated as an epic feat of arms, so quickly lose its luster? Existing answers to these questions reflect two distinctive schools of thought.
According to the first, Operation Desert Storm was a brilliantly conceived and executed military campaign, botched at the very end and thereby leaving unfortunate loose ends. The failures were military in nature, reflecting the errors and inadequacies of very senior military officers.
According to the second school of thought, Operation Desert Storm was a brilliantly conceived and executed military campaign launched in pursuit of the wrong mission. In essence the United States had erroneously planted the goalposts well short of the end zone. From this perspective the failures were political in nature, with the fault laid to very senior civilian officials.
Adherents of the first school spread the blame among three U.S. Army generals, including in their indictment, along with Powell and Schwarzkopf, Lt. Gen. Frederick M. Franks, commander of VII Corps. The charge against Franks—within the officer corps a widely revered figure—is an especially severe one. It amounts to this: he failed to accomplish his assigned mission. As conceived by planners working for Schwarzkopf, the coalition ground offensive was to consist of two essential elements. First, a supporting attack from south to north toward Kuwait City would fix Iraqi forces in place. Second, a wide flanking attack from west to east would envelop the enemy and ensure his defeat in detail. VII Corps—50,000 vehicles and 146,000 soldiers strong—was to execute that flanking attack. More specifically, the war plan assigned Franks the task of destroying the Iraqi Republican Guard, the best-equipped, best-trained, and most formidable element of Saddam Hussein’s otherwise raggedy army.
In the end, the Republican Guard, although badly damaged and put to flight, evaded destruction, major elements fleeing back toward Baghdad. The hit on Franks, one that Schwarzkopf in particular endorsed, was that in a situation calling for dash, the VII Corps commander had exhibited caution. Determined to minimize coalition casualties—the avoidance of fratricide competed with destruction of the enemy as a priority—Franks attacked methodically and deliberately, allowing his quarry to escape. At one point, in response to a reported friendly-fire incident, he ordered a cease-fire throughout the entire corps, a tribute to the general’s humanity but not a decision likely to have found favor with W. T. Sherman or George S. Patton.
In the immediate wake of Operation Desert Storm, with dissident Iraqi Shiites and Kurds (encouraged by President Bush) rising up to overthrow Saddam, the Republican Guard provided the Iraqi dictator with the wherewithal to crush internal opposition to his regime and retain his hold on power. Saddam survived, creating an abiding problem for the United States, a direct result, according to some, of VII Corps having come up short.
The charge against the CENTCOM commander is broader. Simply put, over the four days during which the ground offensive unfolded, Schwarzkopf’s temperament, volcanic in the best of circumstances, became a major source of dysfunction that eventually permeated the senior levels of his command. In a job that required cool, he ran piping hot, cultivating a style of leadership that emphasized bluster, intimidation, and threats of relief.
Whether due to fatigue, pressure, or sheer orneriness, Schwarzkopf erred repeatedly on issues of primary importance, a tendency that became especially evident as Operation Desert Storm wound down. Having overestimated the expected level of enemy resistance, he adjusted only belatedly to evidence that the Iraqis were far weaker than advertised. Overstating the losses his forces had inflicted on the enemy, he concluded prematurely that his work was finished. By declaring publicly, with hostilities still under way, that his forces were not going to Baghdad, he made a great gift to Saddam Hussein, offering authoritative insight into the coalition’s ultimate intentions.
When the White House, with public relations uppermost in mind, proposed to slip the cessation of hostilities by three hours, Schwarzkopf meekly assented, heedless of the implications of this change—trivial as Washington saw things, massive from the perspective of the various troop units scattered across Kuwait and southern Iraq. Furious at discovering that operations were coming to a halt three hours earlier than Washington had announced, he demanded that his commanders resume an all-out offensive for this brief interval. In the field this stop—go—stop again sequence of orders from on high simply generated confusion. Schwarzkopf was acting more like a squad leader than the overall commander of a massive air, ground, and naval coalition. The campaign’s very last act served to showcase Schwarzkopf’s shortcomings. Charged by the White House with negotiating a formal cease-fire, the CENTCOM commander paid more attention to appearance and atmospherics than to substance, with fateful results.
As the site for this event Schwarzkopf selected Safwan, an obscure crossroads in southern Iraq. While President Bush was lamenting the absence of a World War II-style surrender ceremony, his field commander intended to preside over a reasonable facsimile, Safwan standing in for the battleship Missouri, Schwarzkopf casting himself in the role of Douglas MacArthur.
After notifying Washington that he had designated Safwan as the chosen venue, however, Schwarzkopf learned that the Iraqi army still occupied the place; a garbled report had erroneously put it under the control of U.S. forces. This glitch proved too much for Schwarzkopf. By his own account he now “came completely unglued,” with General Franks the specific target of his wrath. “I felt as if I’d been lied to. All of my accumulated frustration and rage with VII Corps came boiling out.”
Moving the cease-fire talks to a different location was out of the question: doing so might cast doubt on Schwarzkopf’s omniscience and stain a campaign that all concerned were eager to portray as flawless. So for the next twenty-four hours nudging the Iraqis out of Safwan without instigating a major bloodletting became CENTCOM priority number one, taking precedence over all other considerations, not least of all any substantive considerations related to war termination.
Schwarzkopf himself had drafted proposed terms of reference for the talks. His own views were quite simple: “Our side had won, so we were in a position to dictate terms.” After some minor wordsmithing, Washington had approved Schwarzkopf’s draft. As he headed toward Safwan on March 3, 1991, the general was in the driver’s seat.
His mandate, as he himself understood it, was “confined to military issues,” above all securing the release of coalition soldiers taken prisoner. When Schwarzkopf arrived for the talks, a reporter shouted a question: What exactly was going to be negotiated? “This isn’t a negotiation,” came Schwarzkopf’s curt reply. “I don’t plan to give them anything. I’m here to tell them exactly what we expect them to do.”
As he sat down across from two hitherto obscure Iraqi generals — Schwarzkopf’s thoughts were focused on donating the furniture to the Smithsonian Institution “in case they ever wanted to re-create the Safwan negotiation [sic] scene” — he wasted little time before straying beyond his mandate and offering his interlocutors generous concessions.
. . . The American general assured his interlocutors that the United States and its allies viewed Iraq’s boundaries as sacrosanct. U.S. forces were going home forthwith. “We have no intention of leaving our forces permanently in Iraqi territory once the cease-fire is signed,” he announced. As the talks proceeded, an atmosphere of stiff formality gave way to a spirit of mutual accommodation. The Iraqis had given Schwarzkopf what he wanted most; he returned the favor. Asked if the Iraqi army might resume use of its helicopters after the cease-fire, Schwarzkopf readily assented. “Given that the Iraqis had agreed to all of our requests, I didn’t feel it was unreasonable to grant one of theirs.” As if to affirm that the crisis triggered by Iraq’s invasion of Kuwait had reached a definitive conclusion, Schwarzkopf concluded his conversation with the Iraqi generals by exchanging salutes and comradely handshakes.
For Saddam Hussein, Schwarzkopf’s loquaciousness and magnanimity came as a welcome if wholly unearned gift. The helicopters alone proved invaluable. At the very moment when Saddam’s hold on power was most precarious, here was another asset employed in his vicious campaign to suppress internal opposition. Yet even more important to the Iraqi dictator was Schwarzkopf’s tacit admission that the United States had no interest in interfering in Iraq’s internal affairs. Safwan assured Saddam that he need not worry about an externally mounted challenge to his continued rule.
Schwarzkopf may have fancied himself reprising MacArthur’s role on V-J Day. In their penchant for theatricality, the two generals bore more than a passing resemblance to each other. Yet even though Japan in 1945 lay utterly prostrate, MacArthur grasped this essential point: politically, much work remained to be done. After surrender was to come occupation and rehabilitation. In his encounter with the defeated foe on the deck of the Missouri, therefore, MacArthur gave away nothing. By comparison, Schwarzkopf at Safwan gave away the farm. In his haste to terminate the war he mistakenly thought he had won decisively, the CENTCOM commander helped ensure the war’s de facto continuation.
Finally there is the charge against Powell. Here matters cross fully from the operational realm into the arena of politics. To characterize Powell as a political general is to acknowledge a convergence of the obvious and the essential, much like calling Marilyn Monroe a sex symbol. Yet Powell’s political concerns were of a very specific sort. In his hierarchy of aims one priority outranked all others: as the U.S. military’s senior serving officer he was absolutely determined that nothing bring into disrepute the institution over which he exercised stewardship. Any policy or action posing a threat to the military’s standing in the eyes of the American people—especially anything smacking of another Vietnam—he opposed. Any policy or action that promised to enhance the military’s collective reputation—especially anything that might help bury the memory of Vietnam—he favored. When it came to war termination, the JCS chairman, his own agenda complementing Schwarzkopf’s, served in effect as the CENTCOM commander’s enabler.
When the Persian Gulf crisis erupted in August 1990, Powell supported the deployment of U.S. forces to defend Saudi Arabia but evinced little enthusiasm for liberating Kuwait by force, preferring instead to rely on economic sanctions to pry Saddam Hussein out. [Bob Woodward, The Commanders (New York, 1991). Powell’s reluctance to use force is a recurring theme of this book, which recounts the Persian Gulf crisis up to the eve of Operation Desert Storm.] Although Powell lost that argument, he remained intent on doing everything possible to preclude U.S. forces from being drawn into anything remotely resembling a quagmire.
The design parameters informing Operation Desert Storm reflected Powell’s own preferences for how the United States ought to wage war. Risk avoidance was a priority. The mission therefore was specific, concrete, and narrowly drawn, with Powell’s own statement of purpose a model of economy. “Our strategy to go after this army is very, very simple,” he announced at a press conference. “First we’re going to cut it off, and then we’re going to kill it.” In pursuit of that aim the United States assembled a broad allied coalition (no going it alone) and deployed a combat force of overwhelming strength (calling up citizen reserves ensured the country’s commitment to what was to come). Deliberation rather than daring defined the spirit of the enterprise, as illustrated by the weeks of bombing that preceded the launch of the ground offensive. Once hostilities had commenced, Powell worked hard to guarantee Schwarzkopf complete freedom of action, insulating him from the sort of meddling by high-ranking civilian officials that had ostensibly made such a hash of Vietnam.
Here was the real essence of what came to be called the Powell Doctrine: minimizing uncertainty by employing maximum force and allowing commanders in the field to exercise broad autonomy, and by implication subordinating political considerations to operational ones. In the context of Desert Storm this meant, among other things, calling it quits at the first available moment.
All of these views put Powell in Schwarzkopf’s corner when it came to hastening an end to hostilities. Although aware that coalition forces had not in fact “killed” the Iraqi army, the JCS chairman nonetheless urged a prompt and unilateral end to hostilities. “There was no need to fight a battle of annihilation,” he argued. Although Powell himself believed “that Saddam would likely survive the war,” that was no reason to prolong the fighting. Schwarzkopf had shattered the Iraqi defenses and put Saddam’s legions to flight. To give the appearance of piling on was unseemly. “There is,” Powell remarked during the course of one Oval Office meeting, “chivalry in war”—an astonishing statement to make at the end of a blood-soaked century devoid of even an approximation of chivalry.
Largely due to Powell’s efforts, generals, not politicians, determined the precise terms that concluded the Persian Gulf War of 1990-91. President Bush “had promised the American people that Desert Storm would not become a Persian Gulf Vietnam,” Powell wrote in his memoirs, “and he kept his promise.” The JCS chairman contributed mightily to ensuring that outcome. For Powell (and for other members of the officer corps) precluding Desert Storm from becoming another Vietnam ranked as a paramount objective. Based on that criterion, he and Schwarzkopf had collaborated to achieve a rousing success.
Commanders temperamentally ill-suited for the responsibilities to which they were assigned; fog and friction that in this war no less than others bred confusion, obscured reality, and clouded judgment; a pronounced tendency to subordinate strategy to institutional goals; and (in Schwarzkopf’s case) a penchant for grandstanding: these number among the factors accounting for the errors senior military officers committed in bringing the Gulf War to a conclusion.
In the war’s immediate aftermath, however, few of these miscues attracted more than passing attention. If noticed at all, they didn’t seem to matter and certainly didn’t affect the general view that Desert Storm had ended in a historic victory.
Among the vast majority of Americans the Persian Gulf War elicited a euphoric response that allowed little room for skepticism or second thoughts. A compendium of reporting assembled by the editors of Time captured the mood of the moment. Time described Operation Desert Storm as “a drama of dazzling display, brutal crispness and amazingly decisive outcome.” Out of victory had come “a glow of righteousness.” For the United States the brief conflict in the Gulf had produced “a giddy mixture of pride and a renewed sense of the nation’s worth.” Expelling Iraq from Kuwait signaled “the end of the old American depression called the Vietnam syndrome, the compulsion to look for downside and dooms.” It marked “the birth of a new American century—the onset of a unipolar world, with America at the center of it.” The brilliance displayed by Schwarzkopf and his warriors heralded “the apotheosis of warmaking as a brilliant American craft.”
Setting aside his own initial misgivings, President Bush himself signed on to the proposition that something truly profound had occurred in the desert. Asked at a press conference on March 1, 1991, if Operation Desert Storm presaged a new era of U.S. military interventionism, the president demurred: “I think because of what has happened, we won’t have to use U.S. forces around the world. I think when we say something that is objectively correct, like don’t take over a neighbor or you’re going to bear some responsibility, people are going to listen. . . . So, I look at the opposite. I say that what our troops have done over there will not only enhance the peace but reduce the risk that their successors have to go into battle someplace.”
Bush expected Desert Storm to create the foundation for “a new world order”: less violent, more law abiding, a spirit of goodwill supplanting old-fashioned power politics. The triumph over Saddam Hussein, enhancing the authority and influence of the United States, especially in the Muslim world, had created opportunities to take on other problems. Heading the list, the president believed, was the Arab-Israeli conflict. As never before, the prospects for bringing peace to the Holy Land seemed bright.
In the end, none of this came to pass. As a mechanism to advance the cause of global peace and harmony, the war proved a total bust. Apart from restoring Kuwaiti sovereignty, Operation Desert Storm solved remarkably little. With a defiant Saddam Hussein still hunkering down in Baghdad, even the security of the Gulf itself remained uncertain. By the time Desert Storm’s first anniversary had arrived, the journalist Rick Atkinson wrote, the war was well on its way to becoming “a footnote, a conflict as distant as the Boxer Rebellion of 1900.” For most Americans a conflict that had briefly seemed to mark a turning point in history was fast becoming “irrelevant.”
As if to emphasize the point, the hostilities thought to have ended at Safwan soon resumed. In March 1991, under the guise of keeping Saddam in his “box,” the United States and Great Britain launched a program of coercive intimidation intended to ensure that Iraq would remain militarily weak. This campaign of recurring air strikes and demonstrations continued for more than a decade.* To support this open-ended quasi-war, beefed-up U.S. military contingents remained in the region, operating out of a network of bases in Saudi Arabia, Kuwait, Turkey, and elsewhere. Viewed in some quarters as occupiers, these U.S. forces became themselves the target of attack. The upshot was this: the event that President Bush’s advisors had enthusiastically marketed as the One Hundred Hour War became instead an amorphous conflict that dragged on indefinitely. Instead of reducing the likelihood of U.S. troops going into harm’s way, as President Bush had predicted, it produced precisely the opposite effect.
The disappointing results produced by Operation Desert Storm gave rise to this revisionist interpretation: while the troops had done all they were asked to do, the politicians had screwed up. President Bush had assigned Schwarzkopf the wrong task. Whatever mistakes might have occurred in the desert, the really big gaffes were committed back in the Oval Office.
Put simply, the president had failed to adjust U.S. strategy to take into account success being achieved on the battlefield. The ease with which U.S. and allied forces had ousted the Iraqi army from Kuwait (the agreed-upon coalition mission) had created the opportunity to solve the Saddam Hussein problem once and for all. The Bush administration should have seized that opportunity. Whether out of timidity, a lack of imagination, or simply because events were outrunning his administration’s decision cycle, President Bush chose instead to stick with the original, and in retrospect too narrowly drawn, mission.
The president had misunderstood the problem, which was not Iraq annexing Kuwait, but Saddam Hussein ruling in Baghdad. As long as the Iraqi dictator remained in power he would menace the entire Middle East. Peace and stability in the region therefore required Saddam’s removal. Baghdad rather than Safwan was the proper place to settle things. In failing to grasp this essential fact, Bush and his advisors had erred fundamentally. Viewed from this perspective, Operation Desert Storm represented not a great victory but a squandered opportunity.
No group articulated this revisionist interpretation of Desert Storm with greater fervor and persistence than the very same militarists (mostly but not exclusively Republicans) who in 1991 had cheered President Bush as a courageous and farsighted statesman. Once Bill Clinton gained control of the White House in 1993 and embraced Bush’s strategy of containing Iraq, these hawks began agitating for aggressive efforts aimed at eliminating Saddam. Regime change in Baghdad, they believed, was sure to fix things. Jousting in the skies above Iraq wasn’t good enough. They hankered for the real thing.
Here, from an essay written in 1998, is Robert Kagan, a prominent militarist, making the case for an outright invasion of Iraq. Likening Saddam to Adolf Hitler—Kagan refers to the Iraqi dictator as “Herr Hussein”—Kagan depicts the existing policy of containment as hardly better than supine appeasement: “The only solution to the problem in Iraq today is to use air power and ground power, and not stop until we have finished what President Bush began in 1991. An air campaign is not enough. . . . Only ground forces can remove Saddam and his regime from power and open the way for a new post-Saddam Iraq whose intentions can safely be assumed to be benign.” Kagan expresses confidence that victory will come easily and produce a windfall of positive effects. “A successful intervention in Iraq,” he breezily predicts, “would revolutionize the strategic situation in the Middle East, in ways both tangible and intangible, and all to the benefit of American interests.”
The point of citing this passage is not to suggest that it carried any particular weight in shaping policy. Yet views such as Kagan’s illustrate the trend of opinion as Operation Desert Storm lost some of its luster. The victory narrative that briefly vaulted the likes of Schwarzkopf to the status of Great American Hero no longer retained any persuasive authority.
* This war (after the [‘Desert Storm’] war) continued without pause and little public notice until the Anglo-American invasion of Iraq in March 2003. During this period U.S. and British combat crews flew hundreds of thousands of sorties and launched thousands of weapons at Iraqi air defense sites, communications centers, and other targets. Terry Boyd, “Operation Northern Watch: Mission Complete,” Stars and Stripes, March 31, 2003; “Operation Southern Watch,” http://www.absoluteastronomy.com/top...Southern_Watch
During the 1980s, Washington had quietly collaborated with Saddam Hussein, ignoring his many crimes while supporting Iraq in its war of aggression against the Islamic Republic of Iran. Throughout the 1990s, with the Iran-Iraq War now ended, American policymakers and pundits discovered those crimes and elevated the Iraqi dictator to the status of global bogeyman. Saddam’s mere survival now seemed an intolerable insult, and U.S. policy became increasingly personalized as a result. Out of that fixation with Saddam emerged this new consensus, shared by Republicans and Democrats alike: the key to achieving peace and stability in the Middle East was to complete the task that President Bush had foolishly left undone in 1991. [The clearest expression of that consensus was the Iraq Liberation Act, passed by unanimous consent in the U.S. Senate and by a vote of 360-38 in the House of Representatives. This legislation declared it the policy of the United States to oust Saddam Hussein from power. President Bill Clinton signed the bill into law on October 31, 1998.]
The events of 9/11 found a second President Bush in office and many prominent anti-Saddam militarists occupying positions of influence. Iraq was in no way involved in the terrorist attacks of September 11, 2001, of course. Yet for those who had obsessed about Saddam for so long, this fact proved utterly irrelevant. The “global war on terror” offered a made-to-order opportunity to test what had by now become an article of faith: the only problem with Operation Desert Storm was that it hadn’t gone far enough. The new Bush administration now set out to amend this perceived defect. Through regime change in Baghdad the United States would, in Kagan’s phrase, “revolutionize the strategic situation” and reap handsome benefits.
Informed by such large ambitions, Operation Iraqi Freedom, launched in March 2003, differed from Operation Desert Storm in important ways. Speed, not deliberation, now became the name of the game. Quagmire? An emphasis on “shock and awe” would preclude any such possibility. As for foot-dragging or hand-wringing generals, they were either marginalized, ignored, or subjected to public humiliation. Senior civilian officials, most prominently Secretary of Defense Donald Rumsfeld, left no doubt about who was calling the shots. The concept that Rumsfeld devised for toppling Saddam aimed to blow through the Iraqi army, get to Baghdad as fast as possible, and have done with it.
U.S. forces did blow through the Iraqi army, occupy Baghdad, and overthrow the Baath Party regime, all in a matter of weeks. To all appearances the younger President Bush had triumphed, thereby outdoing his father: “Mission Accomplished.” At first blush, the victory achieved over Iraq in 1991 had seemed definitive. Surely this victory actually was definitive, marking, President Bush promised on May 1, 2003, “the arrival of a new era.”
Appearances proved deceptive, however. Once again, bringing the conflict to a tidy conclusion proved elusive. The Anglo-American invasion of 2003 transformed Iraq from a crumbling dictatorship into a failed state. Soon thereafter an ethnically based civil war engulfed the country, while radical Islamists infiltrated Iraq to wage jihad against occupying infidels. What was intended to be a short conventional war morphed into a protracted, very ugly, and very costly unconventional one. U.S. forces found themselves caught in the middle.
Operation Iraqi Freedom defied the president’s insistence that major combat operations had ended. The war lingered on and on, costing the United States thousands of dead and wounded and hundreds of billions of dollars. Worse, rather than solving problems, Saddam’s removal saddled Washington with onerous new problems, regime change in Baghdad further emboldening anti-Western forces not only in Iraq but elsewhere in the region.
Colin Powell had given the older President Bush high marks for waging his Persian Gulf War in ways that avoided its becoming another Vietnam. A variant of Vietnam, sans the jungles and rice paddies, now all but consumed the younger Bush’s presidency.
What had gone wrong? Secretary Rumsfeld quickly emerged as a favored scapegoat, tagged with having failed to plan adequately for “Phase IV,” all the tasks inherent in occupying and rebuilding Iraq after Saddam’s removal. The challenges cropping up following the fall of Baghdad had caught American leaders, military no less than civilian, flatfooted. A string of unimpressive American generals—Tommy Franks, John Abizaid, Ricardo Sanchez, and George Casey—spent several painful years struggling to figure out exactly what had hit them.
Yet amid the floundering, those who had planned and orchestrated the first U.S. military encounter with Saddam Hussein could claim a certain vindication: on second thought, their reluctance to march on Baghdad when the opportunity first presented itself back in 1991 may not have been so terribly misguided after all.
As Americans today reflect back on twenty years of war and quasi-war in Iraq, they might think a bit more kindly about the generalship and the statesmanship displayed way back when that conflict was young. Perhaps Generals Franks, Schwarzkopf, and Powell didn’t do such a bad job after all. Perhaps George H. W. Bush knew a few things that George W. Bush ought to have absorbed. Certainly any mistakes committed by Bush I and his lieutenants appear almost trivial in comparison to the blunders perpetrated by Bush II and his circle.
Taking this two-decade-long narrative as a whole, Operation Desert Storm may even represent a high point of sorts. For American soldiers serving in the Persian Gulf it has been mostly downhill since. Yet despite all that has ensued since General Schwarzkopf swaggered into Safwan, and despite all that his successors have (presumably) learned since, ringing down the curtain on the U.S. military misadventure in Iraq continues to pose daunting problems. Why has war there proven to be so interminable?
The real explanation lies with this depressing fact: from 1991 to the present, American policymakers, along with the senior U.S. military commanders serving as their agents, have committed the most fundamental of errors. Simply put, in the Persian Gulf and in the so-called Greater Middle East more broadly, they misconstrued the problem. Having done so, they devised inappropriate solutions. The persistent reliance on those solutions exacerbated actual problems to which Washington has remained steadfastly oblivious.
For decades, two assumptions have formed the basis for U.S. policy in the Persian Gulf, the provenance of both those assumptions traceable back to the Carter Doctrine of 1980. According to the first assumption, the key to establishing Persian Gulf stability lies in fostering a balance of power congenial to the United States. Yet in attempting to create such a balance, Washington has sought more than simply an equilibrium among the major Gulf states. It has sought to place the United States itself in a position to manage that balance, with Washington having the final say on matters determining the course of events in the Gulf. Furthermore, U.S. policymakers have sought to change the character of Gulf states, with an eye toward making them less susceptible to irresponsible behavior and more amenable to Washington’s coaching or direction.
Always, however, the United States has clung to the view that the region consists of a more or less fixed number of legitimate states governed by more or less legitimate governments—in short that the Persian Gulf (and the Middle East more broadly) does not differ structurally from Europe or Latin America. Implicit in this perspective is a tendency on the part of U.S. policymakers (operating in a largely post-Christian milieu and blind to the history of American imperialism) to undervalue the importance of Islam and to ignore the legacy of Western colonialism and postcolonial meddling in the region.
According to the second assumption, the key to orchestrating and managing a Persian Gulf balance lies in the adroit application of American power, either directly or through proxies. Policymakers have accepted as axiomatic this proposition: that American activism—diplomatic, economic, but above all military—serves to reduce the sources of regional conflict, thereby advancing core American interests. If instability persists in the face of U.S. exertions—as it has—then the antidote is to be found in trying harder, bringing to the task more resources, almost inevitably expressed in terms of an enlarged military presence and agenda.
In essence Washington’s concept of balance in the Persian Gulf implied the region’s incorporation into the post-1945 Pax Americana. Bluntly, the phrase balance of power was a code word for hegemony.
The historical record suggests that these two assumptions are false. Adherence to the Carter Doctrine over the past three decades has vastly enlarged the scope of U.S. commitments to the Persian Gulf (and across the Greater Middle East). It has also resulted in the expenditure of American resources in staggering quantities. Yet these exertions have served not to reduce but to inflame the sources of conflict. The region has become not more stable but less. Proponents of violent anti-Western Islamism have had little difficulty garnering support and even recruiting foot soldiers. Terrorism has become epidemic. U.S. hegemony meanwhile has remained a chimera.
Washington’s response to the Gulf crisis of 1990-91 illustrates in microcosm the abiding defects of Washington’s preferred approach to policy. Operation Desert Storm and its aftermath help us understand why U.S. efforts time and again produce outcomes radically at odds with professed American intentions, and why wars in that region once begun prove so difficult to end.
Deliberations within the Bush I administration leading up to Desert Storm routinely cited the concept of a regional balance as both source and solution to the Persian Gulf crisis. “The Iraqi invasion of Kuwait was possible because of a collapse of the regional balance of power”: such was the analysis offered by a typical National Security Council paper, which then went on to insist that reestablishing a regional balance ranked as “a key security objective.” In maintaining any such restored balance the United States would necessarily play an ongoing and pivotal role.
So too would Saddam Hussein’s Iraq . . . The Bush I administration wanted to render Iraq incapable of further aggression without leaving a gaping hole in the center of the Persian Gulf. Doing so meant walking a fine line. As one study by the National Security Council put it, in gauging the punishment to be visited upon Iraq, the United States needed to avoid “so weaken[ing] Iraqi military capability as to create a power vacuum in the region.”*
*“War Termination,” n.d. [January 1991], Robert M. Gates Files, Office of the National Security Council, Bush Presidential Records, Bush Library (hereafter cited as Gates Files). For more on the importance of incorporating Iraq into a restored Gulf balance of power, see U.S. Embassy Riyadh to Secretary of State, “U.S. and Coalition War Aims: Sacked Out on the Same Sand Dunes, Dreaming Different Dreams?,” December 30, 1991, Haass Files. The U.S. ambassador to Saudi Arabia at the time was Charles W. Freeman.
Even as preparations for Operation Desert Storm proceeded, however, the White House had begun to place the upcoming offensive in a much broader context. Saddam’s expulsion from Kuwait would mark not an end point but a new beginning. Up to this time the Persian Gulf had lagged behind Western Europe and East Asia in the hierarchy of U.S. strategic interests. The war was going to change that. With the Cold War now all but ended, from now on the Middle East was likely to rank “first in terms of threats to our interests and the need for the United States to act.”
Whether the upcoming fight proved to be tough or easy, the administration saw victory over Iraq as a foregone conclusion. The big question was how the United States should capitalize on the success gained on the battlefield. Within the administration the preferred answer emphasized three components: the creation of new regional security structures, increased emphasis on opening up and modernizing the nations of the Islamic world, and intensified efforts to broker peace between Israel and its Arab neighbors. Common to all three was this feature: each implied a more forceful U.S. role and an enlarged U.S. presence.
Actions undertaken by the Pentagon pursuant to the imperatives of the Carter Doctrine already provided a basis for more robust regional security arrangements. From a U.S. perspective these included “negotiating access agreements, building bases, having highly mobile forces, prepositioning equipment, conducting joint exercises and the like”—the very actions that were even then facilitating the ongoing buildup of U.S. forces in the Gulf. Although administration analysts wanted the U.S. military “to have as low a profile as possible,” they also wanted to be sure that forces would be readily available when needed. The idea was to do things with the minimum amount of publicity. “Details about the size of U.S. forces,” one State Department memo recommended, referring to postwar arrangements, “should be as vague as possible consistent with U.S. domestic requirements.”
The Joint Chiefs of Staff concurred in the need to develop “an enhanced U.S. capability to rapidly reinsert forces,” advocating “expanded security guarantees” to friendly nations throughout the region. To keep the door to the Persian Gulf propped open, the JCS called for “bilateral agreements, exercises, and planning teams to establish and maintain visible military-to-military relationships in the region.” Adding to the quantities of U.S. military equipment stashed in the region would also be helpful. The JCS was counting on the increased “credibility” accruing to the United States for “defending one Arab state against another” to make the region’s various emirs, kings, and presidents amenable to such a program. The bottom line was simply this: for the national security apparatus, Operation Desert Storm promised to open up a vast new array of opportunities for deepening military engagement.
The second element informing administration thinking about the region related to the methods of governance prevailing in and around the Persian Gulf. Within the Arab world “impulses toward political openness and democratic institutions” met with resistance in the form of “regime preoccupation with short-term stability.” The result was “unfulfilled basic human needs,” which in turn left “little space for political and social fulfillment.” In the wake of Desert Storm these matters too were about to become Washington’s business.
Administration analysts understood that tampering with Arab political arrangements carried considerable danger. In the near term, promoting political and economic liberalization might actually increase rather than guard against instability. Still, on balance, the promised gains outweighed those risks.
Bolstering the administration’s sense of self-confidence was its clear understanding of exactly what had been holding back the peoples of the Middle East. In Washington the antidote to backwardness and stagnation appeared all but self-evident. Arab countries “need to be receptive to new ideas.” They had to understand that “to be competitive in an increasingly interdependent world, they have to be able to deal with the free flow of information and technology.” The upshot was that the “Middle East should become less insular.” To flourish, Arab nations would have to “keep their borders and airwaves open, encourage broad-based participation in political decision-making, and promote free economic choices and reduced government control over economic and trade decisions.” It was therefore incumbent upon Arabs to “move toward greater economic integration with the industrialized world.” In terms of political economy Arabs had no real alternative but to subscribe to the norms prevailing throughout the Pax Americana. Only then would the peoples of the Middle East “become full beneficiaries of the new world order.”
Finally, although the administration did not delude itself into thinking that resolving the Israeli-Palestinian conflict would be easy, renewed support for the so-called peace process constituted the third leg of Washington’s planned post-Desert Storm strategy. Administration officials expected the war to transform U.S.-Israeli relations as well as U.S.-Arab relations and therefore to enhance Washington’s leverage as mediator. Making progress on this front was an urgent priority for the United States. As long as the United States remained close to Israel, Arab perceptions that Palestinian grievances were being ignored would adversely affect U.S. relations with the entire Islamic world. In that regard, wrote Richard Haass, the National Security Council’s senior director for Near East and South Asian affairs, “we may deny linkage as a matter of policy, but we cannot ignore it as a fact of life.” Here too victory over Iraq had the potential to be a game changer.
The strategy devised as an adjunct and follow-up to Operation Desert Storm—emphasizing the assertion of military primacy, the export of liberal democratic capitalism, and the mediation of conflicts viewed in Washington as extraneous—was entirely consistent with the post-World War II tradition of American statecraft. Such an approach had worked quite well in Europe and at least passably well in East Asia and Latin America.
What policymakers in 1990 and 1991 could not see or refused to entertain was the possibility that conditions in the Islamic world—Islam itself seldom qualified for mention in the policy papers circulated by the National Security Council—differed, and those differences rendered methods applied elsewhere not only irrelevant but even counterproductive. Rather than inducing acquiescence, continuing efforts to assert U.S. military primacy since Operation Desert Storm—for example, stationing U.S. forces in Saudi Arabia for a decade—have inspired sustained resistance. Efforts to promote liberal values have made little headway. And efforts to impose a solution ending the Israeli-Palestinian conflict have done little except to breed cynicism.
The real failure afflicting Operation Desert Storm, evident both at Safwan and in Washington, was not mishandled war termination. Rather the real failure lay in a grotesque misunderstanding of the context from which the Persian Gulf War had emerged: a persistent refusal on the part of the West to allow the people of the Islamic world to determine their own fate in their own way. And that refusal contributed mightily to the rise of violent anti-Western Islamism.
However inadvertently, Operation Desert Storm advanced the Islamist cause. This became its principal legacy. As a consequence, the partial victory over Saddam in 1991 helped set the stage for what Americans in the wake of 9/11 chose to call their Global War on Terror. That conflict, subsequently redesignated the Long War, continues today with no end in sight, even as the Obama administration clings stubbornly to the conviction that asserting military dominance, exporting liberal values, and advancing the peace process promise a way out. Flawed twenty years ago, that approach to strategy remains no less flawed today. Washington’s insistence to the contrary makes prospects of terminating the Long War any time soon nearly nonexistent.