The American Interest

By Owen Harries and Tom Switzer

A Washington adage holds that someone commits a “gaffe” when he inadvertently tells the truth. This seemed to be what a U.S. policymaker did two decades ago when he mused about the limits to U.S. power in the post-Cold War era. On May 25, 1993, just four months into the Clinton Administration, a certain senior government official—the new Undersecretary of State for Political Affairs and a former president of the Council on Foreign Relations—spoke freely to about fifty journalists on condition that they refer to him only as a “senior State Department official.” Gaffe or no gaffe, Peter Tarnoff’s frank remarks at the Overseas Writers Club luncheon set off serious political turbulence in the foreign policy establishment.

Tarnoff’s message was that, with the Cold War over, America should no longer be counted on to take the lead in regional disputes unless a direct threat to its national interest inhered in the circumstances. To avoid over-reaching, he warned, U.S. policymakers should define the country’s interests with clarity and without a residue of excessive sentiment, concentrating resources on matters vital to the nation’s own well-being. That meant Washington would “define the extent of its commitment and make a commitment commensurate with those realities. This may on occasion fall short of what some Americans would like and others would hope for”, he recognized. The U.S. government would, if necessary, act unilaterally where its own strategic and economic interests were directly threatened, but it would otherwise pursue a foreign policy at once less interventionist and more multilateral.

President Clinton’s deferral to European demands on the Bosnian crisis, Tarnoff added, marked a new era in which Washington would not automatically lead in international crises. “We simply don’t have the leverage, we don’t have the influence, we don’t have the inclination to use military force, and we certainly don’t have the money to bring to bear the kind of pressure that will produce positive results anytime soon.”

At first glance, there was nothing new here. As far back as the Nixon Doctrine, U.S. officials had spoken of more voluble burden-sharing, of asking allies to do more on their own behalf, and of a variable-speed American foreign policy activism that could be fine-tuned to circumstances. And then, within a year of the Soviet Union’s collapse, Bill Clinton won a presidential election in part because he promised to “focus like a laser” on domestic issues. Neither during Nixon’s tenure nor in 1993 did anyone use the phrase “to lead from behind”, but this new locution is consonant with the basic thinking of those earlier formulations. In some ways, “leading from behind” is the third coming of a seasoned and generally sensible idea.

Nor was Tarnoff saying anything outside the implicit consensus of presumed foreign policy “wise men” at the time. Many dedicated Cold Warriors and leading foreign affairs experts, Republicans and Democrats alike, had been arguing for the previous three years that, having just won a great victory, it was time for America to embrace a more restricted view of the nation’s interests and commitments. “With a return to ‘normal’ times”, Jeane Kirkpatrick argued in The National Interest in 1990, “we can again become a normal nation—and take care of pressing problems of education, family, industry and technology. . . . It is time to give up the dubious benefits of superpower status and become again an . . . open American republic.” Nathan Glazer proposed that it was “time to withdraw to something closer to the modest role that the Founding Fathers intended.” William Hyland, editor of Foreign Affairs at the time, wrote, “What is definitely required is a psychological turn inwards.” And according even to Henry Kissinger, the definition of the U.S. national interest in the emerging era of multipolarity would be different from the two-power world of the Cold War—“more discriminating in its purpose, less cataclysmic in its strategy and, above all, more regional in its design.”

Notwithstanding all this, and no doubt to his own surprise and chagrin, Tarnoff’s remarks started a firestorm of fear and indignation almost the moment reports of his background briefing hit the press. As one Australian newspaper correspondent observed at the time, “the reaction to his words could scarcely have been more dramatic if he had stripped naked and break-danced around the room.”

Talking heads denounced not just Tarnoff but the new President for whom he spoke as “isolationist” and “declinist”; some beheld a “creeping Jimmy Carterism” with an Arkansas accent. Foreign embassies went into overdrive as diplomats relayed the news back home. The White House quickly attempted to distance itself from what its press secretary dismissed as “Brand X.” The Secretary of State, Warren Christopher, stayed up all night making personal phone calls to journalists and appearing on late-night television to reassure the world that America’s global leadership role was undiminished. In a hastily rewritten speech, Christopher pointedly used some variant of the word “lead” 23 times. Meanwhile, rumors swirled that the official (only later identified as Tarnoff) was about to lose his job. Yet for all his allegedly neo-isolationist sins, the hapless official remained employed. No apology or explanation was forthcoming.

The incident could only be described as bizarre. Here was a senior U.S. policymaker saying something that official Washington had deemed outrageous. Yet Tarnoff was not proposing that America pull up the drawbridge from a messy world, nor was he suggesting that Washington withdraw from any international institutions, let alone from any Cold War alliances. He was merely recognizing the reality of the emerging post-Cold War world and America’s place in it: that, depending on the circumstances and the nature of its interests, the United States could and would pick and choose where to commit its formidable weight. Given the Clinton Administration’s great reluctance to intervene in the Balkans, one could well be left with the impression that, if Tarnoff was guilty of anything, it was not of misstatement, but of excessive candor. He probably would have been wise to take William Henry Harrison’s advice: “The more you talk, the less you should say.”

 

This episode from almost exactly twenty years past is worth recalling as President Obama, at the beginning of his second term, attempts to define a new U.S. role in the world that fits America’s changed circumstances and more limited resources. Today, in the wake of the financial crisis and the Iraq and Afghanistan wars, Americans are rediscovering the costs and limits of the use of force. Consequently, the President appears to be in the process of putting into practice the central tenets of what very briefly became known as the Tarnoff Doctrine.

One theme is that it is time for the United States to focus on “nation-building at home”, something Obama has stressed several times in recent years. The other theme is the vague, subtle emphasis on caution, prudence, balance, modesty and proportionality in dealing with adversaries and competitors. But just as Tarnoff’s nuanced remarks about discriminating leadership ignited a firestorm two decades ago, Obama’s efforts to recast the U.S. role in the world have stirred controversy.

The President’s recent appointments of Chuck Hagel as Defense Secretary and John Kerry as Secretary of State, neither of whom is likely to press Obama to adopt an ambitious agenda, have been met with derision and even outrage on the Right. A Wall Street Journal editorial called the new national security team a “flock of doves.” Among other things, Hagel is a skeptic of preventive war against Iran and a supporter of cutting the “bloated” Pentagon—views that, according to opinion polls, command majority support. Yet according to the Weekly Standard’s William Kristol, he is “out on the fringes”, and Republican Senator Lindsey Graham thinks he “is out of the mainstream of thinking . . . on most issues regarding foreign policy.”

Two years ago, an unnamed senior Obama adviser inadvertently coined a foreign policy doctrine in suggesting the United States was “leading from behind” in Libya. According to the New Yorker’s Ryan Lizza, the term represented “a different definition of leadership than America is known for”, and reflected the reality that the United States now lacks the relative power to impose its will and leadership across a more pluralistic world. Once again, the response was hostile. It’s “not leading. It is abdicating”, argued syndicated columnist Charles Krauthammer. It “sounds rather pathetic”, lamented Maureen Dowd in the New York Times. It “just doesn’t work in today’s world”, warned veteran foreign affairs columnist David Ignatius. And the phrase, editorialized the Washington Post, reflects “extraordinary U.S. passivity” and a “pattern of torpidity” during the Arab Spring.

But however awkward the language, the argument is less foolish than the reaction it provoked. The message is not that passivity is a foreign policy virtue, however passive the second Obama Administration may prove to be. Rather, it is that, depending on the circumstances and the national interest, it is sometimes appropriate for Washington to take the lead in mobilizing multilateral action and proffering credible threats, but sometimes it is not. Being selective and discerning can be, but is not necessarily, a mask for generic weakness or a disposition to explain away difficult choices. There are times when the United States is wise and wily, rather than weak and wayward, to be not so visible during a crisis. Sometimes it’s better to play a key logistical role from the sidelines rather than take the lead, as with the 2011 Libya campaign. (Never mind, for the time being, whether that was a wise operation on the whole to begin with.) Discrimination and selectivity should take precedence over consistency and comprehensiveness whenever resources are limited, which is virtually always. That was precisely Tarnoff’s point, too. Plus ça change?

One of the disconcerting things about Obama’s more muscular critics, from neoconservatives to liberal hawks, is that they seem to favor American global military interventionism as a binding principle rather than as a course to be pursued only when the effort is commensurate with the stakes, and when other measures have failed. Besides, to insist that the United States must lead globally is to imply that everybody else must follow or else be held in injurious disrepute. That describes a forced subordinate relationship alien to America’s basic avowal of freedom and egalitarianism. The United States should be concerned about spreading responsibility and initiative across the world, and building cooperative, not subordinate, relationships.

If such arguments were not persuasive in 1993, they ought to be today. Geopolitical circumstances have changed significantly since then. Public opinion on foreign policy has shifted dramatically. A commander in chief—who has run foreign policy from the White House more than any President since Nixon—recognizes the need to balance ends and means. A policy of indiscriminate U.S. global interventionism (“global leadership” in common, high-toned, jingoistic parlance) is not strategically sustainable, even if the economic resources were available to pay for it, which they certainly are not.

 

As to changed circumstances, consider that in the 1990s the United States emerged the victor from the Cold War without a shot being fired. It achieved global hegemonic status not by especially assertive or ambitious action on its own part, but by a combination of the self-induced collapse of its rival and the discipline not to gloat about it. Despite the pseudo-historical mythologies concerning the 1992 Defense Planning Guidance document, the George H.W. Bush Administration had no plan in place to exploit its unexpected dominance, nor did its successor adopt one during the rest of the decade. Tarnoff was admonished for stressing a more discriminating U.S. world role while advising that “our economic interests are paramount.” Yet his boss, Bill Clinton, was generally given a pass for showing little interest in foreign affairs, for having no grand doctrine, and for insisting that the economy should be the country’s main preoccupation throughout his two terms in office. Certainly, the U.S. armed forces were maintained at a high level despite post-Cold War reductions in force strength and the defense budget. But new commitments, over and above the maintenance of America’s alliances, were scrutinized mercilessly, and any undertaken were kept limited in time and scope.

Sensibly, the U.S. military often seemed more concerned with effective exit strategies than with implementing ambitious, open-ended, foreign policy projects—whether that concerned the Balkans, Haiti, Somalia or places entirely avoided, like Rwanda. This restraint involved no loss of prestige or influence. Indeed, restraint and discrimination can also win respect. Hegemons are always regarded with great suspicion and resentment by other states. If they throw their weight around, they are likely to bring into being hostile coalitions formed to balance and contain them. That is partly why Warren Christopher’s successor at Foggy Bottom, Madeleine Albright, was able to boast, convincingly if unnecessarily, that the United States was “the indispensable nation” that “stands tall and sees further than other countries into the future.” As Thomas Friedman of the New York Times summed it up: “Today’s era is dominated by American power, American culture, the American dollar and the American navy.” Strikingly, the comparative restraint of the new hegemony allowed its dominance to be accepted with comparatively little complaint. Of all people, it was the French Foreign Minister of the day, Hubert Védrine, he who also coined the term “hyperpower”, who best reflected the prevailing view: “American globalism . . . dominates everything. Not in a harsh, repressive, military form, but in people’s heads.”

Meanwhile, the United States was experiencing what to all appearances was the longest economic expansion in its peacetime history. To paraphrase candidate Clinton in 1992, everything that should be up—wages, growth, stock market—was up, while everything that should be down—inflation, joblessness, deficits—was down. Hardly anyone recognized at the time that many of the positive indices reflected bubbles that would later pop, and not until 1999 did the so-called Financial Services Modernization Act lay the ground for the debacle to come in 2007–08. America was being widely hailed, abroad as well as at home, as the miracle economy, master for all time of “the great moderation.”

That was then. Today, things look very different. The dollar is weak. The debt mountain is of Himalayan proportions. Budget and trade deficits are alarming. Infrastructure is aging. The AAA credit rating is lost. Economic growth is exceptionally sluggish for a nation that is nearly four years out of a recession. And whereas twenty years ago U.S. military power was universally considered awesome in its scope, today, after more than a decade of its active deployment, the world is much more aware of its limitations and costs. It is decidedly less impressed.

In hindsight it is clear that the terror attacks of September 11, 2001 constituted a major inflection point. As America’s alleged “holiday from history” came to an abrupt end, outrage over the attacks, taken together with the mental habits of American hegemony and American exceptionalism, gave U.S. leaders a clear, overriding sense of purpose. A new central organizing principle, absent since containment had been retired with laurels, arose so crisp and clear as to be too good to be true—because it was. A mature and experienced President might have been able to resist, modify or deflect the temptations of the moment. George W. Bush not only yielded to them; he gave them authoritative voice. Thus September 11 shifted the balance in favor of those who saw things in sweeping terms—away from prudence and modesty toward an ambitious and assertive use of U.S. power to topple tyrannical regimes and export democracy far and wide.

Of course, there was nothing wrong with the promotion of democracy and its associated values as one goal among many in U.S. foreign policy. That theme has been present in one form or another for two centuries, qualified by global circumstances and relative American power. It made sense after September 11, 2001, so long as it was pursued with care and modest expectations, when it did not conflict with more demanding goals, and when the conditions for its success were favorable. There was everything wrong with pretending (or, even worse, genuinely believing) that it should be the overriding purpose of policy, one that had to be achieved in quick time and by the application of American force. As it happened, in both Iraq and Afghanistan America has been its own worst enemy—inefficient, incompetent and overconfident.

Now, a great power can cope with a bloody nose on the battlefield. The United States coped with Vietnam, after all, in due course. What is more serious is a general loss of credibility and prestige associated not with an episode of bad judgment, but with a generic inability to conduct one’s affairs responsibly. It is the latter that truly diminishes a great power’s ability to lead and persuade other nations. Washington’s demands and requests are increasingly ignored by its longtime foes in Tehran and Pyongyang as well as its largest aid recipients in Cairo and Jerusalem. Its influence is fading at global summits, too: in the G-20, where the Germans reject Washington’s loose fiscal policy prescriptions; at climate conferences, where the Chinese, Indians and Brazilians ignore U.S. calls to reduce their carbon footprint; and in security talks, where the Pakistanis refuse to sever ties between their intelligence services and the Taliban.

True, the United States is still the world’s largest economy and the issuer of its reserve currency. It is also its lone military hegemon. It is also true that as hegemons go, Washington is not feared. It did not seek to exert absolute control over international events at the height of the Cold War, but that had to do with the Soviet Union’s power as well as with the novelty of “an empire by invitation.” Nonetheless, the United States exercises less influence today than it did before the fall of the Berlin Wall, not least for the elemental reason that its allies no longer crave its protection as much as they once did. Although many proponents of an activist and ambitious leadership role acknowledge this point, they do so on the grounds that a reckless President has embraced American weakness. As they see it, decline is less a condition than a choice. The reality is that the age of American unipolarity, which began with the collapse of Soviet Communism, is being replaced by a world populated by new players. To the extent that China, India, Turkey and Brazil become more assertive internationally, it challenges the notion that an activist America can impose its will and leadership across the globe. And a notion thus challenged is a notion weakened, regardless of objective circumstances.

 

As to public opinion, in 1993 Tarnoff’s proposals, however current among a dominant section of the foreign policy cognoscenti, represented too radical a change for the American people so soon after the end of the Cold War. Today, the environment is different. Simply put, the American public has tired of the world. It is suffering from foreign policy fatigue. For seventy years—first against fascism, then communism and more recently Islamist fanaticism—it supported and sustained a foreign policy and defense commitment of the most intense and comprehensive kind. Everything else was subordinated to it, and all sorts of domestic concerns were neglected. As America withdraws from Afghanistan—a war that has lasted longer than both world wars of the 20th century combined—there is an overwhelming feeling that it is high time for the nation to concentrate on its own neglected internal problems.

Last September, the Chicago Council on Global Affairs published its highly regarded annual survey of American public opinion and U.S. foreign policy. Among other things, it found that Americans are now less likely to support the use of force in many circumstances and are more likely to endorse defense spending cuts; that fewer Americans are concerned about global terrorism as a “critical” threat to the United States than at any time since September 11, 2001; and that majorities oppose either a unilateral U.S. preemptive strike on Iran or even an attack authorized by the UN. It also highlighted the dovish views of Millennials, those born after 1980: 52 percent believe that the United States should “stay out” of world affairs.

The upshot is that Americans today appear less concerned about foreign policy than at any time since the heyday of isolationism between the world wars. In a polity that is acutely sensitive to public opinion, veritably driven by polls, focus groups and the relentless 24/7 news cycle, this means that foreign policy is severely downgraded in the calculations of politicians. The most obvious exemplar is that more Republicans now care about downsizing government than about a strong defense, and so have learned to love sequestration.

We had some warning of this. In last year’s presidential election it was the dog that did not bark. Mitt Romney’s 40-minute speech at his party’s national convention devoted only one paragraph to foreign affairs and never mentioned Afghanistan. According to The Guardian, his campaign did not even have a senior foreign policy staffer. (Rather, it had two dozen somewhat less-than-senior advisers who often disagreed with each other.) During the October foreign policy debate, both candidates sounded positively McGovernesque, keen to pivot to domestic affairs while warning about getting bogged down in another Middle Eastern quagmire. The President reiterated his stump line that “nation-building begins at home” on three occasions, while Romney mentioned “peace” 12 times. The challenger’s insistence that “we can’t kill our way out of this mess” sounded something like an antiwar catchphrase at Woodstock. All of this explains why the President dedicated only a few paragraphs to foreign affairs in his Second Inaugural Address. Again, to the extent that such attitudes prevail, they are inimical to the notion of America as world policeman.

 

Finally, as to ends and means, we are wise to remind ourselves that one impediment to clear thinking about foreign policy since the collapse of Soviet Communism has been the conspicuous divorce of ends and means. In the 1940s, Walter Lippmann penned the single most important sentence ever written about American foreign policy:

Without the controlling principle that the nation must maintain its objectives and its power in equilibrium, its purposes within its means and its means equal to its purposes, its commitments related to its resources and its resources adequate to its commitments, it is impossible to think at all about foreign affairs.

This idea likely inspired Tarnoff’s remarks twenty years ago. But the American political class would have none of it, whether in foreign policy or, as we now see so vividly, in domestic policy. By willfully ignoring the principle that ends must have some relation to means, a huge gap opened between America’s global pretensions and its ability to finance them. And that, exactly, is a gap President Obama seems intent on closing.

In the past four years, the President has jettisoned his predecessor’s sweeping doctrine of preventive warfare, aggressive unilateralism and a clear division between those “with us” and “against us.” Washington has kept out of hot spots such as the Syrian civil war while playing down, without ever ruling out, the prospect of a preemptive strike on Iran’s nuclear facilities. This new strategy does not stop Obama from escalating drone strikes against terrorists, nor will it derail his Administration’s “pivot” of U.S. forces toward East Asia. Nor does it necessarily obviate a strike against Iran if all else fails. But it does allow him to reorder priorities in favor of discrimination and selectivity, to get a good running start on once again matching resources and aspirations.

None of this is about a new isolationism. It’s an approach that stresses an unsentimental focus on national interests, pursued with a prudent calculation of commitments and resources, while focusing on rebuilding the U.S. economy. Such realism would require a significantly leaner defense budget, and it would require a far more scrupulous definition of which wars are and are not wars of necessity.

Meanwhile, Lippmann’s axiom is being comprehensively ignored by many conservatives. They demand assertive American global leadership on the one hand and reduced domestic spending on the other. They make the downsizing of the Federal government a high priority while simultaneously demanding an ambitious foreign policy inspired by “vision” and a sense of mission—the kind of foreign policy that has been instrumental in building up the power of the state over the past seventy years, and indeed the power of states throughout history. For politicians to will the ends but balk at providing the means is one of the deadly sins of foreign policy. A disjunction between ambition and resources—the attempt to sustain greatness on the cheap—is highly dangerous in terms of American lives and interests.

The imbalance between ends and means isn’t just apparent in a reluctance to commit American treasure. It is also apparent in the reluctance to commit American blood. A legacy of the Iraq and Afghanistan wars is an unwillingness on the part of the American public to take casualties on behalf of less than truly vital challenges. American servicemen, for instance, were kept well out of harm’s way in the campaign to topple Muammar Qaddafi in 2011. While such concerns may be admirable in humanitarian terms, they are incompatible with a superpower posture and pretensions to global leadership. If a nation is not prepared to take casualties, it should not engage in the kind of policies likely to cause them. If it is not prepared to take casualties, it should resign itself to not having the kind of respect from others that a more resolute nation could expect.

 

The process of implementing a Tarnoff-style doctrine in a second Obama Administration would certainly not be without problems. For one thing, deeply ingrained habits from the Cold War persist despite radically altered circumstances. Habit is one of the most powerful forces in human life, not least because it is such a labor-saving device, making it possible to dispense with thought.

The clearest example today is the reluctance of lawmakers to make serious cuts to defense spending, except perhaps through the self-imposed deus ex machina of sequestration. Never mind that the Pentagon budget has nearly doubled since September 11, 2001. Never mind that Washington spends about as much on the military as the world’s remaining nations put together. And never mind that defense spending often falls substantially as the United States ends military actions, as it is doing today and has been doing for some years now. Ignoring changed circumstances is not only laziness, but also functionally necessary for some. It is not cynical to note that many of the policies and institutions of the Cold War and post-9/11 era represent huge vested interests in terms of careers, contracts, reputations, consultancies and so on.

If habit—seeing American priorities as they were rather than as they are—has been an impediment to clear thinking about foreign policy, another has been the belief in American exceptionalism: namely, that as a nation born of an idea, embodying a principle, dedicated to a proposition and claiming a manifest destiny, the United States is fundamentally different from other nations in its very nature and hence in its behavior. This belief underpins the notion that America is a benign hegemon—a provider of “public goods” to the international community, the keeper of order and stability, the promoter of freedom, an unthreatening and disinterested presence to all except those who are evil.

A belief in American uniqueness is not, of course, new. Americans have always maintained that they are more moral and disinterested than other states and that power politics is a game played by the rest of the world but not by them. Nor is it confined to the neoconservative wing of the Republican Party. It was not Paul Wolfowitz but Hillary Clinton who argued in 2010 that “Americans have always risen to the challenges we have faced. It is in our DNA. We do believe there are no limits on what is possible or what can be achieved.” In recent times, however, America has entered a period of upheaval that has shaken the national psyche: persistently high unemployment, a debt larger than gross national product, diminished net wealth, a rising China, a series of what Rudyard Kipling called “the savage wars of peace”, and a polarized and dysfunctional political system beholden to special interests. Many Americans are in an increasingly foul mood and are looking for someone to blame. According to a Gallup survey in October last year, 91 percent of the American people thought the nation was on the wrong track, an historic vote of no-confidence.

The danger of American exceptionalism is that it discourages compromise and flexibility, and encourages a sense of omnipotence. And although the United States has shown an impressive ability to bounce back from past setbacks—Civil War, Depression, Pearl Harbor, Vietnam—it will not enjoy the kind of absolute global supremacy that it held in the post-World War II period. Nor is it likely to command the unrivalled power and prestige that accompanied the so-called unipolar moment of the early 1990s. If political leaders fail to prepare the nation for this reality, they risk leaving the American people open to sad surprise in an era where not every option is available and resources are not unlimited. Despair and frustration could continue to roil the political climate and contribute to what the liberal historian Richard Hofstadter famously identified as “the paranoid style in American politics.”

This is the background against which the debate over U.S. foreign policy is now being waged, and the anger and anxieties of the nation perhaps explain why Obama has been so reluctant to give voice to the Tarnoff doctrine, or something like it, even though he accepts its central tenets. After all, what Tarnoff said twenty years ago—that America no longer has the will, wallet or influence to impose an active and ambitious global leadership across the world—is actually true today, even if it was not at the time. President Obama has an opportunity, and a responsibility, to build support for a foreign policy that stresses realism, restraint, modesty, limits, selectivity and discrimination.

To do that, however, the President will have to stop playing hide-and-seek with his strategic thinking. It is one thing for the Administration to embrace prudence and discretion in the deployment of America’s still unmatched military power, and to realize the need to be more discreet in using its power and more inclined to act in concert with other nations. It is another thing for the President to state to the American public where Washington should lead and where it should not. During the Tarnoff flap two decades ago, the distinguished foreign affairs commentator Michael Mandelbaum argued that without such priorities America would dissipate its energies on secondary issues and that, in consequence, an administration might find itself bereft of support for its most important and justifiable priorities. That remains a valid concern.

President Obama rightly wants to focus on “nation-building” at home. But only when he articulates an approach that emphasizes prudence and modesty in the most forceful and eloquent manner will such a doctrine win public acceptance. His challenge is to match resources with aspirations, to bring commitments and power into balance. He should also explain that the United States has left the realm of necessity and is increasingly entering the realm of choice, where the key word is not “and” but “or”, and the key question is not “how?” but “why?”

Pilita Clark, “Uncle Sam Looks Inward”, Sydney Morning Herald, June 21, 1993.

Lippmann, U.S. Foreign Policy: Shield of the Republic (Little, Brown, 1943), pp. 9–10.

Mandelbaum, “Like it or not, we must lead”, New York Times, June 9, 1993.
