The Case for Questioning U.S. Leadership
The election of an “America First” president has produced much reflection on the potential passing of a period of U.S. global leadership. If the country were to abandon the pursuit of international influence in favor of a narrower defense of national interest, then what in the world would change? Any attempt to answer this question must begin by identifying what, or whom, the United States has been in the business of leading.
In recent years, many advocates of U.S. leadership have determined its purpose to be the preservation of an entity they call the “liberal international order.” Although versions of the concept enjoyed popularity within the Obama administration and among individual scholars of international relations, it was only after the election of 2016, as foreign policy pundits rushed to name a status quo in crisis, that the phrase “liberal international order” entered broader circulation. The novelty of the concept and the anxiety responsible for its sudden relevance were evident in the title of a Foreign Affairs anthology from February 2017—“What Was the Liberal Order? The World We May Be Losing.”
As debates over its future intensified, the “liberal international order” ceased to sound strange and even acquired the acronym LIO. Its proponents commonly describe it as a combination of rules, norms, and institutions that govern international behavior and depend on the United States’ example and enforcement. Lists of the LIO’s components are typically long and rarely uniform, but a representative sample might include democracy, human rights, and open markets alongside the United Nations, NATO, and the World Bank. Because the internal elements of the “liberal international order” are so varied, its contours remain rather vague.
Its chronology, by contrast, is fixed. The most consistently asserted fact about the “liberal international order” is that it emerged when the United States assumed global leadership after World War II and has persisted, however precariously, into the present. Although advocates of the “liberal international order” allow space for it to change over time, their central claim is its continued existence. The result is a fairly flat conception of history, in which the entirety of the Cold War (among other conflicts) and the upheavals of the late 1960s and early 1970s (including the abandonment of the postwar monetary system) are folded into a long period characterized by U.S. leadership of the “liberal international order.”
Whether it is analytically or politically productive to posit a long-lasting LIO is the subject of debate. One question is whether the United States did, in fact, establish a coherent set of rules, norms, and institutions that have governed relations among nations since World War II. Another is whether to support the political program that the “liberal international order” has come to represent. This second question raises a rarely asked third. If attitudes toward the LIO are shaping debates in the present, then what other ideas might have motivated policymakers in the past? After all, those who led the United States during the period now associated with the establishment and preservation of the “liberal international order” had no such project in mind. Their own aims derived from different concepts.
A new book by Michael Kimmage, The Abandonment of the West: The History of an Idea in American Foreign Policy, investigates the impact of one such concept. The broad outline of his argument is, by now, familiar. The United States, he claims, assumed a role of international leadership in the wake of World War II, has abandoned it under Donald Trump, and must work to regain it in the future. However, as the book’s title indicates, Kimmage’s subject is not the newly invented LIO, but the far more familiar West. And instead of searching for an enduring order, he traces the trajectory of an important idea. As Kimmage writes in his Introduction, the “liberal international order” has “no cultural resonance” and speaks to “no recognizable American identity and no historical pattern.” The idea of the West, on the other hand, has long influenced how Americans perceive themselves in relation to the rest of the world.
The Abandonment of the West is a valuable attempt to understand the course of American leadership in terms of a key motivating concept. Kimmage clearly demonstrates that early Cold War policymakers, in particular, prioritized the preservation of the West and often framed international developments and American aims in terms of historical and cultural reference points derived from the story of Western civilization. Just as important, Kimmage’s focus on the long-term evolution of the idea of the West ensures that the United States’ assumption of international leadership does not emerge ex nihilo in the middle of the twentieth century, but instead arrives as the product of previous political and cultural developments. Finally, he makes sure to match depth with breadth, charting the rise and fall of the idea of the West “across multiple worlds.” In addition to examining the perspectives of presidents and policymakers, Kimmage turns his attention to intellectuals such as William McNeill, whose 1963 “academic blockbuster” The Rise of the West: A History of the Human Community reflected a White House worldview, and Edward Said, whose 1978 book Orientalism “permanently undermined the prestige of Western civilization curricula at American universities.” Kimmage completes the picture by revealing how architectural fashions, such as the neoclassical craze of the late nineteenth and early twentieth centuries, reflected American relations with an imagined West.
Kimmage’s account of the development and disappearance of the Western Civilization course in American universities is particularly effective. He shows the early genesis of such programs in the late nineteenth century, their proliferation and standardization after World War I, and their fall from favor following the cultural shocks of the 1960s and 1970s. Indeed, this rise-and-fall narrative is so compelling that Kimmage’s effort to reveal a parallel pattern in the conduct of American foreign affairs suffers by comparison. Whereas the content of the Western civilization course remained remarkably consistent throughout the five or six decades of its dominance in university curricula, the importance of the West waxed and waned over the years in Washington. As Kimmage himself acknowledges, locating the concept in a policymaking context is no simple task: “Confusingly, the West is a place, an idea, a value—or places, ideas and values,” and, to make matters more difficult, many of these meanings “are mutually contradictory.”
Kimmage tends to define the West in three different ways. The first focuses on the shared historical and cultural heritage believed to bind the United States to Europe. This definition corresponds most closely to the notion of Western civilization as it appeared in higher education. The second refers to the “geopolitical West,” understood as a specific region sharing a set of material capabilities and strategic interests. This is the sense in which Kimmage refers to an “immensely powerful” transatlantic West after the end of the Cold War. The final meaning is the furthest removed from common conceptions of the West, and yet this is the one that Kimmage chooses to emphasize above all others. In his introduction, he claims to follow “one particular definition of the West in American foreign policy. This is the West embedded in a Euro-American narrative of self-government and liberty.” Thus, Kimmage conflates the idea of the West with specific values apparently derived from the European Enlightenment.
Accordingly, Thomas Jefferson is a champion of the West, since he “forged the bond between the Enlightenment and the main lines of American foreign policy.” Never mind that he, together with many of his compatriots, defined American liberty in direct contrast to European tyranny. In the same vein, Kimmage categorizes World War I and World War II as wars for the West because Woodrow Wilson and Franklin Roosevelt represented them as crusades on behalf of liberty and self-government. In fact, neither president made a habit of invoking a virtuous West or an evil East. On a rare occasion when Roosevelt did refer to “Western civilization,” it was to insist that “the old term […] no longer applies,” since events had, for the first time, brought humanity together into “a real world civilization.” Finally, Kimmage claims that Clinton, Bush, and Obama “stayed firmly and sincerely within the Jeffersonian—or Western—tradition of American foreign policy” despite avoiding the word “West,” whereas Donald Trump should be seen as “the first non-Western president,” because he has “abandoned the Jeffersonian West of liberty, multilateralism and law in favor of an ethno-religious-nationalist West.”
Kimmage tacks between different definitions of the West in order to provide himself with a way out of a present-day dead end. On the one hand, he worries that the American public has lost the sense of collective purpose that identification with the West once provided. On the other, he recognizes—and, for the most part, accepts—the reasons for the idea’s abandonment. Selectively sublimating a civilizational and geopolitical West into a set of Enlightenment values provides him with an apparent solution to this problem. In the end, he proposes that a post-Trump United States should adopt “a dual-track approach” in foreign policy, “promoting the West when dealing with Europe and promoting the ideals of liberty and self-government when dealing with countries that are not European.” American universities, meanwhile, should foster a shared “historical and cultural purpose” among students, not by reviving the old Western civilization programs (Said’s critique still stands), but by teaching the West of “liberty and self-government.” It is difficult to imagine such a tenuous and selective application of a watered-down West providing the American public with a renewed sense of purpose.
If the West seems unlikely to serve as a unifying ideal in the future, then what should we make of its role in the past? If we set aside the values of liberty and self-government, which Americans have often seen as evidence of their superiority over Europe rather than as signs of a shared transatlantic heritage, then what remains of the West in the history of American foreign affairs? Kimmage himself points to the period “between 1945 and 1963” as the “heyday” of the transatlantic West and notes that “the presidents who spoke most often and most glowingly about the West were the Cold War presidents.”
Yet it was not the West that they claimed to lead. During the crucial period when the United States assumed long-term commitments across the globe, the primary focus of American foreign policy was the leadership of the “Free World.” As Harry Truman declared on May 9, 1950: “All our international policies, taken together, form a program designed to strengthen and unite the free world in its resistance to the spread of communism.” Throughout the 1950s and 1960s, American presidents and policymakers invoked the Free World far more frequently than the West in both public statements and classified documents, while major statements of U.S. strategy routinely claimed that strengthening and defending the Free World was the country’s primary aim.
Although the Free World became a key foreign policy category after the onset of the Cold War, the idea’s roots extend at least as far back as the early twentieth century. In fact, the concept’s arc in U.S. foreign affairs follows more or less the same pattern as Kimmage’s account of the West in American culture—a rise and fall stretching roughly from World War I to the 1970s. In the case of the Free World, it was Woodrow Wilson who first formulated many of the concept’s key postulates. In arguing for a declaration of war against the Central Powers, Wilson drew a stark line between democracies and autocracies and insisted that the very “existence of autocratic governments” constituted a “menace […] to peace and freedom” all over the globe. Only the elimination of such naturally aggressive governments, he argued, would make “the world itself at last free.”
A similar logic came to the fore during World War II. In the two years before the United States formally aligned with the Allies, Roosevelt repeatedly told the American public that technology had made the world small and interconnected, that a threat to peace anywhere endangered peace everywhere, and that the Nazis aimed to spread slavery across the world. Frank Capra’s 1942 Oscar-winning film Prelude to War illustrated the point, opening with an image of two spheres side by side, one white, the other black, one the “free world,” the other the “slave world.” The opposition between the two worlds was total: “One must die, one must live.”
Like Wilson, Roosevelt assumed that the elimination of freedom’s existential foe would create the conditions for a more peaceful and cooperative world. But whereas Wilson’s League of Nations treaty had failed to earn the approval of the Senate, Roosevelt’s efforts to establish a postwar international organization received the full support of the Republican Party and the American public. By the end of the war, many Americans had embraced the slogan of “One World,” a phrase popularized by the former Republican presidential candidate Wendell Willkie in his 1943 bestseller of the same name. Although the world government that some desired never came close to fruition, the international institutions that eventually emerged from the war took a more or less unitary globe for granted.
The onset of the Cold War put an end to the promise of global cooperation, and Willkie’s “One World” once again became two. NSC 68, submitted to the president on April 14, 1950, proposed to make this global binary the basis of American grand strategy. The document described a zero-sum conflict between a “free world” and “slave society,” declared “a defeat of free institutions anywhere” to be “a defeat everywhere,” and concluded by calling on the United States to build up “the political, economic, and military strength of the free world.” The outbreak of the Korean War a few months later transformed this outlook into orthodoxy. Finally, the passage of the 1951 Mutual Security Act institutionalized a Free World foreign policy, combining all of the country’s aid efforts into a single program with the express purpose of building up the “defenses of the free world” as a whole and for the long haul.
For the rest of the 1950s and into the 1960s, American perceptions of the Cold War and U.S. leadership remained inextricably tied to the notion of the Free World. Throughout this period, the concept was characterized by three peculiar qualities. First, its existence was taken for granted. Although the phrase “free world” had hardly appeared in print before the late 1940s, almost no one noticed its novelty when it finally entered the political lexicon. Unlike the “cold war,” another neologism from the same period, whose sudden appearance and essential strangeness struck Americans from the start, the Free World seemed to be a natural category. Everyone tried to define the Cold War because its meaning seemed a mystery; almost no one defined the Free World because its meaning seemed self-evident. As a proper noun naming an enduring entity, the Free World was indeed new. But much of the logic underpinning the concept had long been familiar in U.S. foreign affairs.
The Free World’s second key quality was its fixed geography. During World War II, Capra’s black and white “worlds” had existed as metaphors and symbolic abstractions, indeterminate in time and space; the Free World of the 1950s, on the other hand, occupied specific territory on a single globe. Because of the geographical consolidation of the Communist bloc, American policymakers could, and did, divide the entire globe into two distinct and mutually exclusive areas. This made the zero-sum relationship between the two “worlds” starkly apparent and required a readiness to defend the entire length of the Free World’s frontiers. It also permitted the government to generate specific figures regarding the Free World’s population and material resources.
The final, and perhaps most troublesome, characteristic of the Free World was that its champions defined it in negative terms. For American policymakers, the Free World was always identical to “the non-communist world.” It was only after circumscribing a pathological zone characterized by freedom’s total absence that the leftover areas acquired apparent coherence. On the rare occasions when the U.S. government defined the Free World, it always did so indirectly, by listing the countries that it excluded. The non-communist nations that made up the Free World became members by default. Any further distinctions or commonalities among them remained strictly secondary. Apparent rejections of Free World membership, by India’s Jawaharlal Nehru for example, might cause consternation, but, until the 1960s at least, did not produce any real revision of the category.
The U.S. government recognized the potential weakness of Free World solidarity and made a concerted effort to fill the concept with positive content, spending considerable resources on official and unofficial propaganda programs across the globe. The material thus disseminated derived much of its inspiration from the story of Western civilization, the tradition of the Enlightenment, and the tenets of Christianity, but American officials deliberately diluted these sources in order to make the product suitable for global consumption. Invocations of a vague and all-encompassing spirituality, for example, tended to replace Christianity as the opposite of Communist atheism. More minor tweaks were also possible, as evidenced by a 1952 U.S. Information Agency pamphlet entitled “The March of Freedom,” which added sections on the Babylonian Hammurabi and the Indian Asoka to an otherwise traditional Western narrative focusing on Greece and Rome, the Magna Carta and the Declaration of Independence.
The nominal Free World was not convinced. In 1952, the State Department distributed a questionnaire entitled “Key Words in American and Free World Propaganda” to 157 posts abroad, proposing forty-two positive keywords “to be applied to the United States and the free world” and fourteen negative keywords “to be applied to communist imperialism.” The results indicated no “universally effective propaganda terms” across different regions. The various members of the Free World, it seemed, shared no easily identifiable ideals. At the same time, American conservatives such as James Burnham and Whittaker Chambers believed that the thin universalism of Cold War liberalism represented a betrayal of Western Christian civilization. They were right in the sense that the United States had not chosen to champion a positively defined West. The Free World the country had actually committed to lead represented a quite different construction—one ultimately based on a presumed absence of Kremlin control rather than any real commonality of culture or values.
Over the course of the 1960s, the concept of the Free World gradually lost purchase as the problems it addressed became less pressing and the tensions it produced grew more pronounced. The institutionalization of the United States’ global commitments and the routinization of the Soviet-American rivalry made the Free World less essential as a mobilizing concept. Meanwhile, the Vietnam War lessened the appetite for U.S. global leadership both domestically and in Europe and served to discredit the automatic defense of the Free World’s frontiers. Just as significantly, the countercultural movement called the Free World’s freedom into question, relocating the main threat from Soviet totalitarianism to domestic institutions—from George Orwell’s Big Brother to Ken Kesey’s Big Nurse. Finally, many of the post-colonial states opted for freedom as national sovereignty over freedom as anti-communist solidarity, eventually forcing the former champions of the Free World to recognize a separate Third World, effectively blurring the lines of the original binary.
The abandonment of the Free World occurred at about the same time and for some of the same reasons as the abandonment of the West. But those who rejected the West did so explicitly, whereas the Free World faded from view as unobtrusively as it had first appeared. As a result, attempts to revive the West have typically encountered outspoken resistance. The logic underpinning the Free World, on the other hand, has remained readily deployable. In the 1970s and 1980s, the idea of a fundamental struggle between freedom and its opposite continued to structure the United States’ relations with the Soviet Union, although references to the Free World itself appeared infrequently, and the government’s insistence on a strict global dichotomy became much less intense. Then in the 1990s, after the sudden collapse of the Communist bloc, many American policymakers assumed that it might finally be possible to “make the world itself at last free.”
The destruction of the Twin Towers on September 11, 2001, brought a reversion to a dichotomized world. George W. Bush immediately proclaimed a fundamental global conflict between freedom and terrorism. In 2004, he appealed to the Free World directly, using the phrase more often than any president in any year since 1964. But the term did not stick, and the Bush division of the world proved politically controversial. In 2008, Barack Obama swung the White House back to a “One World” position, promising to end the “War on Terror” and to promote cooperation with the “international community.” By the end of his second term, however, his administration had recognized a divide between the “liberal international order” and a collection of authoritarian adversaries. For much of the foreign policy establishment, the election of Trump provided further proof of the unavoidability of this global conflict. In 2018, Thomas Wright of the Brookings Institution wrote that “there is no prospect on the horizon of a universal liberal order. Instead, there is a free world competing with a neo-authoritarian world.”
Trump’s presidency represents a crossroads not because he is somehow “non-Western,” but because he has seemingly rejected the premise that a threat to freedom anywhere is a threat to freedom everywhere and that only American leadership in confronting this menace will allow democracy to flourish in the United States and to continue its natural expansion across the globe. This assumption lay at the core of the concept of the Free World and has played an important role in American foreign policy for much of the last century. In light of this history, what can be said about the country’s future course?
One possibility is a return to orthodoxy. In a recent Foreign Affairs article, Joe Biden explicitly promised to revive a Free World foreign policy: “The triumph of democracy and liberalism over fascism and autocracy created the free world. But this contest does not just define our past. It will define our future, as well.” According to Biden, the United States must once again “rally the free world” because “no other nation is built on that idea.” And so, in the first year of his presidency, Biden would have the United States “organize and host a global Summit for Democracy to renew the spirit and shared purpose of the nations of the free world.”
This foreign policy has the advantage of long familiarity and the support of much of the Washington establishment. The difficulty is that the new “free world,” like the old, lacks positive content, deriving much of its meaning from the stated fact of American leadership and the presumed threat of existential adversaries. After all, the renewed interest in the Free World and the discovery of the “liberal international order” came in response to a rapid rise in apparent challenges to the United States’ established position in the world. Yet again, recognition of the Free World required the prior identification of its enemies. The difference is that American policymakers in the early 1950s constructed the Free World on the assumption of a geographically and ideologically monolithic antagonist—one who happened to divide the globe along similar lines. The internal eclecticism of the “neo-authoritarian world” makes it a far less productive opposite pole. What deeper “shared purpose” would unite Biden’s “free world”? And how would he decide which states to invite to his summit? The political failure of the War on Terror shows how difficult it can be to mobilize a “free world” when the enemy is too diffuse to define. It also demonstrates the difficulty of ever achieving a decisive victory. Even when the “free world” has ultimately prevailed, as in the Cold War and the two world wars, the ensuing pursuit of “One World” has always encountered new obstacles and produced new global dichotomies.
One possible escape from this cycle is to limit the U.S.-led bloc to an entity that does not define itself against an existential antagonist or identify itself with the entire world. This is the course that Kimmage advocates when he calls for the revival of an American idea of the West. The problem is that “the West” never guided the conduct of U.S. foreign affairs to the extent that Kimmage claims. Although it is true that Americans had a much greater sense of civilizational belonging in the past than they do now, the foreign policy of the postwar presidents was always more global than Western. Meanwhile, the conservative champions of an explicitly pro-Western posture—from James Burnham and William Buckley to Patrick Buchanan and Steve Bannon—have mostly remained marginal critics of establishment policy. It is, of course, possible that a “new cold war” with China would revive the notion of an “East-West conflict.” But as before, a Free World framework would likely predominate, allowing the United States to continue to make appeals to allies and potential partners in Africa and Asia.
A final possibility is for the United States to abandon the project of long-term leadership altogether. This is the position seemingly represented by Trump and, in a different form, by Bernie Sanders. There is certainly a long tradition of envisioning America apart, from John Winthrop’s city on a hill to Washington’s warning against foreign entanglements to the Monroe Doctrine’s hemispheric separation. But American commitments around the globe have ratcheted up to such a degree that a return to “isolationism” in the foreseeable future is impossible in purely practical terms.
A second problem with a policy of retrenchment is the lack of a clear ideological basis for the country’s new role. A defining characteristic of both the “isolationist” and “internationalist” eras has been a more or less continuous belief in American exceptionalism. The country would stay separate in order to remain superior. It would lead because “no other nation is built on that idea.” Both Sanders and Trump have rejected this faith. The first invokes Scandinavian socialism as a superior system. The second conceives of American greatness as a goal, not a given. But so far, neither has come up with a successful alternative ideology. Although they might argue against endless wars, burdensome alliances, or unfair trade agreements, they tend to offer mostly ad hoc solutions that ultimately remain within the previous framework of American foreign policy. On his website, Bernie Sanders declares that “the U.S. must lead the world.” In the 2017 National Security Strategy, Donald Trump boasts that “America is leading again on the world stage.” In both cases, the basis for this leadership role remains undefined.
The original question remains: Whom is America leading and why?
Peter Slezkine is a PhD candidate in history at Columbia University and a fellow at the Clements Center at the University of Texas.