Idealists Study World War II; Realists Study World War I

September 24, 2019

World War I represents reality as it is, not as we wish it to be.

Two descriptions of the world for your consideration:

Scenario one: a dominant superpower, serving as the unquestioned head of an international alliance and possessing unmatched military and economic strength. This superpower appears exceptionally well led at the military, political, and bureaucratic levels, and possesses the time and space to conceptualize a focused strategy against a specific threat.

Scenario two: several powerful nations operating in a multipolar world, pursuing divergent interests, with none possessing an absolute advantage over the others. Exceptional political and military leadership is lacking, and the rapid pace of change means these nations are carried forward by events over which they exercise little control. Few are certain what a future threat or strategy might look like.

Our first scenario describes the world of 1945: the triumphant United States at the head of a new international order and poised to combat a clear challenger. Our second describes the world of 1918: victorious powers fighting over diverging interests, squandering their opportunities for peace, but convinced in their conceit that the world could be ordered in their image.

Take these descriptions and apply them to 2019, and one will find the second far timelier than the first. One hundred years after the Treaty of Versailles, the parallels seem uncanny; everything old is new again. Those seeking the lessons of history—to the extent that history provides them—should study World War I carefully. Yet the world of 1945 holds far more appeal in the popular imagination and continues to influence policy. Notions of unipolarity, morally unambiguous causes, and great leaders continue to dominate contemporary thinking. Conversely, few long for the bleak and complex days of 1918. Contrast the lessons of the two wars, and one will find the world today resembling 1918, and the world as we wish it to be resembling 1945. Bleakness and complexity are the order of the day.

The Cause: If there is one constant in Western foreign policy, it is that at least one politician will refer to our opponent-of-the-day as Hitler. Nazi Germany represented the worst of humanity, but such references, applied with reckless abandon, threaten to dilute that reality. The “Hitler” label feeds a sense of righteousness which, to phrase it mildly, can prove counterproductive to the national interest. To appreciate the absurdity of this overuse, consider which of the following scenarios is the more likely casus belli: 1) an invader hell-bent on world domination and the wholesale elimination of choice ethnic and religious groups, or 2) a great-power nation, pursuing policies believed to be in its legitimate national interest, miscalculating with tragic consequences. Morality plays make for good stories, but competition is possible without demonization. Understanding the nature of great-power competition and strategic miscalculation à la World War I will likely serve the national interest better than searching for a “little Hitler” in every conflict.

The Leadership: Political and military leaders are often viewed as giants astride the world stage, imposing their will on their surroundings. This is understandable, given the popular perceptions of World War II leadership: Churchill, Roosevelt, Eisenhower, Montgomery, et al. However, skepticism is warranted. As Calvin Coolidge accurately noted, “heroism is not just in the man, but in the occasion.” Again, incorporating such narratives into foreign policy entails a significant risk of overreach. Leaders usually cannot bend complex events to their will. In the regular order of the military sphere, armies are far more often led by a Joseph Joffre or a Conrad von Hötzendorf—men of limited imagination and rigid dogma—than by a George Patton or an Erwin Rommel. Grandiose political leaders more often take the form of Woodrow Wilson—who combined the zeal of a preacher’s son with the inflexibility of a career academic—than of Winston Churchill. Studying the hapless generals of the Western Front and the architects of the Versailles disaster is more likely to instill humility in a political or military leader than attempting a second coming of Franklin D. Roosevelt or Patton.

The Effects: The idea of a world made anew is one of the most enduring aspects of the post–World War II narrative. The United Nations, the North Atlantic Treaty Organization, the Bretton Woods system—all appear to testify to the supremacy of international cooperation. In a regularly repeated World War II narrative, these institutions demonstrated that the world had learned the lessons of two great wars and responded appropriately. There is some truth to this, but more than ideology was at work. These institutions were direct results of the United States’ exceptional global power and, in certain ways, a response to the binary reality of the Cold War. Both that binary structure and the United States’ singular power are no longer strategic realities, and thus a twenty-first-century facsimile is impossible.

The end of World War I, by contrast, involved much of the same intent but not the same conditions. The results of a hubristic effort to remake the world in the image of the victors—several great powers rather than one—were appropriately disastrous. Efforts by temporarily strong nations to order the world in their image, over the protestations of the temporarily weak, can easily produce unintended consequences. Even today, many of the world’s regional conflicts are a direct result of the borders drawn at Versailles. The difference between the lessons of these two narratives is the difference between humility and hubris.

The Pace of Change: While World War II coincided with extraordinary changes in technology and the strategic environment, many of these resulted from a deliberate planning process. The United States possessed the time and space to consider a postwar world even as it fought. Changes in technology such as the atomic bomb, while momentous, were the result of years of planning and development. Many of the changes wrought by World War II, while significant, were anticipated.

Contrast this with World War I, in which great nations appeared carried along by events rather than controlling them. From colorful uniforms to khaki; from cavalry charges to tanks and airplanes; from multinational empires and historic dynasties to brand-new nation-states—all took shape within four years. Studied in this context, World War I poses relevant questions for the modern leader. Can nations plan for events they cannot anticipate? The current pace of global change suggests that preparing for the unknown matters more than assuming the future can be foreseen.

Conclusions: This is not to suggest that World War II contains no lessons of contemporary value, only that the lessons of World War I bear more relevance today. Even if one accepts the entirety of the idealist’s World War II narrative as truth, one should at the very least recognize that World War II represented a unique set of circumstances that no longer exists. As global conditions inevitably change, seeking to make the present a facsimile of the past is unlikely to end well. The utility of history lies in avoiding past failure rather than trying to recreate past success.

Sam Canter is a Captain in the United States Army who has served as both an Infantry and Military Intelligence officer. He holds an MA in Military History from Norwich University, and his writing has appeared in the National Interest, RealClearDefense, and Divergent Options. The views expressed in this article are those of the author and do not necessarily reflect those of the U.S. Army or Department of Defense. 

Image: Reuters