The Most Terrifying Lesson of World War I: War Is Not Always "Short and Sharp"

September 5, 2014 Topic: History, Foreign Policy Region: Europe, United States


Why nations like China, the United States, Russia and others should look back to the Great War for perhaps the most important reason of all.


The centenary of World War I has spawned numerous retrospectives comparing 1914 to 2014. Some have drawn parallels between the complex webs of security guarantees that helped spark conflict in August 1914, and similar treaties and commitments that might lead to war in our time. Others have used the outbreak of “the Great War” as a reminder that economic interconnectedness does not preclude conflict; great-power war today could unravel the globalized economy in much the same way that it did a century ago.

While these comparisons may be valid, there is another, more worrying similarity between 1914 and 2014: a failure to prepare for the possibility of protracted conflict, based on the flawed belief that conventional war between great powers would be brief and decisive. In 1914, the failure to consider or prepare for a conflict that might last years rather than weeks made war more likely by creating the illusion that strategic goals could be achieved at minimal cost. Today, advances in technology have fostered a similar notion regarding conventional war between major powers. And yet militaries have a poor track record when it comes to predicting the character of future wars, particularly in times of rapid technological change. New military hardware may simply increase the destruction wrought by great-power war, much as it did in 1914, without making the conflict shorter or more decisive. Avoiding a similar catastrophe in the 21st century may therefore require deterring military adventurism by planning and preparing for protracted conflict.


Before World War I, military leaders and statesmen failed to comprehend how technological change had altered the conduct of war. A host of late 19th and early 20th century inventions, including smokeless powder, rapid-firing rifles, machine guns, breech-loading artillery with recoil compensation, reconnaissance aircraft, wireless communication, and barbed wire gave defenders an enormous advantage over attacking forces, yet every major power entered August 1914 with offensive strategies and military doctrines designed to achieve quick, decisive victories.

In hindsight this paradox seems obvious, but few contemporary military thinkers fully grasped how these new technologies would interact on the battlefield, largely because their use before 1914 was one-sided. During the wars of German Unification, for example, early versions of some of these weapons were used by one belligerent or another, but not simultaneously by both sides. These conflicts, and particularly the Franco-Prussian War of 1870, left the impression that war between industrialized great powers could be short and decisive. Possible counterexamples such as the American Civil War, the Russo-Japanese War, and the Boer War were marginalized or misinterpreted, while European forces spent much of the period from 1870 to 1914 using their technological edge to overwhelm poorly equipped colonial adversaries. In the years leading up to World War I, therefore, European militaries had the impression that war would be swift, with the advantage accruing to the side that could mobilize fastest.

Worryingly, echoes of this mismatch between the technological means of war on one hand and its doctrinal ways and strategic ends on the other are evident today. Military hardware has advanced at a rapid pace alongside developments in civilian information technology. Capabilities that were either cutting edge or science fiction decades ago—such as precision-guided munitions, unmanned systems, satellites, stealth aircraft, data networks, and cyberwarfare—are becoming more commonplace. Much like the weapons of 1914, however, the use of these systems in combat has thus far been almost entirely one-sided.

Beginning in earnest with the first Gulf War in 1991, the United States and its allies have exploited their technological edge to defeat a string of weaker conventional opponents and inflict disproportionate casualties on irregular foes. Based on the evidence of these lopsided wars, the U.S. military has come to view the combat phase of conventional warfare as a relatively brief affair. Official strategy and doctrine pertaining to conventional war frequently discuss rapid, decisive major combat operations. The topic of protracted conflict is consigned to discussions of irregular warfare, and after more than twelve years of conflict in Iraq and Afghanistan, protracted irregular warfare is viewed as something best avoided. The opening days and weeks of conventional wars are dissected in countless war games and table-top exercises, but less energy is devoted to examining how these scenarios might unfold from day thirty onward, save for occasional discussions of post-conflict stability operations. This is primarily because many U.S. defense planners view a conventional great-power war as unlikely and a protracted great-power war as a virtual impossibility.

U.S. defense spending is another telling indicator of this short-war mentality. Shortages of precision-guided munitions (PGMs) have slowed the pace of U.S. and allied operations in conflicts ranging from Kosovo in 1999 to Libya in 2011; yet PGM stocks remain insufficient to support high-intensity operations for any substantial length of time and there is virtually no slack capacity to increase production. Defense budgets are crafted through a complex bureaucratic and political process, so they do not always perfectly reflect strategic priorities. Nevertheless, the failure to address this persistent shortfall seemingly indicates a lack of planning and preparation for combat operations lasting longer than a few months. A similar situation occurred in 1914. All combatants, but particularly France and Great Britain, experienced shortages of artillery shells. This so-called “shell famine” occurred because military planners had not conceived of the possibility that the war would not be over by Christmas.

The U.S. military is not alone in its belief that modern conventional war will be short and decisive. China, for instance, has spent the last twenty years developing an “anti-access/area-denial” strategy designed to hold U.S. forces at bay while the People’s Liberation Army conducts operations that could include re-absorbing Taiwan by force or seizing disputed territory from one of China’s neighbors. This strategy relies heavily on conventional cruise and ballistic missiles to strike at the air bases and naval forces of the United States and its regional allies and partners. China’s arsenal of missiles is prodigious, but their high cost and long production timelines make these weapons a poor instrument for protracted warfare. Chinese strategy is therefore predicated on winning a “short and sharp” conflict before the United States can bring its full military power to bear.

But what if the present pervasive belief in short conventional wars is spectacularly wrong? Militaries throughout history have failed to correctly guess the character of future wars, particularly during eras of rapid or disruptive technological change. Military leaders on both sides of the U.S. Civil War, for example, believed that it would be a short conflict characterized by maneuver and climactic battles in the style of the Napoleonic wars. Instead, the war devolved into an attritional slog that—with its trenches and attacks on civilian populations and their production capacity—presaged the horrors of early 20th century warfare. Moreover, technology is not necessarily deterministic. The wars of German Unification took place immediately after the Civil War and featured very similar technologies, yet none of them became a protracted conflict. Political will, operational acumen, and chance all play major roles in determining the character and outcome of conflicts.

Accurately guessing the character of future wars based solely on technological factors is difficult; nevertheless, trends suggest that protracted great-power war in the 21st century could be more probable than many strategists or policymakers presently believe. The current proliferation of PGMs and supporting capabilities such as unmanned aerial vehicles could create a firepower-dominant warfare regime similar to that of 1914. When used by both sides in a conflict, these weapons may not enable rapid, decisive operations, but might instead create a massive no-man's land in which large numbers of aircraft, ships, and bases are damaged or destroyed to neither side's lasting advantage. Even if this worst-case scenario does not come to pass, the qualitative and quantitative advantage the United States and its allies have enjoyed over their military rivals since the end of the Cold War is eroding. Given the narrowing of this gap and the possibility that current and emerging technologies may make operational stalemates more likely, planning and preparing for protracted conventional conflict would allow the United States to hedge against the unpredictability of war while also serving as a powerful deterrent against military adventurism.

This could require significant changes to U.S. defense posture. Winning a long war first requires not losing a short one. Although August 1914 is remembered as the month when the fields of northern France and Belgium became a military slaughterhouse, the war could have had a far different result if French and British forces had not halted the German advance at the Marne. U.S. and allied forces must be able to ward off a coup de main and continue fighting. Resiliency may become a higher priority than efficiency, with just-in-time logistics and lean defense production giving way to forward stockpiles of PGMs and additional manufacturing capacity. Protracted war might also necessitate an ability to absorb attrition in personnel and major systems while maintaining combat effectiveness. This could, for example, push acquisition programs to place greater emphasis on quantity, as opposed to the present tendency to build small numbers of exquisite systems.