Inglorious Revolutions
Revolutions rarely produce stable democracies and human rights overnight—and it's foolish to expect otherwise.
TWO AND A HALF years after it began, the revolution was widely considered a quagmire, even a disaster. Rebels had made disappointingly little headway against the forces of the hated tyrant. The capital and the country’s second major city remained under his control. Foreign powers had provided sympathy, but very little real aid. And despite promising to respect human rights, rebel forces were committing widespread abuses, including murder, torture and destruction of property. In short, the bright hopes of an earlier spring were fading fast.
This may sound like a description of Syria today, but it also describes quite well the situation of another country: the young United States in the winter of 1777–1778. George Washington had taken refuge in the miserable winter encampment of Valley Forge. Philadelphia (then the capital) and New York were both in British hands. France had not yet agreed to help the new republic militarily. And in areas under rebel control, loyalists were being persecuted—far more than most American school textbooks admit.
There is little reason to think that conditions in Syria will turn around the way they did in the United States between 1778 and 1781, when the American revolutionaries managed to eke out a military victory. But the comparison illuminates a different point. Historically, very few revolutions have been quick successes. They have been messy, bloody, long, drawn-out affairs. Victory has very rarely come without numerous setbacks, and, unfortunately, without abuses carried out by all sides. It has generally taken many years, even decades, for the real gains, if any, to become apparent. Yet today, international public opinion and international institutions usually fail to recognize this historical reality. There is an expectation that revolutions, where they occur, must lead within a very short period to the establishment of stable democracy and a full panoply of human rights, or they will be viewed as failures.
Consider, for instance, the disappointments that followed the Arab Spring and the resulting worldwide hand-wringing. Thomas Friedman, that great barometer of elite American conventional wisdom, wrote in May 2011 about the young Arabs who had begun to “rise up peacefully to gain the dignity, justice and self-rule that Bin Laden claimed could be obtained only by murderous violence.” Less than two years later, he was lamenting that “the term ‘Arab Spring’ has to be retired,” and comparing events in the region to the seventeenth century’s massively destructive Thirty Years’ War, in which areas of Central Europe lost up to a third of their populations. Many other commentators throughout the world now write off the Arab Spring as a disaster and failure, pure and simple. But arguably, not the least of the problems bedeviling the Arab revolutionaries of the last two and a half years has been the absurdly inflated expectations they have had to live up to. Put simply, they have been asked to achieve the sort of rapid and complete success that hardly any predecessors, including in the West, ever managed. The same has been true of the “color revolutions” of the past decade in the former Soviet Union, which commentators like Melinda Haring and Michael Cecire, in a recent Foreign Policy article, have been quick to label “terribly disappointing.”
But think for a moment about the point that some other major revolutions had reached two years or so after they began. Two years after the first shots of the American Revolution, Washington had not even gotten to Valley Forge, and victory looked very far off indeed. Two years after the beginning of the French Revolution, a huge and dangerous conflict was opening up between the country’s political factions, and that summer King Louis XVI severely exacerbated it by trying to flee France and join an enemy invasion force. Many more years of chaos and bloodshed would follow. Two years after the beginning of the Latin American revolutions against Spain, the First Venezuelan Republic had already collapsed, with Spain reestablishing its authority. In each of these cases, the revolutionaries themselves also failed, often quite spectacularly, to behave in a manner that modern human-rights activists would have condoned. Even the West’s paradigmatic example of a “good revolution,” Britain’s “Glorious Revolution” of 1688, was only “bloodless” and “quick” if one equates Britain with England, and fails to consider the extended series of destructive wars that convulsed Ireland and Scotland for decades thereafter. The historian Steve Pincus has written that “far from being aristocratic, peaceful, and consensual,” the Glorious Revolution was “popular, violent, and extremely divisive.”
WHY DO most observers today seem so oblivious to the historical record of revolutions? What are the consequences of this obliviousness? And what might it actually take, in the way of concerted international action, to help revolutions like the one in Egypt take place in a way that accords better with observers’ ideal script?
In addressing the first of these questions, one place to start is with a rather odd development: current expectations about revolutions in fact represent something of a return to a very old understanding of such events. Up until the mid-eighteenth century, the word “revolution” meant little more than “political upheaval.” Revolutions were held to be sudden, unpredictable and largely uncontrollable. History books told the story of countries’ violent changes of dynasty almost as if they were a series of earthquakes. Revolutions were things that happened to people, not things that people themselves were seen as capable of consciously directing. A typical usage can be seen in the title of a pamphlet by the seventeenth-century English radical Anthony Ascham: A Discourse: Wherein is Examined, what is Particularly Lawful During the Confusions and Revolutions of Government. Samuel Johnson’s dictionary gave “revolution” as a synonym for “vicissitude.” Tellingly, at the beginning of what we now call “the American Revolution,” very few people actually described what was taking place as a “revolution.” The word does not appear in the Declaration of Independence, or in Thomas Paine’s great 1776 pamphlet Common Sense (except in reference to 1688 in Britain). In 1777, John Adams could write to his son John Quincy about “the late Revolution in our Government,” implying that the event was already finished and in the past.
These ideas began to change in the late eighteenth century, with significant consequences for the events that would continue to convulse the Atlantic world for half a century. In America, by 1779 it was becoming clear that the political and social transformations set in motion by the War of Independence had yet to run their course. In that year, Richard Henry Lee wrote to Thomas Jefferson about “the progress of our glorious revolution,” and Jefferson himself finally began to use the word in reference to American events. By 1780, John Adams was writing to his wife Abigail about “the whole course of this mighty revolution,” treating it as something still taking place. Yet even then, he did not present it as a process he himself had a hand in directing, but as a great natural upheaval sweeping him along.
It was in France where the most decisive conceptual transformation took place. As the country’s “old regime” began to crumble in 1789, observers immediately started to refer to what was going on as a “revolution” in the traditional fashion. Then, within a matter of months, they began speaking of it less as a sudden and cataclysmic event than as an ongoing process. Soon they went even further, presenting the revolution as something that could be controlled and directed. Stanford’s Keith Baker, who has written luminously on this shift, characterizes it as one from revolution as “fact” to revolution as “act.” Before this moment, the word “revolutionary” did not exist, and would have made little sense to people, referring as it does to people or actions that actively drive revolutions forward. But in September 1790, the radical deputy Bertrand Barère referred to the demolition of the Bastille as “a truly revolutionary act,” and soon his colleague Georges Danton was describing himself as “a steadfast revolutionary.” In 1792, Maximilien Robespierre renamed the executive committee of Paris’s municipal government the “General Revolutionary Council,” making it the first political institution in history to bear such a title.
Baker’s colleague Dan Edelstein has added a further fascinating wrinkle to the story, noting that by 1792–1793, “the revolution” seemed to be taking on a life of its own, becoming, in the eyes of its advocates, a quasi-mythic force and a source of political legitimacy. After armed crowds stormed the royal palace in 1792 and overthrew Louis XVI, there were calls to put the king on trial. The radical Louis-Antoine Saint-Just, however, insisted that the people had already delivered a verdict through their revolutionary action. Any procedure that might exonerate the king therefore amounted to “putting the Revolution itself on trial,” in the words of his patron, Robespierre. A year later, with France at war with much of Europe, Saint-Just made a remarkable speech demanding that the ruling National Convention formally suspend the new constitution it had just approved, and declare the government “revolutionary” until the end of hostilities. He insisted on a full overhaul of the government’s personnel and procedures, arguing that “the laws are revolutionary; those who execute them are not.” And he added the following, remarkable sentence: “Those who make revolutions, those who wish to do good, must sleep only in the tomb.”
This new understanding of revolutions partly reflected the simple fact that the French Revolution was indeed a very different sort of event from its predecessors. Instead of its principal political changes coming to an end quickly, culminating in a document such as a declaration of independence, a process of explosive radicalism continued to build, leading to the deadly Reign of Terror of 1793–1794. But the new ways of thinking themselves provided a spur to radicalization, by giving the political actors of the day a way to see “revolutions” as exceptional historical moments in which ordinary practices and principles could be suspended. The leading figure of the Terror, Robespierre, developed an entire political theory on this basis. In a legislative report he wrote in the winter of 1793–1794, he distinguished between ordinary “constitutional” government, whose role was to govern a republic, and “revolutionary” government, whose role was to found the republic. In the latter, he argued, the state needed far greater leeway, both to protect its citizens and to ensure that institutions would be given a durable form. “The Revolution,” he thundered, “is the war of liberty against its enemies.” Several of Robespierre’s allies openly urged him to become a “dictator,” a title still then associated with the ancient Roman military office of the name, and which they viewed favorably. In theory, the dictatorship would end once the republic had been durably founded, and the revolution completed, but given the vastness of the radicals’ ambitions, it was not clear when this goal would be reached. “Revolution” was becoming not just a process, but also a utopian one that might extend into the future, indefinitely.
This new concept of revolution as what G. W. F. Hegel would call a “world-historical” event helped to justify the French revolutionaries’ most outlandish projects. These included a new calendar, which started with the birth of the French Republic; the attempt to replace Christianity either with state-sponsored atheism or Robespierre’s deistic “Cult of the Supreme Being”; plans for universal education and charity; and, dangerously, the transformation of a war against other European powers into a crusade for universal human liberation. Robespierre and his allies went so far as to characterize “revolutions” as millennial projects that could literally change human nature. “The French people seem to be about two thousand years ahead of the rest of the human race,” he mused in the spring of 1794. “One is tempted to regard them as a separate species.”
IT IS HARD to exaggerate the hold that this French model of revolution exerted over imaginations throughout the world in the nineteenth and twentieth centuries. In country after country, generations of would-be revolutionaries plotted to take power and instigate upheavals of similar or even greater ambition. Starting in the mid-nineteenth century, the model was potently combined with socialist visions of history as a story of class struggle, but the idea of revolution itself as an ongoing, consciously directed process remained much the same. In Russia, China, Southeast Asia, Latin America and the Middle East, self-proclaimed “revolutionary” regimes took power with goals of nothing less than transforming human beings into something new and better. In Terrorism and Communism, written at the height of the Russian Civil War, Leon Trotsky (a great admirer of the French Revolutionary Terror) expressed sentiments very close to those of Saint-Just and Robespierre:
We were never concerned with . . . prattle about the “sacredness of human life.” We were revolutionaries in opposition, and have remained revolutionaries in power. To make the individual sacred we must destroy the social order which crucifies him. And this problem can only be solved by blood and iron.
Mao Zedong, who repeatedly spoke of revolution as a long and arduous road, called its ultimate goal the changing of society and the establishment of a new sort of human freedom (he also famously remarked that “a revolution is not a dinner party”).
Of course, in country after country these later revolutions produced even greater chaos and bloodshed than in France. In Russia and China and Southeast Asia, the number of victims stretched into the millions. And finally, after the Russian Civil War, Stalin’s terror, the Gulag, the Chinese Cultural Revolution and the Cambodian holocaust, the myth of a redemptive, world-transforming revolution lost its allure, as one moment of dreadful disillusionment followed another. By the late twentieth century, when the self-proclaimed revolutionary regimes of the Soviet bloc began to crumble, the dissidents who stepped into the breach generally refused the label of “revolution” altogether. As the Polish Solidarity leader Jacek Kuron informed French readers in a remarkable newspaper column in the summer of 1989—as the Poles were ousting the Communists and the French were marking the bicentennial of 1789—the age of revolution was over, and a good thing too. Germans self-consciously refer to the events of 1989–1990 not as a “revolution,” but as die Wende—“the change.”
In some cases, the exhaustion that has followed upon bloody utopian experiments has itself created the conditions under which moderate democratic regimes could eventually take root. In France, for instance, the events of 1789 marked the start of nearly nine decades of astonishing political turmoil. Monarchies, republics and empires succeeded each other so rapidly that, according to one popular joke, libraries began storing copies of the constitution in the “periodicals” section. But finally, after the fall of Napoleon III during the Franco-Prussian War, and one final outburst of radical utopianism in the doomed Paris Commune of 1871, a relatively stable, moderate republic was established, and it lasted until the Nazi occupation of 1940. François Furet, one of twentieth-century France’s great historians, labeled the entire long period from the late eighteenth century to the late nineteenth as “the French Revolution.” In his view, it only came safely “into port” with the Third Republic in the 1870s. But it is hard to argue that the turmoil and bloodshed was necessary to achieve this relatively limited goal. And, of course, in many other countries—Russia and China, most obviously—similarly long periods of revolutionary disruption have so far failed to produce similarly benign outcomes.
THIS LONG PROCESS of disillusionment helps explain why, today, revolutions are expected to be so quick and neat. If revolutionary movements no longer come bearing utopian hopes of redemption, then there is less need for them to extend indefinitely into the future. And indeed, most of the revolutions that have taken place since 1990, such as the “color revolutions” in the former Soviet Union and the revolutions of the Arab Spring, have aimed at relatively modest goals, in comparison with their French or Russian or Chinese predecessors: representative democracy, stability, the rule of law, human rights. The great exceptions to this rule, of course, are the Islamists, who hope to impose their vision of godly order on human societies. The Iranian Revolution was in this sense the last of the great line of utopian revolutions that began in the eighteenth century. Francis Fukuyama has been widely mocked for his 1989 National Interest article “The End of History?” and his prediction that free-market democracy would become universal throughout the world. But with the exception of the Islamic world, free-market democracy has indeed overwhelmingly become the preferred political model in most countries. As Fukuyama himself put it: “At the end of history it is not necessary that all societies become successful liberal societies, merely that they end their ideological pretensions of representing different and higher forms of human society.” At the heart of these earlier ideological pretensions was the idea that the means to these “higher forms” was a French-style revolution.
Of course, even where free-market democracy has become the preferred model, reality has often failed quite dismally to comply. Back at the time when Fukuyama wrote, nearly all observers woefully underestimated the sheer difficulty of instituting such systems in countries plagued by poverty and ethnic and religious differences, and lacking experience in the rule of law or the toleration of opposition. The goal of a revolution may be entirely clear: for example, to transform your country into something resembling Finland. But how can that goal be reached?
This is a question that continues to bedevil political scientists. But the experience of Europe, first after the end of World War II, and then after the collapse of Communism, suggests at least one absolutely crucial condition: a proper structure of incentives for the population in question. After the defeat of 1945, as recent historical work has stressed, the population of West Germany did not magically lose all attraction to Nazism. But the West Germans knew the victorious Allies would not tolerate any serious attempts to revive Hitler’s regime. And at the same time, they quickly learned that moves toward democracy would reap them substantial rewards in the form of Marshall Plan aid and inclusion in the new Western military alliance. Likewise, after 1989 the people of Poland had relatively little to draw on in the way of democratic tradition. But they understood that free-market democracy would bring the massive rewards of closer connections to Western Europe—culminating in European Union membership—and the protection of NATO. In both these countries, the incentives to build free-market democracy proved more than sufficient to overcome the natural tendency of factions within a state to grab what they can for themselves, and to do whatever they can to keep their enemies out of power. In each country, it was generally recognized that there was far more to gain from establishing democratic, free-market institutions than from factional struggles for power and spoils.
In contrast, the populations of countries with recent revolutions have had far weaker incentives to establish these sorts of institutions. Take Georgia and Kyrgyzstan, homes of the “Rose” and “Tulip” Revolutions of 2003 and 2005. Observers like Haring and Cecire have a simple explanation for why these revolutions “failed” (their blunt verdict): “Quite simply, the rule of law never took root.” In fact, they chide the revolutionaries for making what they call “a key mistake: They took the revolutions themselves as the apogee of democracy rather than focusing on the hard, grinding work of institution-building.” But what incentive did the populations of Georgia or Kyrgyzstan have to respect the rule of law and democratic governance? What incentive have the competing groups in Egypt had since 2011? Has the United States been offering massive economic aid in return for progress toward free-market democracy? Has the European Union been offering a quick timetable for membership? The “hard, grinding work of institution-building” depends on a large degree of popular cooperation. But most people in these countries have not seen any great benefit to be obtained from such cooperation, while seeing all too clearly the dangers of allowing opponents to seize power, or of not taking advantage of the chance for their faction to enrich itself while it can.
MANY DIFFERENT factors help populations to play by the rules, and to resist temptations to crush traditional enemies or to treat the state as little more than an instrument of personal enrichment. Ingrained habits of rigid social discipline, found in such widely different societies as colonial New England and twentieth-century Japan, can serve, given the proper conditions, to dampen forms of behavior that damage democratic cooperation. Inspiring, charismatic leaders committed to such cooperation—a Washington or a Mandela—can play a critical part as well. The role of eloquently formulated revolutionary principles in inspiring loyalty to democratic institutions should not be underestimated. But these factors are rarely enough. Incentives matter hugely. Furthermore, providing a clear incentive structure is arguably just about the only possible way to “jump start” democratic revolutions and bring them to a successful, rapid conclusion, especially in countries that have long traditions of division, corruption and intolerance.
In short, it is unreasonable, even rather absurd, to expect revolutions to usher in stable representative democracies that respect human rights virtually overnight. It is condescending and cruel to scold countries for their “failure” to reproduce, within a span of a year or two, what took France, the United States and many other countries decades or even centuries to achieve. We need to recognize that even the establishment of supposedly limited, nonutopian goals may well require a revolutionary process that lasts for many years or decades, and that may involve a good deal of violence, chaos and abuse along the way, including abuse by people we would like to think of as the good guys. In fact, just about the only way to avoid this kind of process (which itself may well eventually fail anyway) is to provide a serious external incentive structure, involving long-term commitments to large-scale aid and protection. Clearly, the West is in no position to start massive new aid programs to democratic revolutionaries across the world. But in that case, we have no cause to tout our own superiority over peoples just starting out on the long and difficult road that took us so very long to travel. Quite the contrary.
David A. Bell is the Sidney and Ruth Lapidus Professor in the Era of North Atlantic Revolutions at Princeton University.