Going Critical
Long before the American Empire becomes overstretched abroad, it will implode economically at home.
. . . the interesting subject of the finances of the declining empire.
-Edward Gibbon, Decline and Fall of the Roman Empire, Book I, Ch. XVII
Toppling three tyrannies-that of Slobodan Milosevic, the Taliban and now Saddam Hussein-within four years is no mean achievement by the standards of any past global empire. What makes this achievement so remarkable is that it comes little more than a decade after a wave of anxiety about American "overstretch" and decline. In The Rise and Fall of the Great Powers, Paul Kennedy warned that the United States was running "the risk . . . of what might roughly be called 'imperial overstretch.'" America, he maintained, was spending too much on its overseas military commitments, to the detriment of the U.S. economy. Under such conditions, "the only answer to the question" of whether the United States could remain a superpower was "no."
As John Maynard Keynes once said, when the facts change, one ought to change one's opinion. Writing last September about America's subsequent ascent from superpower to "hyperpower", Kennedy invoked the deus ex machina of the "revolution in military affairs" to explain why his predictions of overstretch had not been fulfilled. All that investment in military research and development of which he had been so disapproving back in the 1980s had paid an unforeseen dividend. Not only did the Soviet Union collapse as it strained to match the Reagan-Weinberger arms extravaganza; the United States also went on to collect a triple peace dividend in the 1990s: falling defense spending as a share of GDP, accelerating economic growth and a quantum leap in military capability that left all other states far behind.
The irony is that Kennedy's original thesis of fiscal overstretch is now about to be vindicated-but not as a result of America's overseas military commitments. Today's overstretch is the product of chronically unbalanced domestic finances: a mismatch between earlier social security legislation, some of it dating back to the New Deal, and the changing demographics of American society. In just five years' time, 77 million "baby boomers" will start collecting Social Security benefits. In eight years, they will start collecting Medicare benefits. By the time they are all retired in 2030, the United States will have doubled its elderly population but increased the number of workers able to pay for their benefits by only 18 percent. Over time, a falling birthrate and lengthening life expectancy are indeed a potent combination.
A Menu of Pain
Economists regard the commitment to pay pension and medical benefits to current and future elderly as part of the government's "implicit" liabilities. But these liabilities are no less real than the obligation to pay back the principal plus the interest on government bonds. Politically speaking, it may be easier to default on explicit debt than to stop paying Social Security and Medicare benefits. While no one can say for sure which liability the government would renege on first, one thing is clear: the implicit liabilities dwarf the explicit ones. Indeed, their size is so large as to render the U.S. government effectively bankrupt.
The scale of this implicit insolvency was laid bare this summer in an explosive paper by Jagadeesh Gokhale, a senior economist at the Federal Reserve Bank of Cleveland, and Kent Smetters, former Deputy Assistant Secretary of Economic Policy at the U.S. Treasury and now an economics professor at the University of Pennsylvania. They asked the following question: Suppose the government could, today, get its hands on all the revenue it can expect to collect in the future, but had to use it, today, to pay off all its future expenditure commitments, including debt service. Would the present value (the discounted value today) of the future revenues cover the present value of the future expenditures? The answer was a decided no: according to their calculations, the shortfall amounts to $45 trillion. To put that figure into perspective, it is twelve times larger than the current official debt and roughly four times the size of the country's annual output.
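The mechanics of such a calculation can be sketched in a few lines. The sketch below is illustrative only: the horizon, discount rate and revenue and expenditure paths are hypothetical stand-ins, not the Gokhale-Smetters assumptions.

```python
# Illustrative sketch of a fiscal-gap calculation in the spirit of
# Gokhale and Smetters: the gap is the present value of projected
# expenditures minus the present value of projected revenues.
# All numbers are hypothetical, chosen only to show the mechanics.

def present_value(stream, discount_rate):
    """Discount a stream of annual amounts (year 1, year 2, ...) to today."""
    return sum(x / (1 + discount_rate) ** t for t, x in enumerate(stream, start=1))

years = 75             # a typical long-run budget horizon (assumed)
discount_rate = 0.036  # real discount rate (assumed)

# Hypothetical paths, in $ trillions: revenues grow 2% a year,
# expenditures start higher and grow 3% a year.
revenues = [2.0 * 1.02 ** t for t in range(years)]
expenditures = [2.2 * 1.03 ** t for t in range(years)]

fiscal_gap = present_value(expenditures, discount_rate) - present_value(revenues, discount_rate)
print(f"Fiscal gap (present value, $ trillions): {fiscal_gap:.1f}")
```

The point of the exercise is that the gap is a single present-value number: once revenues persistently grow more slowly than expenditures, the shortfall compounds into a figure many times annual output.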
Gokhale and Smetters also asked how much taxes would have to be raised, or expenditures cut, on an immediate and permanent basis to generate $45 trillion in present value. Their answer takes the form of a "menu of pain" with four unpalatable dishes to choose from. We could, starting today, raise income taxes (individual and corporate) by 69 percent; or raise payroll taxes by 95 percent; or cut Social Security and Medicare benefits by 56 percent; or cut federal discretionary spending by more than 100 percent (which, of course, is impossible).
Another way of expressing the problem is to compare our own lifetime tax burden with that of the next generation if the government does not adopt one of the above policies. Hence the term often used to describe calculations like these: generational accounting. What such calculations imply is that anyone who has the bad luck to be born in America today, as opposed to back in the 1940s or 1950s, is going to be saddled throughout his working life with very high tax rates-potentially twice as high as those his parents or grandparents faced. Notwithstanding the Bush Administration's tax cuts, Americans are hardly under-taxed. So the idea of taxing our children at twice the current rates seems ludicrous.
It is not as if people are completely oblivious to the problem. It is common knowledge that we are living longer and that paying for the rising proportion of elderly people in the population is going to be expensive. What people do not yet realize, however, is just how expensive.
One common response is to say that the economists in question have a political axe to grind and have therefore made assumptions calculated to paint the blackest picture possible. But the reality is that the Gokhale-Smetters study was commissioned by then-Treasury Secretary Paul O'Neill and was meticulously prepared while Smetters was at the Treasury and Gokhale was on loan to the Treasury from the Federal Reserve. And, far from being a worst-case scenario, the Gokhale and Smetters figures are based on what are arguably optimistic official assumptions about future growth in Medicare costs, as well as about future increases in longevity.
Perhaps predictably, the Treasury now denies that it had anything to do with the Gokhale and Smetters study. It would rather we read the supposedly independent Congressional Budget Office's (CBO) ten-year budget forecasts, which are frequently cited in the press and are one of the principal reasons for the prevailing mood of complacency about fiscal policy. The credibility of the CBO's forecasts is a perfect illustration of the phenomenon known to students of drama as the suspension of disbelief. This also operates in the financial world. How does the CBO get us to suspend disbelief? The same way a good movie director does it-with good special effects.
During the Clinton Administration, the CBO routinely projected that, regardless of inflation or economic growth, the federal government would spend precisely the same number of dollars, year in and year out, on everything apart from Social Security, Medicare and other entitlements. At the same time, the CBO confidently assumed federal taxes would grow at roughly 6 percent each year. As a result, it was able to make dizzying forecasts of budget surpluses stretching as far as the CBO could see. (These phantom surpluses were the money Al Gore promised to spend on voters and George W. Bush promised to return to them during the 2000 election.)
With the election over, the CBO decided that not adjusting projected discretionary spending for inflation was no longer "useful or viable." Making this correction reduced the CBO's projected 2002-11 surplus from $6.8 trillion to $5.6 trillion. But that was nothing compared to the impact of subsequent unforeseen events. Two years later, after a recession, a huge tax cut, the September 11 attacks and the Iraq war, the projected ten-year surplus had fallen to $20 billion. Nevertheless, the CBO was still able to predict a medium-term decline in the federal debt in public hands from 35.5 percent of GDP to 16.8 percent ten years hence. To generate this result, the CBO conveniently assumed that discretionary spending would remain fixed over the next decade even as the economy grows. In fact, these purchases, which include the additional military and security expenditures prompted by September 11, have grown more than twice as fast as the economy over the last three years. As we write, the CBO has done some more correcting. It now predicts a deficit for the coming fiscal year of close to half a trillion dollars; and for the ten-year period 2002-11 it no longer predicts a surplus but a $2.3 trillion deficit-a deterioration of $9.1 trillion relative to what the CBO was predicting before the last election.
Unfortunately, even the CBO's latest projections still grossly understate the true size of the government's liabilities because its "bottom line" is only that part of the federal government's liabilities that takes the form of bonds. Publicly issued and traded debt, however, is simply dwarfed by the gargantuan off-balance-sheet liabilities of the Social Security and Medicare systems.
Conventional wisdom predicts that if investors and traders in government bonds anticipate a growing imbalance in a government's fiscal policy, they will sell that government's bonds. There are good reasons for this. A widening gap between current revenues and expenditures is usually filled in two ways: first, by selling more bonds to the public and, second, by printing money. Either response leads to a decline in bond prices and a rise in interest rates: the incentive people need to purchase bonds. That incentive has to be larger when the real return of principal plus interest on the bond is threatened by default or inflation.
Figures like those produced by Gokhale and Smetters might have been expected to precipitate a sharp drop in bond prices. But at the time their study appeared, the markets barely reacted. Yields on ten-year Treasuries have in fact been heading downward for more than twenty years. At their peak in 1981 they rose above 15 percent. As recently as November 1994, they were above 8 percent. By mid-June 2003-two weeks after the $45 trillion figure had hit the front page of the Financial Times-they stood at 3.1 percent, the lowest they have been since 1958.
Today, however, there are clear signs of a slight upward shift in investors' inflationary expectations. The yield on the ten-year Treasuries has jumped to 4.4 percent in response to the government's admission that its actual deficit for 2003 would be $475 billion-a rather different figure from the surplus of $334 billion that was forecast back in April 2001. The yield curve, which had become more or less flat by the late 1990s, is now sloping more steeply upwards. At the end of 2000, the spread between ninety-day and thirty-year interest rates was slightly negative (minus 42 basis points). By August, it stood at over 400 basis points. Finally, the spread between yields on ten-year bonds and index-linked bonds with the same maturity has widened slightly, from around 140 basis points in October last year to over 230 basis points in late August.
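The arithmetic behind that last spread is worth spelling out: since a conventional bond's yield is roughly the index-linked bond's real yield plus expected inflation, the nominal-minus-indexed spread is itself the market's implied inflation forecast. A minimal sketch, using the figures just cited:

```python
# Reading inflation expectations off the bond market: a conventional
# bond's yield is roughly the index-linked (real) yield plus expected
# inflation, so the nominal-minus-indexed spread is the market's
# implied inflation forecast. Spreads below are the ones cited above.

BP = 0.01  # one basis point = 0.01 percentage points

spread_october = 140 * BP  # percentage points, October of last year
spread_august = 230 * BP   # percentage points, late August

print(f"Implied inflation, October: {spread_october:.2f}%")
print(f"Implied inflation, August:  {spread_august:.2f}%")
print(f"Shift in expectations: +{spread_august - spread_october:.2f} points")
```

In other words, over roughly ten months the bond market's inflation forecast moved up by nearly a full percentage point-noticeable, but still modest.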
Yet this still seems a relatively modest reaction given the size of the fiscal crisis facing the United States. There are two possible explanations for the relative insouciance of the bond market. One is that investors and traders know of a painless answer to the federal government's coming fiscal crisis, which they are somehow managing to keep secret from the economics profession. The other is simply that they are in denial. Or maybe, to be fair, nobody can quite work out what it implies. We are, after all, in uncharted waters. Previous fiscal crises were not like this because most governments' liabilities took the form of official bonds, not statutory pledges to pay various index-linked benefits to citizens. Bond traders are accustomed to a world in which governments in fiscal difficulties either default or allow inflation to erode the real value of their debts. They look at the United States and find it hard to imagine either scenario.
For reasons quite unrelated to federal fiscal policy, there are strong deflationary pressures operating at home and abroad. Overcapacity generated during the 1990s boom, investor pessimism in the wake of the bust, consumer anxiety about job losses-all these factors mean that virtually the only sector of the U.S. economy still buoyant is housing, for the simple reason that mortgage rates are the lowest in two generations. At this writing, the U.S. unemployment rate has just reached a nine-year high. Meanwhile, the unleashing of China's productive energies is filling the global economy with amazingly cheap consumer goods.
Earlier this summer, one of the lead stories on the Bloomberg website described deflation as the "great bugaboo menacing the markets and the economy in the early 2000s." And on May 22, in testimony before the Joint Economic Committee of Congress, Federal Reserve Chairman Alan Greenspan acknowledged that there was a "possibility" of deflation.
There is, however, another way of looking at the bond traders' mindset. Compare their predicament with that of their colleagues (now in many cases former colleagues) trading equities just five years ago. At that time, it was privately acknowledged by nearly everyone on Wall Street and publicly acknowledged by most economists that American stocks, especially those in the technology sector, were wildly overvalued. In 1996, Alan Greenspan famously declared that the stock market was suffering from "irrational exuberance." Over the next three years, a succession of economists sought to explain why the future profits of American companies could not possibly be high enough to justify their giddy stock market valuations. Still, the markets rose, and it was not until January 2000 that the bubble burst.
It is now clear that something similar was going on in the bond market this year. Just as investors and traders knew that most Internet companies could never earn enough to justify their 1999 valuations, investors and traders knew that future government revenues could not remotely cover both the interest on the federal debt and the transfers due on the government's implicit liabilities. But just as participants in the stock market were the mental prisoners of a five-year bull market, so participants in the bond market were the mental prisoners of a twenty-year bull market that had seen the price of long-term Treasuries rise by a factor of two-and-a-half. In both cases, everyone knew there was going to be a "correction", but nobody wanted to be the first player out of the market, who might then have to sit and watch the bull-run continue for another year.
Between January 2000 and October 2002, the Dow Jones Industrials index declined by almost exactly 38 percent as irrational exuberance gave way to more rational gloom. By April of this year, it was becoming easy to imagine a similar correction to the bond market. Now it seems to have happened. Or has it? Most analysts attribute the recent rise in yield to growing optimism about economic growth and the stock market rather than to fiscal pessimism. This implies that the full magnitude of the government's fiscal plight has not yet sunk in. Much therefore depends on what bond traders and investors expect the government to do about its $45 trillion black hole. When rational gloom sets in, the U.S. economy will likely "go critical."
The Inflation Scenario
The printing press is the time-honored last resort of governments that cannot pay their bills out of current tax revenue or new bond sales. It leads, of course, to inflation and, potentially, hyperinflation. The higher the anticipated rate of inflation, the higher interest rates will rise, because nobody wants to lend money and be paid back in undervalued banknotes. The process whereby current fiscal policy influences expectations about future inflation is a dynamic one with powerful feedback effects. If participants in financial markets decide a country is broke and is going to inflate, they act in ways that actually catalyze such an outcome. By pushing up interest rates, they raise the cost of financing the government's debt and hence worsen its fiscal position. Higher interest rates may also depress business activity. Firms stop borrowing and start laying off workers. The attendant recession lowers tax receipts and drives the government into a deeper fiscal hole. In desperation, the government starts printing money and lending it, via the banking system, to the private sector. The additional money leads to inflation, and the higher inflation rates assumed by the market turn into a self-fulfilling prophecy. Thus, the private sector and the government find themselves in a game of chicken: if the government can convince the private sector it can pay its bills without printing money, interest rates stay down; if it cannot, interest rates go up, and the government may be forced to print money sooner rather than later.
This suggests one possible scenario. Bondholders will start to sell off as soon as a critical mass of them recognizes that the government's implicit and explicit liabilities are too much for it to handle with conventional fiscal policy, and concludes that the only way the government will be able to pay its bills is by printing money. What commonly triggers such shifts in expectations is an item of financial news. In Germany in May 1921-to give an extreme example-it was the announcement of a staggering postwar reparations burden of 132 billion gold marks that convinced investors the government's fiscal position was incompatible with currency stability. The assassination of the liberal foreign minister, Walther Rathenau, in July of the following year delivered the coup de grâce, sending both interest rates and exchange rates sky-rocketing.
America today is certainly a long way from being the Weimar Republic. But an item of fiscal news could nevertheless conceivably cause a major shift in inflationary expectations and hence in long-term interest rates. The first pinprick in the bond market bubble came in July from the publication of a government deficit number significantly higher than had been forecast by the CBO. Another hole might be made by Alan Greenspan's retirement at some point in the next two years, though judging by the muted reaction to the 77-year-old's warning earlier this year about the Bush Administration's "lack of fiscal discipline", his power to move the markets is not what it was. And there is always the possibility of another major terrorist attack or a serious deterioration of the situation in postwar Iraq.
The panic may not begin among American investors, however. According to data published in September 2002, foreign investors currently own close to two-fifths of the federal debt held in private hands. The much-vaunted "hyperpower" would quickly find itself humbled if foreigners were to express their anti-Americanism by dumping U.S. Treasuries. Conventional wisdom has it that there is "nowhere else to go" for international investors seeking low-risk securities in the world's reserve currency. However, this overlooks the growing importance of euro-denominated securities in the wake of European Monetary Union (EMU). The volume of euro-denominated government bonds was very large even before the single currency was introduced: the outstanding volume of Eurozone government bonds was roughly half the outstanding volume of U.S. government bonds in 1998. But, as the rapid convergence of Eurozone bond yields clearly shows, monetary union has greatly reduced pre-1999 country risk, so that (in effect) all Eurozone countries' bonds are regarded as being almost as good as the old German bonds. As a result, the European bond market has been significantly boosted by EMU: according to the Bank for International Settlements, about 44 percent of net international bond issuance has been denominated in euros since the first quarter of 1999, compared with 48 percent in dollars. For the equivalent period before the introduction of the euro, the respective shares were 29 percent and 53 percent.
EMU may not have boosted economic growth in the Eurozone, but it has certainly enhanced fiscal and monetary credibility for the member-states. For all its crudeness, the Stability and Growth Pact imposes tight national constraints on the fiscal policies of Eurozone members, though it remains to be seen whether the rule restricting deficits to 3 percent of GDP will be enforced this year. Moreover, unlike the United States, the Eurozone runs a balance of payments surplus. The possibility that investors may come to regard the euro as being as good as the dollar when it comes to denominating low-risk securities cannot be excluded. Indeed, it may already be happening. Since February last year, the dollar has declined against the euro by 27 percent.
A plausible sequence of events might therefore run like this. Long-term interest rates edge up further as investors in Europe and Asia start dumping long-term Treasuries on the market. The IMF formally criticizes U.S. fiscal imbalances (something the IMF's chief economist, Kenneth Rogoff, has already done informally). Long-term interest rates rise further. Inflation picks up as the weaker dollar pushes up import prices. Long-term interest rates move into double digits. The Fed starts printing money to lower rates, but this raises long-term rates even further. The economy moves into recession. Deficits now exceed 5 percent of GDP. Inflation hits double digits. Ultimately, the government is forced to raise taxes, depressing the economy further.
This scenario has at least superficial plausibility because it echoes past events. Although few bond traders have history degrees, they recollect that the high bond yields of the early 1980s were in large measure a consequence of the inflationary fiscal and monetary policies of the previous decade. Nor do the 1970s furnish the only historical precedent for inflationary outcomes of fiscal crises. As is well known, printing money helps a government in fiscal difficulties in three ways. First, the government gets to exchange intrinsically worthless pieces of paper for real goods and services. Second, inflation waters down the real value of official debt. (At the end of World War I, all the major European combatants had accumulated public debts of around two years' national income or more. But, by 1923, the Germans had rid themselves of nearly all their debt by printing so much money that the real value of government bonds fell close to zero.) Third, if the salaries of government workers are paid with a lag or only partially adjusted for inflation, inflation will lower their real incomes. The same holds true for welfare, Social Security and other government transfer payments, provided they are not index-linked. In January 1992, for example, Russian inflation hit its post-Communist peak of 296 percent a month, but increases in government transfer payments (especially pensions and some salaries) lagged far behind.
But a 1970s-style inflation is not the only way America's coming fiscal crisis can unfold, for three reasons. First, only a modest proportion of the federal government's $45 trillion budget gap would actually be reduced through a jump in inflation of the sort described above. Much of the government's tradable debt is of short maturity-indeed, fully a third of it has a maturity of one year or less. This makes it much harder to inflate away debt, since any increase in inflationary expectations will force the government to pay much higher interest rates when it rolls over these short-dated bonds. Second, Social Security benefits are protected against inflation by an annual cost-of-living adjustment. Third, Medicare benefits are effectively inflation-proof as well, because the government unquestioningly pays whatever bills it receives.
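The maturity point can be made concrete with a toy example. Suppose markets had priced in 2 percent inflation but 8 percent materialized; the real loss inflicted on bondholders (and hence the gain to the indebted government) depends sharply on how long the debt has left to run. The inflation rates here are hypothetical:

```python
# How much an inflation surprise erodes bondholders' real claims
# depends on maturity. Hypothetical rates: markets priced in 2%
# inflation, but 8% materializes. Short-dated debt barely suffers,
# because it is repaid (and must be refinanced at the new, higher
# rates) almost immediately; long-dated debt is eroded for decades.

face = 100.0
expected, actual = 0.02, 0.08  # assumed inflation rates

for years in (1, 10, 30):
    expected_real = face / (1 + expected) ** years  # payoff in today's dollars, as priced
    actual_real = face / (1 + actual) ** years      # payoff after the surprise
    loss = 1 - actual_real / expected_real
    print(f"{years:>2}-year debt: unexpected real loss {loss:.0%}")
```

A government whose debt is mostly one-year paper captures only the first, trivial slice of this erosion before it must refinance at rates that already incorporate the new inflation.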
So, if a rerun of the 1970s would not solve the federal government's fiscal problems, but only compound them, then what are the alternatives? The Bush Administration's approach to the impending federal fiscal crisis appears, surprisingly, to be a variation on Lenin's old slogan: "The worse the better." Faced with the perfect fiscal storm, the President and his men appear to have decided to punch a hole in the boat by pushing through not one but three major tax cuts. Administration spokesmen have often defended such measures as designed to stimulate economic activity, a version of the "voodoo economics" once upon a time derided by the President's father. Sadly, in the real world, cutting taxes raises consumption, which lowers saving and investment, thereby reducing the amount of equipment and other capital per worker. This, in turn, lowers workers' wages and tax payments. This reduction in the tax base reinforces the direct loss in revenues associated with cutting tax rates.
Some proponents of a tax cut as a stimulus argue that reducing certain taxes, like dividend taxes, gives people a greater incentive to save. This is not, however, how people behave, nor how economic theory predicts they should behave. Yes, lower taxes on dividends are an incentive to consume less today in order to consume more tomorrow. But they also provide an incentive to consume more, because tax cuts have income effects as well as substitution (incentive) effects. Even if they did not, the expansion of the tax base from cutting taxes would need to be very large to offset the direct loss of revenues associated with lowering tax rates.
One viable fiscal solution to generational imbalance has already been implemented in Britain: simply break the link between the state pension and wages. In 1979, the newly elected government of Margaret Thatcher discreetly reformed the long-established basic state pension, which had been increased each year in line with the higher of two indices: the retail price index or the average earnings index. In her first budget, Thatcher amended the rule so that the basic pension would rise in line with the retail price index only, breaking the link with average earnings. The short-run fiscal saving was substantial, since earnings growth far outpaced inflation after 1980 (around 180 percent to 1995, compared with consumer price inflation of 120 percent). The long-run saving was greater still: the United Kingdom's unfunded public pension liability today is a great deal smaller than that of most continental governments-as little as 5 percent of GDP for the period to 2050, compared with 70 percent for Italy, 105 percent for France and 110 percent for Germany. This and other Thatcher reforms are the reason the United Kingdom is one of the few developed economies not currently facing a major hole in its generational accounts. (Interestingly, the others are nearly all ex-British colonies: Australia, Canada, Ireland and New Zealand. According to international comparisons done in 1998, each of these countries could have achieved generational balance with tax increases of less than 5 percent.)
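The power of that seemingly technical change comes from compounding. Using the figures just cited (earnings up roughly 180 percent from 1980 to 1995, prices up roughly 120 percent), a short sketch shows the gap that opens between a price-indexed and an earnings-indexed pension; the starting level of 100 is arbitrary:

```python
# Compounding is what made the 1979 indexation switch so valuable.
# Figures from the text: earnings rose ~180% from 1980 to 1995,
# consumer prices ~120%. The starting pension of 100 is arbitrary.

start = 100.0
earnings_growth = 1.80  # cumulative earnings growth, 1980-95
price_growth = 1.20     # cumulative price inflation, 1980-95

earnings_indexed = start * (1 + earnings_growth)  # pension under the old rule
price_indexed = start * (1 + price_growth)        # pension under the Thatcher rule

saving = 1 - price_indexed / earnings_indexed
print(f"Earnings-indexed pension by 1995: {earnings_indexed:.0f}")
print(f"Price-indexed pension by 1995:    {price_indexed:.0f}")
print(f"Saving per pensioner: {saving:.0%}")
```

By 1995, each pension was already more than a fifth smaller than it would otherwise have been-and the gap, like the liability it offsets, keeps widening for as long as earnings outgrow prices.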
Could it happen in the United States? In view of the growing political organization and self-consciousness of elderly Americans, it seems unlikely. If you spend some time in Florida, you are bound to see scores of bumper stickers that read: "I'm Spending My Kids' Inheritance." Fifty years ago, such sentiments were seldom uttered, but attitudes and behavior have changed. Economic research shows conclusively that the elderly as a group are indeed consuming with next to no regard for their adult children. The federal government has spent half a century taking ever larger sums from workers and handing them to retirees in the form of Social Security, Medicare and Medicaid benefits. The result has been a doubling of consumption per retiree relative to consumption per worker. Thus, the absence of voluntary transfers of wealth between the old and the young helps explain why Social Security is sometimes referred to as the "third rail": any politician who suggests a cut in benefits will receive a violent political shock from the American Association of Retired Persons (AARP) and related interest groups.
The Return to Fiscal Sanity
So are there any policies an American president can adopt without risking electoral oblivion? The first goal must be to discipline Medicare spending, which accounts for the lion's share (82 percent) of the $45 trillion budget black hole. Since 1970, the rate of growth of real Medicare benefits per beneficiary has exceeded that of labor productivity by 2.4 percentage points. The $45 trillion figure assumes, optimistically, that in the future the growth rate of Medicare benefits per beneficiary will exceed productivity growth by only 1 percentage point. Cutting that growth rate by just half a percentage point per year would shave $15 trillion off the $45 trillion long-term budget gap. There must be a way to cap the program's growth without jeopardizing the program's ability to deliver critically important health insurance protection to the elderly.
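Why half a percentage point matters so much can be illustrated with a toy present-value calculation. The inputs below (initial benefit level, discount rate, horizon) are hypothetical and are not calibrated to reproduce the $15 trillion figure; only the direction and rough scale of the sensitivity track the argument:

```python
# Toy present-value model of Medicare's sensitivity to its growth rate.
# Hypothetical inputs throughout; the point is how strongly a
# half-point change in benefit growth compounds over a long horizon.

def pv_growing_stream(initial, growth, discount, years):
    """Present value of a stream starting at `initial` and growing at `growth`."""
    return sum(initial * (1 + growth) ** t / (1 + discount) ** t
               for t in range(1, years + 1))

initial_benefits = 0.25  # $ trillions per year (hypothetical)
discount = 0.036         # real discount rate (assumed)
years = 75               # long-run budget horizon (assumed)

pv_fast = pv_growing_stream(initial_benefits, 0.031, discount, years)  # faster benefit growth
pv_slow = pv_growing_stream(initial_benefits, 0.026, discount, years)  # half a point slower

print(f"PV at faster growth: ${pv_fast:.1f} trillion")
print(f"PV at slower growth: ${pv_slow:.1f} trillion")
print(f"Saving from the half-point cut: {1 - pv_slow / pv_fast:.0%}")
```

Because the growth rate operates on every future year at once, even a small trim to it removes a double-digit share of the liability's present value.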
Unfortunately, the President's new policy-which effectively bribes the elderly with a drug benefit to join HMOs-has three flaws. First, the benefit he proposes is fabulously expensive: between $400 billion and $1 trillion over ten years. Second, his scheme retains the traditional and very expensive fee-for-service Medicare system and permits the elderly to switch back to it whenever they like. Unfortunately, they are likely to switch back just when they are becoming expensive to treat. Finally, the HMOs are free to shut down and ship their customers back to the traditional plan whenever their clients become too expensive.
The key, then, to meaningful Medicare reform is to eliminate entirely the traditional fee-for-service option and give all Medicare participants a voucher to purchase private health insurance. But would this not leave them at the mercy of the market, which favors insuring only the healthiest among them? The answer is no, provided the vouchers handed to the elderly are weighted according to their health status. Thus an 80-year-old with pancreatic cancer might get a $100,000 voucher, while an 80-year-old who is in perfect shape might get only a $5,000 voucher. The vouchers would be determined each year in light of the participant's health status at the end of that year. Having set a rigid cap on total Medicare expenditures, the government can readily determine the amount of each voucher. (The major objection to this proposal is the loss of each participant's privacy, since he will have to reveal his medical history to a government-appointed doctor. But this seems a small price to pay to regain some measure of fiscal sanity.)
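The voucher mechanics described above are easy to sketch. Given a hard budget cap and actuarial risk weights reflecting each participant's health status, every voucher follows mechanically; the pool, names, weights and cap below are invented for illustration:

```python
# Sketch of risk-adjusted vouchers under a hard Medicare budget cap.
# Each participant's voucher is a share of the capped total, set in
# proportion to an actuarial risk weight reflecting health status.
# The pool, names, weights and cap below are all hypothetical.

total_cap = 300_000.0  # this toy pool's total annual budget

risk_weights = {
    "healthy_80_year_old": 1.0,
    "diabetic_75_year_old": 4.0,
    "cancer_patient_80_year_old": 20.0,
}

total_weight = sum(risk_weights.values())
vouchers = {who: total_cap * w / total_weight for who, w in risk_weights.items()}

for who, amount in vouchers.items():
    print(f"{who}: ${amount:,.0f}")
```

Note that the vouchers always sum exactly to the cap: the government fixes total spending first, and the weights merely divide it, which is what makes the overall budget controllable.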
The second key policy is to privatize Social Security, but in such a way that the current elderly help rather than hinder reform. One way to do this would be to close down the old system at the margin and enact a federal retail sales tax to pay off, through time, its accrued liabilities. What workers would otherwise have paid in payroll taxes would now be invested in special private retirement accounts, to be split fifty-fifty between spouses. The government would make matching contributions for poor workers. And it would contribute fully on behalf of the disabled and the unemployed. Finally, all account balances would be invested in a global, market-weighted index of stocks, bonds and real estate.
Will either of these policies be implemented? We are not optimistic, since each would entail sacrifices by retired Americans, as the AARP would no doubt hasten to point out. Social Security reform appears likely to remain a taboo subject on the presidential campaign trail. And with the enactment of the drug benefit, Medicare has supposedly been dealt with.
There is, however, one other, more drastic possibility. It is usually assumed that outright default on the government's implicit liabilities is unlikely. Is it? Suppose a major change in expectations about America's fiscal future is looming on the horizon. If the bond market does "go critical"--if, in other words, investors suddenly start to fear an inflationary outcome of the federal fiscal crisis--then a president like this one, who is as attracted to reductions in Social Security as he is to reductions in taxation, might seize the moment of national emergency. And it would indeed be a national emergency. A government facing a steep increase in its borrowing costs would confront a large and powerful social group determined to defend their entitlements.
Such a scenario has one obvious historical precedent. In ancien régime France, the biggest burden on royal finances did not take the form of bonds but of salaries due to tens of thousands of officeholders, men who had simply bought a government sinecure and expected in return to be paid a salary for life. All attempts to reduce these implicit liabilities within the existing political system simply failed. It was only after the outbreak of the Revolution, arguably a direct consequence of the monarchy's fiscal crisis, that the offices were abolished. The officeholders were compensated by cash payments in a new currency, the assignats, which were rendered worthless within a few years by the revolutionary printing presses. This parallel has two implications: first, there can be big political consequences when fiscal systems go critical; and second, vested interests that resist necessary fiscal reforms can end up losing much more heavily from a revolutionary solution.
PERHAPS, then, Paul Kennedy was not so wrong after all to draw parallels between modern America and pre-revolutionary France. Bourbon France, like America today, had pretensions to imperial grandeur but was ultimately wrecked by a curious kind of overstretch. It was not overseas adventures that did in the Bourbons. Indeed, Louis XVI's last foreign war, in support of the rebellious American colonists, was a huge strategic success. Rather, the overstretch was internal, and at its very heart was a black hole of implicit liabilities.
In the same way, the decline and fall of America's undeclared empire will be due not to terrorists at our gates nor to the rogue regimes that sponsor them, but to a fiscal crisis of the welfare state. The government finds itself between the falling rock of market sentiment and the hard place of vested interests. Political expediency rules out fiscal reform; but if the bond markets foresee a spiral of deficit finance, sooner or later they will mark down the price of U.S. Treasuries even further. And rising yields will only increase the cost of rolling over the government's explicit debt.
This fiscal crisis is not, of course, a problem unique to America. It afflicts the world's second and third largest economies even more seriously, since neither Japan nor Germany can compensate for the senescence of their populations with American-style immigration. But neither Japan nor Germany has pretensions to be a global hegemon or hyperpower. Their decline into economic old age has minimal strategic implications. That is not true in the American case.
As we write, the crisis of the American welfare state remains a latent one. Few people, least of all in the government, wish to believe it is real. But the crisis could manifest itself with dramatic suddenness if there is a significant shift in the expectations of financial markets at home or abroad. And when the finances of the United States "go critical", there will inevitably be moves to cut back any federal program that lacks strong popular support. Though relatively inexpensive, and not in themselves a cause of American overstretch, "nation-building" projects in far-away countries will surely be among the first things to be axed.
After all, what could be more "discretionary" than the cost of running Kosovo, Afghanistan or Iraq? We are already seeing in Afghanistan, and will soon see in Iraq, how little America is actually willing to spend on postwar reconstruction. By May 2003, the United States had disbursed a paltry $5 million to the main Afghan Interim Administration Fund, a tiny fraction of the $20 billion the Afghan government says it needs to achieve economic and political stabilization. This is especially astounding given the indisputable fact that it was in the anarchy of post-Cold War Afghanistan that Al-Qaeda took root and flourished.
In short, the colossus that currently bestrides the world has feet of clay. The latent fiscal crisis of the American welfare state implies, at best, an empire run on a shoestring, at worst a retreat from nation-building as swift as the original advance towards it. As Edward Gibbon once wrote, "the finances of the declining empire" do indeed make an interesting subject.
Niall Ferguson is Herzog Professor of Financial History in the Stern School of Business at New York University, and senior research fellow at Jesus College, Oxford University. Laurence J. Kotlikoff is professor of economics at Boston University.