America's Innovation Edge Is at Risk

Only the government has the resources and long-term interest to accept the risks inherent in funding basic science.

It seems a growing number of people refuse to accept living in a post-fact world. Saturday’s March for Science was billed by its organizers as necessary “to defend the vital role science plays in our health, safety, economies, and governments.”

It is no coincidence that the march was held during the opening weeks of Donald Trump’s presidency, and on Earth Day to boot. Many of the nation’s scientists fear a serious threat to both science itself and federal policies rooted in science. Look no further than the hatchet the Trump administration hopes to take to the federal government’s research and development budgets, or to its goal of eviscerating or weakening evidence-based regulations across a number of areas, most prominently in environmental policy.

How has it come to this? The fact that scientists feel a need to organize mass marches is a sign of something gone very wrong. As in so many areas of contemporary American life, science has become a political football. For whatever reason, a sizable portion of our body politic no longer equates science with progress and prosperity. That sentiment has been years in the making.

This situation is dangerous because it undermines America’s preeminent role in the world. Principally, it endangers the most successful tech-driven innovation system in human history. Since the early postwar period, that system has consisted of three main interlocking components: pure lab-based scientific research, practical technological development and commercially viable innovation. Many of the amazing technologies we take for granted, including silicon chips, the computer, the internet, the global positioning system (GPS) and mobile communications, came from this innovation system. Some of these technologies are critical to national security and well-being, whether advanced lasers to shoot down North Korean missiles, biotech breakthroughs to cure cancer or prevent Alzheimer’s, or clean energy to help solve climate change.

After World War II, the U.S. government, together with state and local governments, research universities and the private sector, set into motion America’s modern innovation system. The initial goal was outlined in a 1945 report by Vannevar Bush, one of America’s top engineers and a key administrator behind the Manhattan Project. Titled Science, the Endless Frontier, Bush’s report drew on wartime experience and called for a robust postwar federal presence in scientific research. The primary motive was obvious: maintaining America’s military edge, and hence its geopolitical advantage, over all rivals would require sustained technological superiority as well. Building on this logic, over the following decades the federal government crafted the basic scientific research infrastructure that would enable much of the nation’s technical innovation. The federal government became the nation’s primary investor in basic scientific research, principally by funneling merit-based research grants through universities and a growing network of federal research laboratories.

At different points during the Cold War, the federal government strengthened this presence in American science. In 1957, for example, the Soviet Union’s Sputnik satellite launch frightened the U.S. government and much of America’s citizenry. A few months later, President Eisenhower created a risk-taking, forward-thinking research agency, now known as the Defense Advanced Research Projects Agency (DARPA), and tasked it with ensuring that the United States would remain the world’s greatest technological disruptor. DARPA has been behind some of America’s greatest postwar inventions, including the internet and GPS. It funnels huge amounts of federal dollars through the nation’s universities and federal research labs, much of which is spent on blue-sky ideas that may or may not pay off—which is the point of risk-taking, after all.

Public research funding is essential because only the government has both the resources and the long-term interest to accept the risks inherent in funding basic science, the wellspring of technological breakthroughs. Because basic research may never pay off, the private sector, by contrast, tends to invest its research dollars in technologies that are nearly mature and can be commercialized in the near term. The public and private roles are therefore complementary: the government funds the scientific research that gives rise to technological breakthroughs, while the private sector focuses on developing the most commercially viable technologies.

That brings us to the present day. The Trump administration, through its proposed budget, appears to be unaware of either this history or the consequences of kneecapping American science. Sadly, the president’s budget is but the latest and most visible attempt to erode the research and development (R&D) components of America’s innovation system. Over the past several decades, federal funding for R&D has been declining in relative terms (defined as federal research spending as a share of the nation’s GDP), from around 1.2 percent in 1976 to about 0.8 percent in 2016.

Declining federal research funding is not the end of this story. Over the same period, state governments have been chipping away at public university funding, transferring the cost of higher education from the general taxpayer to students. This counterproductive strategy is a double whammy for America. Because universities conduct the bulk of the nation’s scientific research, squeezing them strangles the basic research upon which technical innovation rests. Saddling students and their families with massive school debt also harms America’s ability to fully educate its workforce and prepare workers for this century’s economy.

If we are to retain our standing in the world, including our economic and military edge over our rivals, then America needs to take a hard look at itself. Earlier this month, the Atlantic Council released a report that tries to do just that. The report analyzes America’s innovation system and covers diverse issues, such as university tech-transfer practices, immigration, capital investment patterns and the nation’s patent law system.

Besides calling for a reinvigoration of federal and state support for scientific research and the institutions that conduct it, the report stresses that America’s tech-innovation system is not inclusive enough. While some people in some places have gotten rich from this system, in recent years a great many more in other places have not. University-based scientists and lab technicians have benefited. So, too, have coders and engineers in places like Silicon Valley and Boston. But your average worker in manufacturing or retailing or coal mining likely has not. Many workers find themselves on the outside looking in, their careers having been disrupted (for the worse) by, say, digital technologies or robotics. Some groups have had a harder time getting into the system (women, minorities and older workers in particular). And America’s highly uneven economic geography means that residents of many towns and cities have little or no chance of working at, say, a tech startup.

Put another way, perhaps the erosion of popular belief in science springs, at least in part, from the fact that too few Americans benefit directly from the country’s tech-driven innovation system, while many have become victims of its relentless disruption of traditional industries.

America is therefore left in a seemingly paradoxical position. On the one hand its tech-driven innovation system is the envy of the world and the basis of the country’s edge over its rivals. On the other hand, that technology has uneven—and sometimes unwelcome—consequences for its citizens.

What, then, to do? Our prescription is to expand that system, not shrink it, so as to sustain America’s technological edge while allowing more citizens in more parts of the country to prosper from it. Among other things, America needs to pay much more attention to both labor and place. The United States needs to treat the constant upskilling of its entire workforce in strategic terms, befitting the fact that human capital will define which countries win and lose in this century’s global economy. Additionally, America needs to take economic geography far more seriously. It should create more dynamic, innovative tech hubs so as to combat the hollowing out of the national economy. Examples are out there: Austin, Texas; Denver-Boulder, Colorado; Madison, Wisconsin; Pittsburgh, Pennsylvania; Chattanooga, Tennessee. The country just needs to find a way to build more of them. That is key to keeping the U.S. innovation edge.

Peter Engelke and Robert A. Manning are senior fellows at the Atlantic Council and its Foresight, Strategy and Risks Initiative.

Image: Clint Hobart evaluates the F6A explosive ordnance disposal ("Mighty Mouse") robot at the Sandia Robotic Vehicle Range. Sandia Labs/Flickr