The Deconstruction of Death
Most public fears about the "genome era" have been overly alarmist. Yet the political consequences may be revolutionary.
Since the eighteenth century, a succession of technological revolutions has transformed the human condition and the course of history. First, the steam engine took center stage. By the end of the nineteenth century, the multifaceted applications of electricity had begun to change the world. During the second half of the last century, computer technology transformed scientific research, economic activity, military forces and nearly every aspect of human affairs. Now the mapping of the genome signals that a new wave of technology-driven change is coming.
The genome project highlights the recent progress in genetics and the other life sciences, which in turn inspires and sustains continuing advances in biotechnology. By promising to satisfy the most elemental human yearnings -- the desire for good health and for the postponement of death -- biotechnology attracts the kind of deep-rooted political support and strong financial backing that few other fields of science enjoy. It can therefore maintain a momentum capable of generating a stream of scientific-technological developments that governments and international organizations will find hard to control. And there is now little doubt about where this is leading: to human intervention in the process of evolution itself.
Some of these developments, it can be safely predicted, will pose new and fundamental challenges to prevailing religious doctrines and teachings. Longevity, combined with good health, is a goal that democratic governments cannot oppose. Who would want to block the path to the possible eradication of hereditary sickle cell anemia, or to medications that promise a cure for Parkinson's and Alzheimer's diseases? But when such universally acceptable goals have been reached, science will not come to a full stop, even if religious organizations, ethics advocates or politicians should want to draw a line beyond which human nature must not be altered.
The good and the bad that the era of the genome promises to bring will often be inseparable. Consider this simple example: experts predict confidently that progress in biotechnology will make it possible, probably well before the end of this century, to extend people's active life span by twenty years or more. This, most people would agree, will be a good thing. But if this prediction comes true, one consequence will be that entrenched dictators will live longer, thus postponing the leadership successions that until now have so often offered the sole means of relief from tyrannical regimes. Stalin, for one, comes to mind as a fellow who would not have volunteered to retire had his doctors been able to keep him active and fit to the age of, say, 120. If he could have benefited from the medical technology that seems likely to be available a few decades hence, he would have ruled his evil empire until just about now, and his unfortunate subjects would have suffered many more campaigns of terror. And if biotechnology could have offered Mao Zedong and Deng Xiaoping the same extended life span, Deng would still be waiting in the wings for an opportunity to implement his reforms.
The Genome and Globalization
Globalization can only hasten the era of the genome. The Internet is facilitating the spread of the latest scientific discoveries in genetics and biotechnology, while the pressures for free trade are breaking down the barriers that Luddite movements erect to keep out products derived from these discoveries. When legislators in one country pass a law to prohibit an application of biotechnology that they judge to be politically incorrect, they will have to contend with the virtual certainty that other countries will happily exploit the new application. For instance, should the U.S. Congress decide to prohibit the implantation in human patients of organs derived from cloned sheep or pigs, Britain and Japan might allow the production and use of these "lifesaving" implants. In the long run, the approach that succeeds in offering people better health and longer lives is bound to prevail.
Democracies, however, are obliged to be tolerant of citizens who fall in love with irrational causes. For a long time, many communities in the United States fiercely rejected the fluoridation of drinking water, an intervention that had been clearly proven to reduce tooth decay without harmful side effects. To prevail over such opposition, those promoting a technological innovation in democracies must offer benefits that a large majority of the people truly covets. To the chagrin of American producers, agricultural uses of genetic engineering have so far not met this test in Europe, and genetically modified foods remain banned there. The well-fed people in these well-to-do democracies have no desire for, and little tolerance of, genetically altered agricultural products because they see no compelling need for "tampering" with nature. Thus, as things stand, the winning lobbies oppose innovation.
This need not be considered as tragic as the advocates of genetically modified foods now claim. In the event of a serious famine resulting from a worldwide food shortage (as opposed to today's famines, which are the consequence of a maldistribution of plentiful food), the weight of political advocacy would instantly shift. (A parallel comes to mind regarding the lobbies that have managed to close down nuclear reactors. In Germany, the Greens recently forced a phase-out of all electricity-producing reactors. So Germany will have to burn environmentally harmful lignite and perhaps import electricity from unsafe nuclear reactors in Eastern Europe. For economists and other rationalizers, it is irritating that, of all people, environmentalists want to abolish clean nuclear energy and prohibit pesticide-free bioengineered foods. For the political philosopher, these quirks are accepted as just one of the smaller costs of democracy.)
The Retirement of Democracies
The most widely and vigorously debated policy issue in America, Europe and Japan is not taxes, defense strategy or foreign policy, but the retirement system. Can and will the government pay for the maintenance of the steadily growing proportion of the population that is of "retirement age", and hence entitled to a lifetime annuity and largely free medical care? And who will be left to care for those with chronic degenerative diseases who cannot care for themselves?
The threshold that establishes retirement entitlements was set long ago, in some European nations more than a hundred years ago. In 1889, for example, the Bismarck administration in Germany enacted old-age pensions -- initially payable at age 70, later lowered to 65 -- hardly a budget-breaking commitment, since at that time life expectancy at age 65 was less than nine further years. In the wealthy countries, people who reach this threshold of "old" age now live on average five to ten years longer than at the beginning of the twentieth century; and, according to quite modest predictions about the coming advances in biotechnology, this span will be extended by another five to ten years by mid-century. Already, millions of people entitled to retirement pensions could continue to work productively for another five to ten years, and many do. Yet powerful interest groups, as well as politicians' anxieties about alienating voters, block an appropriate raising of the retirement age. It will probably take an acute and painful fiscal or monetary crisis to narrow this widening gap between the changing biological facts and the unchanging political rules for "old" age.
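The arithmetic behind this squeeze is simple enough to sketch. The back-of-envelope calculation below uses the essay's own rough figures, taking a midpoint of eight years for each "five to ten year" gain; these are illustrative assumptions, not actuarial data.

```python
# Back-of-envelope sketch of the pension gap described above.
# The spans are the essay's rough figures; a midpoint of eight years
# stands in for each "five to ten year" gain. Not actuarial data.
RETIREMENT_AGE = 65

years_drawn = {
    1900: 9,          # life expectancy at 65: under nine further years
    2000: 9 + 8,      # five to ten years longer than in 1900
    2050: 9 + 8 + 8,  # a further five to ten years by mid-century
}

for year, span in years_drawn.items():
    print(f"{year}: a retiree at {RETIREMENT_AGE} draws benefits for ~{span} years")

# The entitlement threshold never moves, yet the payout period nearly
# triples -- the widening gap between biological facts and political rules.
```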
A darker prospect also needs to be confronted. Although the average life span will increase significantly in well-to-do societies, it cannot be taken for granted that the added years will mostly be healthy ones. Too little is known today about the gains in longevity expected from progress in biotechnology to make useful forecasts about the percentage of older people who might be seriously disabled. If, unhappily, this percentage should markedly increase, the psychological, social and political consequences would be painful. With a growing proportion of a nation's citizens both retired and seriously disabled, the part of the national budget pre-empted by welfare entitlements -- already a serious constraint on budgetary flexibility -- would expand significantly. This, in turn, would leave democratic governments with fewer resources for other goals, such as educating the young, national defense and foreign policy. Additionally, public attention and political energies in the democracies would increasingly be focused on these urgent new needs at home, while totalitarian and authoritarian regimes could still give priority to their militaries and to foreign adventures.
Deconstructing Death
A mistier question relates to the effect on a nation's psyche should biotechnology alter the aging of societies. Since antiquity, the outer limit of human life has changed little. What is still today understood as "old" age has been experienced in a psychological and cultural sense for thousands of years. The Roman consul and orator Cato died at age 86 in 149 BC; Saint Augustine died at age 75, shortly after he finished writing The City of God; Michelangelo was creative until close to his death, a few weeks shy of 89; Voltaire lived to the age of 83 and Benjamin Franklin to 84.
During a full and natural life cycle, our emotional experience -- of the world around us as well as of ourselves -- moves through changing seasons. In the springtime of youth, says the philosopher Michael Oakeshott, "everybody's young days are a dream, a delightful insanity, a sweet solipsism. . . . The world is a mirror in which we seek the reflection of our own desires. The allure of violent emotion is irresistible." In maturity, our emotional experience dwells longer in a single mood, and the light and shadows have harsher edges, as on a dry, hot summer day. In the autumn of our lives, our feelings and sentiments become more subdued, yet are also enriched by joys and sorrows that we recall from the past. And for those who can reach the last season in fair health, the sentiment of yearning -- the mind's strongest grip on life -- becomes becalmed and eventually fades.
If indeed it enables more and more people to live in good health up to age 120 or even beyond, biotechnology will alter this emotive melody of life, and thus transform the temperament of whole societies. When that happens, the nations so affected are likely to change significantly their character and behavior. For one of the lessons history has taught with painful clarity is that emotions matter a great deal in shaping political thought and action.
In the last century, youthful activists provided much of the frenzied energy of the revolutionary and quasi-revolutionary movements, on both the Right and the Left. Benito Mussolini was a master at mobilizing Italian youth and encouraged a sophisticated cult of youth, a cult celebrated by the national anthem of fascist Italy, with its "giovinezza! giovinezza!" as the opening cry. Similarly, today the fighting vanguards of repressive fundamentalist movements are made up of teenagers and activists in their twenties. When the Taliban conquered Afghanistan, it at first had the characteristics of an idealistic "student" movement. As Samuel Huntington has observed, "young people are the protagonists of protest, instability, reform, and revolution."
Happily, in the last two decades of the twentieth century the historic role of youth was on balance beneficial: throughout the Soviet empire, young people, with their enthusiasm for new moral causes and their boldness of action, took the lead in toppling the tyrannies of arteriosclerotic communism. But it has not usually been so, and it is not fanciful to envisage a new cleavage opening up in the world order, aligning on one side the "bioengineered" nations, with their becalmed temperament suited to guarding the status quo, and on the other side youthful movements zealously impatient to smash the inequities of the world.
This is not the only change in politically important emotions that biotechnology is likely to bring. It will also pose a formidable challenge to world religions, by tempting societies to employ new medical techniques for manipulating "natural" death. Biotechnology will make it safer and easier to replace failing vital organs. Eventually, an almost endless series of life-prolonging interventions will become practicable. In contrast to today's respirators, feeding tubes and permanent kidney dialysis, these interventions will be less onerous for patients, and hence likely to find readier and less questioning support among the medical profession and the general public. Wealthy societies -- and their ethical or religious monitors -- will thus be faced with choices for a progressive deconstruction of death.
An unraveling of the natural end point of human life is bound to pose a challenge to the major faiths. For religious creeds, the beginning and end of an individual's earthly existence are the unambiguous boundaries drawn by God to delimit the sanctity of human life. For the Christian religion in particular, such a deconstruction of death might turn out to be more corrosive than Galileo's discoveries or Darwin's explanation of evolution. What science taught the faithful about the structure of the solar system or the evolution of humans from animal primates has since been sloughed off as a peripheral adjustment of doctrine. A science-driven deconstruction of human mortality, however, would loosen, perhaps even sever, the link between one of the most elemental human fears -- the fear of death -- and the edifying sentiments that the major religions have evoked for thousands of years.
Angst of Eugenics
Several nightmarish worries about the era of the genome have been circulating in academic conferences and editorials. Some of these might become serious threats in the more distant future; others are less plausible. No good case has been made, for example, to justify great anxiety about the genetic interventions that rich parents might purchase to "improve" their children. Similarly, the concept of human cloning has led to hand-wringing that exaggerates the societal impact of this rather outlandish idea. Behind many of these concerns lies the angst engendered by the historic misuse of eugenics, an abhorrence that became deeply implanted in modern democratic societies as they began to comprehend fully what Hitler's regime had perpetrated. Curiously, it took Sweden -- that paragon of a benevolent, liberal democracy -- until 1975 to halt its rather crude eugenics project. Only in 1997, after a leading newspaper revealed that thousands of sterilizations of institutionalized patients had been carried out between 1935 and 1975, did the Swedish government address this recent past (with the customary response -- by appointing a commission).
The expected progress of genetics and biotechnology has now revived the angst of eugenics by lifting an idea from science fiction into the realm of the probable. This is the idea of "improving" innate human characteristics by altering the genetic make-up of the immediate offspring or, more ambitiously, of a succession of future generations. Such a project differs radically from the old type of eugenics that was held in high esteem by many British and American scientists and intellectuals a century ago (and remained in favor in Sweden until far more recently). The earlier kind was as crude as it was simple: it halted the propagation of serious hereditary diseases or defects by preventing those deemed afflicted from having offspring.
The new version, from what can be said about it at this time, does not appear to be simple, nor could it easily be proven harmless and effective. To gain a calmer perspective, those who now worry that this new eugenics might transform humanity ought to do a little arithmetic on its demographics. By the time the first few beneficiaries of genetic modifications had survived infancy, the world's population would have grown well beyond the present six billion. This kind of eugenics would thus have altered but a minuscule fraction of all humanity then living -- rather like placing one altered grain of sand on a beach. Moreover, while a genetics student with a jar of fruit flies can observe the generational cascade of mutations within a few days, the generational succession of humans takes far longer, and the demographic multiplication of this "grain of sand" would work quite slowly. And to know that the new type of eugenics would not trigger some catastrophic side effect in old age would require monitoring the initial recipients of the treatment (who supposedly would have acquired some improved traits) for sixty years or longer. By that time, more than ten billion people would be living on the planet. Unlike a jar of fruit flies, therefore, human society need have no fear of being transformed into a modern variant of Aldous Huxley's Brave New World.
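The "grain of sand" arithmetic can be made concrete. In the sketch below, the number of first-generation recipients is a purely hypothetical assumption; the population figures are the ones cited in the paragraph above.

```python
# Illustrative demographics for the "grain of sand" argument above.
# The recipient count is a hypothetical assumption; the population
# figures are the essay's own.
modified_recipients = 10_000              # assumed first-generation cohort
population_today = 6_000_000_000          # present world population (essay)
population_in_60_years = 10_000_000_000   # essay's figure sixty years on

print(f"share of humanity today:   {modified_recipients / population_today:.6%}")
print(f"share sixty years hence:   {modified_recipients / population_in_60_years:.6%}")
# On the order of a ten-thousandth of one percent -- a few altered grains
# of sand on a very long beach.
```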
Puttering About the Brain
Projects to enhance the capacity of the human mind are likely to become even more important for mankind's future than the developments that will enable people to enjoy healthier and longer lives. To enhance a people's mental faculties, societies have traditionally pursued two complementary approaches. One uses tools that aid the human mind from the outside (pictures, written texts, an abacus, or the latest computer); the other approach strengthens the workings of the mind on the inside (by problem solving and memorizing, and with more tangible interventions such as the use of caffeine or medications to reduce learning disabilities). Although this "outside-inside" distinction is rather banal, it will help to introduce a most revolutionary prospect more gently.
Within the lifetime of today's school-age children, advances in biotechnology, neurology, cognitive psychology and other disciplines will increasingly converge to yield a deeper and richer understanding of the relationship between the functioning of the brain as a living organ, and the functioning of the human mind in all its intellectual powers. Some fifty years ago, well before these new possibilities for studying and influencing the brain had been realized, computer scientists became intrigued by the prospect of building computers that could compete with the full intellectual capacity of the human brain. This quest for "artificial intelligence" has not reached its goal. However, computer scientists have since made enormous progress in designing systems that can perform more limited but highly practical "intelligence" tasks, examples of which include recognizing and categorizing patterns, such as fingerprints, handwritten text or human speech; and translating languages. Although computers can do this work much faster and far more reliably than humans, they are inferior in coping with ambiguity and novelty, and in exercising judgment in situations where the advantages and disadvantages cannot be sorted out by pre-programmed rules.
The initial attempts to build "thinking machines" that would rival the human mind used electronic, mechanical and other lifeless components. Critics of computer-based "artificial intelligence" have long stressed that the full intellectual powers of the human mind cannot exist in a machine, but require the context and reinforcement of a living body to endow the mind with its unique powers of reasoning, judgment, intuition and creativity. More recent findings lend support to this view. Emotion, in particular, seems to be one of the essential ingredients that a machine lacks.
Thanks to growing knowledge about the functioning of the brain, it will become possible to integrate the computer-based assistance to human intelligence that works "from the outside" with new ways of enhancing the power of the brain "from the inside", thus lowering the barrier between the two approaches. Already, devices linked to computers have been inserted into the brain on an exploratory basis, and living brain tissue from animals has been incorporated into computer systems. More important in the long term will be the substantial research effort now under way on technologies for treating diseases of the brain, such as gene therapies or the use of stem cells. Intelligence, in its diverse aspects, is surely governed by a multiplicity of genes, but this need not rule out the possibility of strengthening some important aspect of intelligence by targeting a single gene. Particularly significant in this respect would be a major enhancement of human memory. Experiments with mice have already demonstrated genetically induced improvements in memory.
Many research projects are now under way to develop new ways of teaming brain power with computers. These projects tend to be small and draw on different scientific disciplines. They address problems of perception, memory, reasoning, consciousness, emotion and other mental phenomena. Experiments have been conducted with embryonic nerve cells from the spinal cords of mice, kept alive in nutrient solutions, to create a neural web that emits distinct electrical signals in response to contact with different chemicals. By connecting an appropriate computer to this web, a new type of sensor can be constructed to read and categorize these signals. (One early and comparatively modest practical development of such a sensor might be the creation of sniffers capable of outperforming the specially trained dogs that customs officers use to search for narcotics and explosives.)
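To make the sensor idea concrete, here is a minimal sketch of the categorizing step such a connected computer might perform. The electrode readings, chemical labels and nearest-match rule are invented for illustration; they describe no actual experiment.

```python
import numpy as np

# Hypothetical sketch of the neural-web "sniffer" described above: the
# computer compares a new pattern of electrical signals from the cultured
# cells against reference patterns recorded for known chemicals. All
# numbers and labels are invented for illustration.
reference_responses = {
    "explosive vapor": np.array([0.9, 0.1, 0.4, 0.7]),
    "narcotic":        np.array([0.2, 0.8, 0.6, 0.1]),
    "clean air":       np.array([0.1, 0.1, 0.1, 0.1]),
}

def categorize(reading: np.ndarray) -> str:
    """Label a reading with the chemical whose reference pattern is nearest."""
    return min(reference_responses,
               key=lambda chem: np.linalg.norm(reading - reference_responses[chem]))

# A four-electrode reading from the (imaginary) neural web:
print(categorize(np.array([0.85, 0.15, 0.45, 0.65])))  # -> "explosive vapor"
```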
Proceeding along a different line, researchers at NASA's Ames Research Center have successfully tested a "bio-computer" that links a computer to an aircraft pilot by way of sensors, which pick up tiny electrical impulses from the pilot's forearm muscles and nerves. With such a system, the pilot could, without using a keyboard, directly instruct the computers that control the aircraft. The goal of this work, in the words of NASA administrator Daniel Goldin, is to develop "hybrid systems that combine the best features of biological processes" with opto-electronic and other non-organic devices.
Hundreds of such projects are reported in the scientific literature today, in what amounts to only a modest beginning. It seems plausible that over time such linkages of living things with computer systems will enrich traditional computers with some of the unique cognitive capabilities of humans or animals. Precisely because these projects are so innocuous in their stated ambitions, they can proceed and receive financial support without stirring up ethical objections or risking a government-imposed moratorium on further research. Thus, while the general public is being treated to fanciful stories about human cloning and the doubling of the human life span, claims that biotechnology might double or triple the powers of the human mind have been rather restrained -- among the experts in the field as well as among the popularizers.
Mr. Brain Marries Miss Computer
An exception to this restraint is provided by stories about robots with superhuman skills and intelligence. Such ideas are now a mainstay of science fiction and have famous precursors in Mary Shelley's Frankenstein and H.G. Wells' The Island of Doctor Moreau. However, most of these stories have "smart" scientists stupidly creating artifacts in the crude likeness of a human being, monsters with clumsy robot feet, husky voices and a predilection to kill or enslave their creators.
The lone mad scientist and the roaming robot are not apt metaphors for the age of the genome. The human genome project is a large, international undertaking whose latest discoveries are broadcast daily on the Internet. As British Prime Minister Tony Blair takes care to emphasize: "We, all of us, share a duty to ensure that the common property of the human genome is used freely for the good of the whole human race." Likewise, ever more "intelligent" computer technologies are being marketed throughout the world by some of the most globalized businesses. This open and international scientific competition in biotechnology, computer science and other disciplines is bound to lead to a better understanding of the triangular interactions among mind, brain and advanced computers. Many competent research teams will continue to explore the mysterious cohabitation of the brain and the human mind. Here, as elsewhere, competition is the motor of progress. Well before the end of this century, the technologically advanced societies might begin to debate the most tempting, but perhaps most dangerous, ambition in all the history of science: whether to design and build an entity that would enhance the most advanced computers with biological processes and living organisms so as to achieve an intelligence truly superior to that of human beings today. In contrast -- or in addition -- to the "artificial intelligence" of computer systems, such entities would need to be capable of insight, creative discovery, flexible learning and judgments informed by an appreciation of a changing environment and changing values.
Such a "super-brain" would of course include the vast memories and other capabilities of the best computers. But that will not be the end of it. Human thought, to be creative and purposeful, requires willpower and the subtle involvement of emotions. At the center of this dynamic there might even have to be a process that resembles the mysterious innermost sovereign of the human mind -- its self-consciousness. If such a super-brain project proves capable of being started, the designers would undoubtedly try to capture these ultimate attributes of human thinking as well, at least to the extent needed for the super-brain to surpass the full panoply of human intelligence.
As of now, no one can describe the specific theoretical and technical problems that would have to be solved for this project to be started in earnest. It seems certain that it would require a large-scale interdisciplinary effort with generous financing. Today there is no meaningful support for such a venture in any country with scientific and engineering talent sufficient even to contemplate the first steps. This could change, however. An ambitious dictator in control of a country strong in biotechnology and computer science, for example, could gamble on a crash project with the objective of acquiring a decisive advantage. And if testing on live human beings served to expedite the super-brain project, the more unscrupulous the ruler, the greater would be his advantage over more inhibited and law-abiding governments. International treaties would not stop the ruthless, just as treaty obligations freely assumed did not prevent North Korea and Iraq from pursuing nuclear and biological weapons.
Such a turn of events would in all likelihood set in motion a "brain race." If this seems far-fetched, recall that America's costly project for the manned mission to the moon -- something for which there had previously been little enthusiasm -- easily received congressional support once it appeared that the Soviet Union was about to accomplish the feat first. Recall, too, that it was primarily to prevent Nazi Germany from acquiring nuclear weapons first that President Roosevelt authorized the initial steps to start the Manhattan Project late in 1941. The fear of a Nazi A-bomb was sufficient reason for the United States to launch this immense and uncertain venture, even though none of the scientists involved at the outset could have outlined the full research and development program that produced the first atomic bomb four years later.
The important point here is that a stream of research projects -- in themselves innocuous or even beneficial -- may well be moving toward such an outcome. Should the apparition of a super-brain begin to emerge from the fog of speculation -- or perhaps it is more realistic to say when it emerges -- the political debate over whether to proceed will probably become far more convulsive than the debate after 1949 about building the H-bomb, or the contemporary disagreements about some ethically troubling uses of "gene therapies." Those who call for a prohibition of further work on the super-brain will be derided as Luddites trying to halt scientific progress, or as wimps prepared to put their country at a political disadvantage as dangerous as that in a vital arms race. Those in favor of moving ahead will have to admit that, if the project succeeds, human civilization would be transformed more profoundly than by any of the great transitions in mankind's past.
Indeed, as the peoples of the world begin to comprehend the cosmic immensity of the project, they might start to fight with each other to deny the God-like powers of the super-brain to their enemies. In the end -- as in the biblical story of the Tower of Babel -- the whole project might be left in ruins.