Book Notes: Recent Works on the Promise and Peril of Genetic Engineering (3 of 4)

With this April issue of the Journal of Lutheran Ethics, we bring the third installment in our series of notes on books addressing genetic engineering. For interested readers, the previous two columns appeared in the September and December, 2003 issues.

Works reviewed in this month’s column:

Gordon Graham, Genes

Editors of Scientific American, Understanding Nanotechnology

Evelyn Fox Keller, The Century of the Gene

Susan Brooks Thistlethwaite, editor, Adam, Eve, and the Genome

GENES: A Philosophical Inquiry
by Gordon Graham (New York: Routledge, 2002), 196pp.

[1] Gordon Graham, Regius Professor of Moral Philosophy at the University of Aberdeen, has written a wide-ranging book that addresses aspects of science and religion, the social role of technology, and the ethics of genetic engineering. He is not a Christian but wants to take seriously those concerns that often arise from a religious consciousness. A particular concern that runs throughout his discussion is the tendency of science and technology to overreach. He sees this most notably today in genetics, where broad claims are being made to explain everything from eye color to disease to social behavior (as witness the emergence of sociobiology and evolutionary psychology). Prominent representatives of this development whom he cites are E.O. Wilson (Sociobiology), Steven Pinker (How the Mind Works), and above all Richard Dawkins (The Selfish Gene, The Blind Watchmaker, River Out of Eden).

[2] Several other works we’ve discussed in this column have made this same observation about the claims of molecular biologists; it has become routine to read assertions about profound breakthroughs in genetics, featuring new and startling revelations about human nature. But the intriguing thing about this development is that it is often accompanied by the claim that this knowledge puts us on the edge of fashioning or engineering a new humanity – as witness our discussion of Gregory Stock in the December column. It is this larger impact of technology that gains Graham’s attention, where the prospect of a genetic nightmare (conveyed early on in the literary figure of Frankenstein) has to be taken seriously.

[3] Graham devotes a lengthy chapter to biological evolution, noting that the concept has itself evolved in recent years. While recent theorists have made universal claims for evolution’s explanatory powers, Graham notes that it has not offered a viable explanation concerning the origins of life, for example. Biochemistry as a post-Darwin development may also pose irreducible complexities that are not susceptible to the gradualist explanations of evolution. (65) Another problem Graham sees lies in psychology, where evolutionary theory seeks to explain the existence and function of the mind wholly in natural terms (biological and physical). These “naturalistic explanations of mental phenomena are easily sketched but hard to detail convincingly.” (78) Thus Graham remains skeptical of what he regards as exaggerated claims by today’s more ardent advocates of evolutionary explanation.

[4] But the major concerns raised by contemporary genetics lie not so much in the realm of explanation as in technology. Graham devotes a chapter to the consideration of genetic technology in the progression “from genetic screening through genetic modification to genetic design,” a progression that many think to be inevitable. He raises a question not often considered – whether it makes any sense to talk about what may become technologically possible without considering the social context that determines its practical feasibility. In isolation a genetic intervention may look desirable, but can it be incorporated into a health program? Problems of resource allocation as well as long-term outcomes are but two factors that will have a decisive impact on a procedure’s viability.

[5] In regard to genetic information, Graham makes a good case for genetic ignorance: when nothing can be done about a genetic defect, there is no good reason why people should want to know about it. “There is nothing irrational in preferring lower levels of anxiety to higher levels of knowledge.” (104) He takes a different approach from the one commonly taken to the insurance problems raised by genetic disease, arguing that while insurers may face greater expenses, they also know that such people are likely to die sooner. It’s all based on probabilities rather than certainties, of course, but Graham does not think that the genetically compromised will necessarily be burdened by excessive premiums; it may be a kind of wash compared with those who are relatively healthy but live long lives, with the attendant expenses associated with old age. He may have a point, but the record as of now seems to give little support to this argument.

[6] Graham quite aptly refers to “ethnic cleansing” in characterizing attempts at “sexual cleansing,” where a society might adopt a policy of extracting a person’s homosexual gene and replacing it with a heterosexual one. He refers to the research of Dean Hamer (Science, vol. 261 [1993], 231-237) in identifying a genetic link to chromosome region Xq28, but acknowledges that this is far from identifying a homosexual gene. In any event, even if such a gene were identified, it would be fruitless to establish a policy of this kind, because the gene would show itself only in the homozygote and not in the heterozygote, which would still carry the gene in recessive form. Graham uses this example in challenging the notion that we can fashion a world according to our preferences through genetic manipulation. It’s a matter of “how far we have really understood the nature of the world we propose to re-fashion.” (113)
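
The population-genetics arithmetic behind this carrier argument can be sketched. The following toy calculation is my own illustration, not from Graham’s book: it assumes a single autosomal locus with random mating and uses the textbook recursion for complete selection against the recessive homozygote, q' = q / (1 + q), to show how slowly an allele declines when it persists in heterozygous carriers.

```python
# A minimal sketch (not from Graham's book) of the carrier argument in [6]:
# even if every individual who actually expressed a hypothetical recessive trait
# were removed from the breeding pool each generation, the allele would persist
# in heterozygous carriers and decline only very slowly.
# Assumptions: one autosomal locus, random mating, complete selection against
# the recessive homozygote -- the textbook recursion q' = q / (1 + q).

def generations_to_reach(q0: float, target: float) -> int:
    """Count generations until allele frequency q falls below `target`."""
    q, generations = q0, 0
    while q > target:
        q = q / (1 + q)   # only homozygotes are selected out; carriers keep the allele
        generations += 1
    return generations

if __name__ == "__main__":
    # Starting from a 10% allele frequency, even total selection against the
    # homozygote takes roughly 90 generations to bring it down to 1%.
    print(generations_to_reach(0.10, 0.01))
```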

[7] In turning to genetic modification, Graham provides some historical perspective by noting that genetic engineering as such is not something new; efforts at selective breeding go back for centuries and genetically modified organisms (GMOs) “have been with us a very long time.” It’s also the case that the goals of the past continue to be the goals of the present – making things more satisfying, convenient, and efficient for humans. Does the emergence of genetics as a scientific discipline and our capacity today to engineer outcomes with greater speed and precision make any ethical difference? Graham is not impressed with arguments that we are “tinkering” with nature, or that the possibility of a genetic catastrophe requires that we refrain from pursuing germline therapy. At this point he adopts a “costs and benefits” approach that would examine each proposed intervention on its own merits rather than establishing a general policy.

[8] Graham is also dubious about “slippery slope” arguments, whether they are made on psychological or logical grounds, that caution against human embryo research because it will lead inevitably to further experimenting on humans or to efforts to produce “designer babies.” Nor does he think that “sanctity of life” arguments are appropriate in regard to embryo research, for “there is very good reason to hold that human embryos are not persons in any sense whatsoever.” (138) He makes a decisive distinction between the embryo and the fetus, noting that for the first 14 days the former is not a distinguishable individual but “a collection of cells” that could become one, two, or three people (the same argument made by a number of Roman Catholic scholars in challenging their Church’s stance on abortion).

[9] At the same time, Graham acknowledges that specifically religious concerns, often expressed in cautions about “playing God” or ignoring the “sanctity of life,” find some resonance even among secularized people in the Western world. He suggests that a concept like “genetic trespassing” might be a secularized version of these ideas, but he remains distrustful of what it implies, namely, a discernible dividing line between what is “natural” and what is not. He is not optimistic about establishing a normative concept of human nature of the sort conservatives like Leon Kass are intent on developing. The alternative for Graham remains a rejection of any principle that would rule out experimentation a priori; instead, each case is to be assessed on its own potential risks on a cost-benefit scale.

[10] Lest readers now throw up their hands because of Graham’s apparent unwillingness to establish some meaningful boundaries for genetic engineering, there is more that he has to say. He turns to “human rights” as a concept that could be construed as a secular version of the sanctity of life. One basis for our rights is a belief in human equality, but without God providing the basis for that equality in a secular world, Graham proposes an alternative in terms of what we all have in common: “…no one is in a position to decide that the life of another is not worth living.” (151) While we would all undoubtedly regard the lives of healthy, intelligent, attractive and talented people as “better” than those of people who are sickly, intellectually weak, deformed and talentless, the point is that “they are not morally better.” He then applies this principle to cloning and designer babies, concluding after a lengthy argument that children are a “basic human good,” a gift that we can neither demand nor claim a right to. We may be sovereign over plants and animals, but that does not allow “one set of human beings to…determine the why and wherefore of the existence of others.” (170)

[11] Graham’s principle has obvious implications for abortion as well, a point he acknowledges. He does not use the principle to make a blanket rejection of abortion without regard to the stage of pregnancy or the reason for it, but allows it to function as a cautionary principle that would compel moral responsibility in the decision to abort (recognizing, for example, the moral difference between aborting where a severe genetic disease is involved and aborting because the woman simply decides she doesn’t want a child). Applied to designer babies, the same principle would exclude the attempt to fashion “better” human beings, because such an attempt would reject the first-person right of judgment of the “non-designed” person. For Graham this would amount to “an indefensible hubris.” Thus he concludes that the religious concern about society “playing God” has validity in regard to any attempt at genetic improvement of our progeny.

[12] The boundary that Graham fashions will not satisfy those opponents of abortion who take an absolutist stand without reference to the empirical situation, nor will it satisfy those who seek a moral and theological rationale for the exclusion of every kind of genetic intervention. What he is proposing is a principle that excludes efforts to “improve” on human beings without at the same time ruling out therapeutic interventions. He can be taken to task for not addressing the difficulties in distinguishing between therapy and enhancement, but I would commend him for his conclusion that a distinction must be made. He eschews theological arguments in taking this stance, but he provides a good example of a secular philosopher who believes that certain concerns raised by the religious mind concerning genetic enhancement are important. His response is to find a moral rationale that will address those concerns in speaking to a secularized culture.

UNDERSTANDING NANOTECHNOLOGY
From the Editors of Scientific American
(New York: Warner Books, Inc., 2002), 149pp.

[13] The emergence of nanoscience during the past several decades has been one of the most significant developments in the scientific world. The work being done in this field is laying the foundation for nanotechnology, which remains largely a vision at this point. In order to familiarize people with the nature of the “nanoworld” and what it may promise in the decades ahead, the editors of Scientific American published a series of articles on the topic in 2000-2001 and then collected them for publication in this little book. Written by recognized scientists in the field, this collection constitutes an excellent introduction to the subject for the educated layperson.

[14] With nanoscience we are dealing with the essence of small. A nanometer is a billionth of a meter, one thousandth the length of a typical bacterium, one millionth the size of a pinhead. This world is a kind of borderland between the realm of individual atoms and molecules (where quantum mechanics rules) and the macroworld that we bump up against in daily living. It concerns the properties and behaviors of aggregates of atoms and molecules – the realm of the mesoscale – where properties of matter are governed by a complex combination of classical physics and quantum mechanics. Our access to this world is now possible through the invention of some remarkable instruments – like the scanning tunneling microscope and atomic force microscope, among others – that are capable of creating pictures of individual atoms or moving them from place to place. With nanotechnology, the intent is to fashion nanostructures that may exhibit superior electrical, chemical, mechanical or optical properties, with potentially widespread commercial uses.
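
For readers who want to anchor these scale comparisons, here is a back-of-the-envelope check. The figures for a “typical” bacterium (about a micrometer) and a pinhead (about a millimeter) are my own round numbers, not taken from the article.

```python
# Rough scale check of the comparisons in [14], using assumed typical sizes.
nanometer = 1e-9      # meters
bacterium = 1e-6      # an assumed typical bacterium, ~1 micrometer
pinhead   = 1e-3      # an assumed pinhead, ~1 millimeter

print(bacterium / nanometer)   # ~1,000     -> a nanometer is about one thousandth of a bacterium
print(pinhead / nanometer)     # ~1,000,000 -> and about one millionth of a pinhead
```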

[15] A particularly significant development occurred in the early nineties when a Japanese scientist first noticed peculiar nanoscopic threads lying in a smear of soot. Made of pure carbon, these macromolecules became known as nanotubes and have been the object of intense scientific study ever since. They possess remarkable qualities, such as exceptional resilience, tensile strength and thermal stability, because carbon atoms bond together with amazing strength. This has given rise to some fantastic predictions of microscopic robots, cars that bounce rather than crash in a wreck, and buildings that sway rather than snap in an earthquake. In the electronics industry, carbon nanotubes should play the same role as silicon does in electronic circuits, but at a molecular scale where silicon and other standard semiconductors cease to work. With the process of miniaturization in computer technology – now at the micro-level – soon reaching its limits, the possibility of wires and functional devices tens of nanometers or smaller, made from nanotubes and incorporated into electronic circuits (and working far faster and on much less power), is an exciting prospect.

[16] There has been considerable hype during the last twenty years or so concerning the revolutionary prospects of nanotechnology. The futurist K. Eric Drexler (Engines of Creation, 1986) has contributed significantly to the excitement with his depiction of self-replicating nanomachines capable of producing virtually any material good, as well as reversing global warming, curing disease, and radically extending the human life span. While most scientists would dismiss these speculations as irresponsible, they would likely not dispute that the nano concept could lay the foundation for a new industrial revolution. One of the areas in which this technology holds considerable promise is biomedicine, which makes it pertinent to the interests of this column.

[17] Nanometer-scale objects made of inorganic materials, such as magnetic crystals, may be quite useful in biomedical research, including the diagnosis of disease and even therapy. Nanoscale particles can be put to work as tags, or labels, increasing the effectiveness of biological tests; a nanotube-tipped atomic force microscope can trace a strand of DNA and identify chemical markers that reveal which of several possible variants of a gene is present in the strand. Nanoscale particles could be used to deliver drugs just where they are needed, including places that standard drugs do not reach easily, while avoiding the harmful side effects often created by potent medicines. Problems could be detected earlier, at more treatable stages – revealing, for example, tumors just a few cells in size. Artificial nanoscale building blocks may eventually be used to help repair skin, cartilage and bone tissues, and even help patients to regenerate organs. Other goals include new aids for vision and hearing, as well as rapid tests for detecting susceptibility to disease and responses to drugs.

[18] The reader may well feel bewildered in trying to visualize the kind of “machinery” that could be utilized at such an infinitesimal level as the nanoworld. It involves a paradigm shift from the current electronic world of silicon chips and circuit boards to the realm of chemistry. The methods used to produce molecular devices are the same as those of the pharmaceutical industry, where chemists start with a compound and then gradually transform it by adding prescribed reagents whose molecules are known to bond to others at specific sites. Through chemical reactions, such as oxidation-reduction, molecules can be made to conduct electricity and to act as transistors that can switch an electric current on or off. But we are still far from the kind of electronic control we desire because our capacity for assembly and organization at the molecular level remains quite primitive; nanostructures at this point are much less reliable than their microelectronic counterparts.

[19] In the books we have been reviewing over the past several months, some in particular have projected future scenarios in biomedical practice that stretch our imagination and challenge our sense of what is ethically acceptable. It is clear that much of the impetus for these scenarios – including possible enhancements of human attributes and pushing back the boundaries of our mortality – can be ascribed to the prospects raised by nanotechnology. With many of these projections we do not know whether they belong to science fiction or to the world of future generations. What is clear is that scientists and engineers are moving full steam ahead to exploit the possibilities of the nanoworld, and that their efforts will compel society to address increasingly radical questions about human identity and what will ultimately serve the common good. It is incumbent upon the church to be thinking about these prospects and to engage in the kind of free and open dialogue that will help to forge a consensus on what we want to encourage and what we feel compelled to discourage in this realm of biotechnology.

THE CENTURY OF THE GENE
by Evelyn Fox Keller
(Cambridge, MA: Harvard University Press, 2000), 186pp.

[20] This book by Evelyn Fox Keller, Professor of History and Philosophy of Science at MIT, brings together an unusually rich combination of historical background and current research in molecular genetics. The author is also blessed with the ability to write about a dense subject with remarkable clarity. While the subject is the development of the scientific understanding of the gene over the last century, Keller’s aim is to spell out the revolutionary view of the gene that has emerged in recent decades – a view that is challenging the traditional understanding of the gene, both in scientific circles and in the popular mind.

[21] Keller maintains that the new science of genomics, given a tremendous boost by the Human Genome Project, has actually “undermined” the basic concept of genetics, the concept of the gene. One consequence of this transformation is something she wants to celebrate: contrary to all expectations, the common assumption that the more we learn about genetics, the more we will be compelled to recognize that we are “determined” by our genes, is now being challenged. Reductionist theories no longer have the scientific support that many have assumed, with increasing recognition of the gap between genetic “information” and the biological meaning we have inferred from it. With the primacy of the gene receding in the twenty-first century, Keller acknowledges that the gene’s role in the twentieth century as an explanatory framework has been most important; our task now is to develop a “new lexicon” that will serve the same purpose in coming years.

[22] What are these developments that are revolutionizing the discipline of genetics? Keller charts out the progress from the 1940s in determining what it is that genes do, a process that seemed to get a definitive answer in the 1950s and 1960s. As Francis Crick put it in 1957, “DNA makes RNA, RNA makes protein, and proteins make us.” DNA was seen as the book of life. But there were some “minor wrinkles” in this thesis from the outset, and during the last couple of decades these wrinkles have grown into “major chasms.” One important development was the recognition that there seems to be more than one kind of gene – not just a “structural” gene that produces protein but a “regulator” gene as well. Genes do not simply act, but are activated, and it is here that regulator genes come into play. In recent years these regulatory genes have proliferated in number and kind as molecular biologists have discerned a variety of functions taking place, quite apart from the actual “making” of anything. (55ff.) In fact, the percentage of structural genes in the human genome seems to be very small, raising the question of identity: What counts as a gene?

[23] Another problem is split genes, which code for proteins but turn out to be fragmented, with long non-coding regions (often called “junk DNA”) that at first – mistakenly – were assumed to have no function at all. For genes that are fragmented in this way, there is no strict one-to-one correspondence between the sequence of a gene and the protein it gives rise to. Indeed, the number of different proteins at least hypothetically associated with a particular gene can now reach into the hundreds. What determines the proteins a gene makes, and under what circumstances? Apparently it isn’t the gene that “decides,” but the regulatory dynamics of the cell itself. And the picture gets still more complicated when we turn to the function of proteins. It was thought that the function of a gene had been identified when the amino-acid sequence of its protein was determined, but a protein can function in many different ways, depending on its context. Cells have in fact developed sophisticated means for switching between the functions of these proteins.
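
To give a feel for how a single split gene can be associated with hundreds of proteins, here is a toy illustration of alternative splicing. It is my own sketch, not drawn from Keller’s book, and it treats each optional exon as freely included or skipped, so the counts are an in-principle upper bound rather than a measurement.

```python
# Toy model of alternative splicing: a gene is a list of exons in genomic order,
# each marked constitutive (always kept) or optional (kept or skipped).
# If the optional exons can be included independently, n of them yield up to
# 2**n distinct transcripts -- hence hundreds of possible proteins from one gene.
from itertools import product

def splice_variants(exons):
    """exons: list of (name, is_optional) pairs in genomic order.
    Returns the set of transcripts formed by keeping every constitutive exon
    and independently including or skipping each optional one."""
    choices = [(name,) if not is_optional else (name, None) for name, is_optional in exons]
    variants = set()
    for pick in product(*choices):
        variants.add("-".join(name for name in pick if name is not None))
    return variants

if __name__ == "__main__":
    # One hypothetical gene: two constitutive exons flanking eight optional ones.
    gene = [("E1", False)] + [(f"E{i}", True) for i in range(2, 10)] + [("E10", False)]
    print(len(splice_variants(gene)))   # 2**8 = 256 possible transcripts
```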

[24] What all this means is that much of what geneticists have learned in recent years falls outside the original picture. It has become exceptionally difficult to define the gene as a structural unit. As one scientist working in molecular genetics puts it, the gene may be “a concept past its time,” where its use “might in fact be a hindrance to our understanding.” (68) There is considerable irony in this situation because the gene has never been more prominent in both the scientific and public press than it is today. The excitement over locating the genes responsible for a variety of genetic diseases has been steadily growing, just at the time when the apparent plasticity of the gene threatens its very meaning. One painful consequence of this fact is that we are no longer as sure as we once were that our ability to diagnose genetic diseases will soon pave the way to significant medical benefits. On the contrary, we still have much to learn about the processes that link the defective gene to the onset of disease. We need a better understanding of what it is that genes do, which has led to the focus today on functional genetics in contrast to structural genetics.

[25] In pursuing the function of genes, the term genetic program has come into vogue. It reflects the impact of computer science, where the metaphor of a program has also been helpful. But an ambiguity remains about the precise role of the gene: is it the subject or the object of the program, the source of the program or that upon which the program acts? Keller now enlarges her focus from the gene to the organism, noting what we have learned from the reprogramming efforts involved in cloning by nuclear transfer (resulting in the sheep “Dolly”). There remain basic questions about whether there is a centralized program emanating from the genome that governs the development of organisms, as some believe, or whether it is not more accurate to locate developmental dynamics more “locally” in the ways in which genes interconnect with each other; it appears that complex regulatory mechanisms are at work in a dynamic process that determines when and where a particular gene will be expressed. The wonder, expressed several times by the author, is that the development of an organism is as steady and predictable as it is, given the dynamic character of its component parts. For persons of faith, it becomes yet one more dimension in appreciating the Psalmist’s words, “I praise you, for I am fearfully and wonderfully made.” (Ps. 139:14a)

[26] What distinguishes the organism appears to be its self-organizing character, a point first made in modern times by the eighteenth-century philosopher Immanuel Kant. Capacities for self-regulation and even self-formation characterize the organism: “An organism is a material entity that is transformed into an autonomous and self-generating ‘self’ by virtue of its peculiar and particular organization.” (108) This language is remarkable in its reference to self-directed, purposive action in describing a material entity of flesh and blood, presumably functioning according to physicochemical reactions. It reflects what might well be called the “mystery” of the human organism, whose nature still eludes the scientific categories used to understand it. A new and significant turn has occurred since World War II with the development of the electronic computer and the study of control and communication in machines and living beings – what Norbert Wiener termed cybernetics. The attempt to build machines that resemble living organisms has led to a fruitful interaction between the world of computer engineering and biology, with metaphors from each field influencing the other. A new breed of biologists has emerged (particularly instrumental in facilitating the Human Genome Project), uniting biological and computational skills in the field of bioinformatics and spawning new perspectives in molecular biology.

[27] In her concluding chapter, Keller acknowledges that the gene will continue to be an indispensable part of genetic discourse, in spite of the outmoded understandings that people bring to it. While noting the considerable irony in this situation, given the public obsession with the gene in the wake of the Human Genome Project, Keller recognizes that the ambiguity of the gene need not impair its usefulness in public discourse. It will continue to function as a kind of shorthand – an umbrella term – covering the more precise, experimental contexts in which biologists work. Nevertheless, Keller is convinced that we must forge new ways of thinking about biological organization that are politically and scientifically more realistic. The way in which genetics has captured the popular mind requires this change if we are to avoid the false hopes and anxieties that present language too often inspires.

ADAM, EVE, AND THE GENOME: The Human Genome Project and Theology
Edited by Susan Brooks Thistlethwaite (Minneapolis: Fortress Press, 2003), 200pp.

[28] The purpose of this volume is “to provide some theological reflection on the human being by means of a dialogue with the newer advances in human genetics, the Human Genome Project.” The chapters are edited versions of class presentations made in a team-taught course offered to students at Chicago Theological Seminary and the Division of Biological Sciences at the University of Chicago. The instructors are faculty at CTS, with the lone exception being Lainie Friedman Ross, a pediatrician at the U. of Chicago, who provides the scientific material with two chapters on Mendelian and post-Mendelian genetics.

[29] Editor Thistlethwaite, a systematic theologian and President of CTS, provides the major effort (through the introduction and two concluding chapters) in bringing theology into conversation with the scientific material. In doing so, she proposes that liberation and specifically feminist theology can provide a more “innovative theological dialogue partner” with genetic science. Like process theology, feminist theology opposes a “static” view of the human and emphasizes the continuity of humanity with creation. She cites the “patriarchal,” Enlightenment heritage as responsible for two understandings that prevent a fruitful connection of theology with nature: the soul/body dichotomy and the exalting of rationality as the distinctive mark of humanity.

[30] Another systematician at CTS, Theodore Jennings, contributes a helpful chapter by proposing several analogies as a way of bridging the gap between theology and genetics. Utilizing a text by the Cappadocian Father, Gregory of Nyssa, he discusses solidarity, speciation, and individuation as three concepts that are both significant for theological anthropology and helpful in forging the unity and cohesiveness between humanity and the rest of creation. Both Jennings and Thistlethwaite emphasize the need to understand human distinctiveness within the continuity between humanity and the biosphere, not in contrast to it along the lines of much theology in the past. Both Tillich and Reinhold Niebuhr are cited as recent examples of a gnostic strain that runs through much of Christian theology. Jennings also discusses the significance of the resurrection of the body in contrast to the immortality of the soul as affirming the importance of the mortal body to Christian eschatology. Just what all this means is not pursued by Jennings; the direction of his thought might lead one to conclude that resurrection must be the resuscitation of a corpse, but I suspect he doesn’t want to say that. Resurrection recognizes our identity as linked with our individuality and our individuality with our bodies, but that truth is also affirmed in the mystery that Saint Paul expresses with the notion of a “spiritual body.” In whatever context we discuss this holistic character of theological anthropology, it gives rise to tensions that cannot be neatly resolved.

[31] The above point is illustrated, I believe, in the alternatives posed by sociobiologists and evolutionary psychologists on the one hand, who would reduce the human to the workings of genetic material, and the traditional Christian anthropology on the other hand, which would secure the human by defining it as soul or spirit separated from the body. Each view shares the same fallacy in denying the psychosomatic unity of the human, but each can at least take some satisfaction in its logical consistency. One theme running through this book is the importance of affirming the continuity of the human with creation, thus constructing a point of fruitful connection between humanity and the rest of creation and avoiding the fallacy in each of the above views. However, I would have appreciated more reflection spelling out the meaning of human distinctiveness within this context of continuity and unity. An important point is made, but some of its implications are left hanging.

[32] Other contributors include Laurel C. Schneider, Associate Professor of Theology, Ethics and Culture, who provides an interesting overview of the history of science in which she identifies the critical issue between science and theology today as the claim that DNA is “a reductive blueprint for the whole of human life…and the key to human destiny…”; Ken Stone, an Old Testament theologian, who raises the subject of etiology as it relates to Old Testament stories as well as to genetic research; and Lee H. Butler, Jr., an African American pastoral theologian, who focuses on the misuse of genetic research in documenting supposed theories of racial inferiority. He concludes with the arresting assertion, “…the Genome Project may have the unconscious effect of fueling the genocidal impulse that runs so deep within America.”

[33] Dominant themes in this work echo points made by a number of authors whom we have considered in this series of review articles. Thistlethwaite observes that the new biology is tempted “to attribute far too much causality to genes,” a major point in Ruth Hubbard and Elijah Wald’s Exploding the Gene Myth, to which she refers (see my column in the September 2003 issue of JLE). She identifies Hubbard’s emphasis on the “embedded” character of genes with her own liberationist/feminist perspective, which looks for a broader understanding of the chemistry of gene expression. Causes of cancer, for example, must include consideration of the stress and strain of everyday living – if our society is sick, then so are we. The nature/nurture divide needs to be collapsed. This contextual, holistic approach and the rejection of any kind of dualism, both metaphysical and societal, provide the unifying themes that run throughout this work.

Paul T. Jersild

The Rev. Dr. Paul T. Jersild is Emeritus Professor of Theology and Ethics at Lutheran Theological Southern Seminary, Columbia, South Carolina