Synthesizing Biosecurity

By Stephen M. Maurer & Laurie Zoloth

Opinions, Bulletin of the Atomic Scientists, Vol. 63, No. 6 (November/December 2007), pp. 16-18. DOI: 10.2968/063006004

Protecting the public from the risks of synthetic biology depends on the scientific community’s will, capacity, and commitment to regulate itself.

Stephen M. Maurer is an adjunct professor at the University of California-Berkeley’s Goldman School of Public Policy and director of its IT and Homeland Security Project. Laurie Zoloth is director of the Center for Bioethics, Science and Society and professor of medical ethics and humanities at Northwestern University.

Members of the synthetic biology community met this past July in Zurich, Switzerland. Synthetic Biology 3.0, as the conference was called, celebrated the discipline’s continued vigor, along with Europe’s entry into what had been a largely U.S. subject. Prominent on the conference agenda were 10 talks devoted to public policy, four of which involved biosecurity, and an entire session devoted to ethics. Speakers agreed that the field’s social and ethical consequences would be as far-reaching as the science.

Indeed, most scientists agree that synthetic biology is (or at least could be) something new under the sun. Ten years ago, genetic engineers had to cut and paste DNA from existing organisms to make new drugs such as human growth hormone or synthetic insulin. Today, scientists and engineers can create DNA from scratch. The challenge for synthetic biologists is to exploit this power not just to mimic nature’s designs but also to create new ones. So far, most of the results are proof of principle—organisms that oscillate with clock-like regularity or generate odors. But synthetic biology is developing rapidly. Industrial-scale programs for building organisms that secrete precursors for anti-malarial drugs and turn sunlight into energy are under way. Even more ambitious plans—for example, organisms that hunt down and attack cancer cells—are on the horizon.

Inevitably, there is a dark side to such technological developments. Just as personal computers brought computing power to the masses, synthetic biology methods reduce the cost and difficulty of genetic engineering and thus may present ways to slash the cost of making biologically based, genetically engineered weapons that are vaccine resistant or combine toxins from different organisms. Furthermore, synthetic biologists have already shown how terrorists could obtain life forms that now exist only in carefully guarded facilities, such as polio and 1918 influenza samples. Finally, synthetic biology’s efforts to reprogram life have raised concerns in some quarters that the technology could one day be used to make radically new weapons, such as pathogens that could be narrowly targeted toward populations with known genetic susceptibilities.

Synthetic biologists have been debating “precautionary measures” against misuse since at least 2004. Building on the example of software engineers, who turned their World Wide Web Consortium conventions into a surprisingly effective governance body, synthetic biologists have explored using the field’s annual conventions to do the same for biosecurity. Yet the community struggles to identify and pursue the best strategy for guarding against misuse of synthetic biology.

Many times before, science has inspired remarkable hope for human welfare, and remarkable fears. During the Cold War, scientists could do nothing to stop the Soviet Union and the West from developing nuclear weapons, despite individual efforts. Every policy lever that mattered—funding, regulation, treaties—required government action. The terrorist threat and the emergence of new nuclear powers changed that calculus. To succeed today, terrorists and some nations will need help (most likely, unwittingly) from science’s decentralized, freewheeling, globalized networks. The A. Q. Khan network, which sold nuclear technology to Iran, Libya, and North Korea, was in many ways the prototype for this scenario.

The science has changed, too. The twentieth century saw the emergence of the scientific pursuits that will dominate this century: molecular biology and its technological cousin, biotechnology. Biotechnology has already collapsed the traditional distinction between basic research and products.

Security experts rightly worry that biotech laboratories could similarly create organisms that are immediately useful as weapons, obviating the multibillion-dollar testing and development cycles that used to deter non-state actors. Like the discovery of nuclear fission seven decades ago, what scientists do in their labs has a global impact on security.

Government officials are right to think that regulating academic and commercial science today is nearly impossible. Synthetic biology faces unique challenges as an interdisciplinary field, with elements of both engineering and biology cultures that embrace strong values of open sourcing, social responsibility, and academic freedom. In this environment, initiatives developed by the synthetic biology community may be more effective than government regulation precisely because they are more likely to be respected and taken seriously.

From a policy standpoint, too, building a nongovernmental body for implementing biosecurity policy seems like a good investment. First, community decisions can be made in months, not years. Second, solutions that require scientists’ consent are likely to be far less disruptive for working labs than externally imposed rules. Finally, biosecurity policy needs to be consistent across countries, and community-based initiatives are inherently international. If synthetic biologists from the United States, Europe, and Asia all agree to do something, national borders won’t matter. Success depends upon the scientific community’s will, capacity, and commitment to regulate itself.

When questions arose about the security implications of their work, synthetic biologists’ initial response was to organize workshops and hold meetings. But academic discourse has limits in a field in the midst of a larger biosecurity debate. By 2005, members of the synthetic biology community were suggesting that the fledgling community use its second conference to adopt concrete security measures. A separate group of researchers (including the authors) received grants from the Carnegie and MacArthur foundations to facilitate the process by working with the community to identify options that not only could be implemented as grassroots initiatives but also already enjoyed widespread support within the community.

The resulting list eventually included developing improved screening of DNA sequence orders likely to produce known pathogens; creating mechanisms for expert outside review before starting “experiments of concern” that might lead to improved weapons; establishing an online database for tracking accidents and emerging biosecurity issues; and strengthening existing lab-training, peer-review, and openness norms to discourage rogue scientists from diverting synthetic biology for malicious purposes.
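The first item on that list is the one that later produced working software, so it is worth seeing how modest a starting point can be. The sketch below is written for this overview rather than taken from any group named in the article: it shows one naive way a synthesis company might screen an incoming DNA order, by breaking the ordered sequence into fixed-length k-mers and flagging the order for human review if it shares enough of them with a reference set of regulated-pathogen sequences. The signature table, function names, and thresholds are invented placeholders; production screening tools of the kind the community was building typically rely on curated select-agent databases and similarity search (BLAST-style alignment), not the exact matching used here.

```python
# Illustrative toy screen for DNA synthesis orders (not a real tool).
# PATHOGEN_SIGNATURES is a made-up placeholder; a real screen would load
# sequences of regulated agents from a curated reference database.

PATHOGEN_SIGNATURES = {
    "hypothetical_agent_fragment": "ATGGCGTACCTGAAGGATTCAGGT",
}


def kmers(seq, k):
    """Yield every k-base substring (k-mer) of a DNA sequence."""
    seq = seq.upper()
    for i in range(len(seq) - k + 1):
        yield seq[i:i + k]


def flag_order(order_seq, k=20, threshold=1):
    """Return True if the ordered sequence shares at least `threshold`
    k-mers with any signature, i.e. it should go to a human reviewer."""
    order_kmers = set(kmers(order_seq, k))
    for name, signature in PATHOGEN_SIGNATURES.items():
        hits = sum(1 for km in kmers(signature, k) if km in order_kmers)
        if hits >= threshold:
            print(f"Flagged: order shares {hits} {k}-mer(s) with '{name}'")
            return True
    return False


if __name__ == "__main__":
    # An order that embeds the placeholder fragment inside unrelated sequence.
    suspect = "GATTACA" + PATHOGEN_SIGNATURES["hypothetical_agent_fragment"] + "TTGGCC"
    print("Escalate:", flag_order(suspect))        # True
    print("Escalate:", flag_order("ATCG" * 30))    # False
```

Even at this scale the limits are obvious: exact k-mer matching can be defeated by recoding a sequence or splitting an order across vendors, which is part of why the community kept pushing toward the “second-generation” screening tools described below.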

Despite the enthusiasm for developing security mechanisms, the second conference stopped short of a formal vote. Conference participants had planned to vote on instituting a range of security steps, but in the weeks before the meeting, several members privately started to oppose the idea of using the conference as an expanded town hall with a vote at the end. Some needed more time to think about the ideas. Others were concerned that the conference needed a constitution before it could vote, or that a vote might be divisive. Some participants hesitated out of respect for the fierce opposition of activists, some of whom announced the morning before the meeting that they strongly opposed a vote—even at a public meeting that they could attend. The idea of scientists deciding how to regulate their work, the activists argued, was inherently exclusionary and intolerable.

Conference organizers did not, however, give up. Instead, they successfully urged the conference audience to suggest ideas for a declaration that would be posted on the web. Building on the options that we had identified, the declaration also announced that a working group would build second-generation screening tools and urged members to stop doing business with companies that failed to screen DNA sequence orders. Later that summer, the Federation of American Scientists hosted a workshop that began designing the proposed software. By fall, industry had formed a collaboration called the International Consortium for Polynucleotide Synthesis as an umbrella organization for moving the agenda forward, and a German company called Entelechon had begun work on a related open-source software project.

This is where things stood in July 2007 when the scientists and engineers met in Zurich. While some at the meeting were clearly unaware of earlier discussions, groups from MIT, the J. Craig Venter Institute, and newly emerging European communities reported new work to identify ideas for self-regulation and guidelines. Organizers also reached out by inviting an activist group that had protested the previous meeting to speak at a plenary session. Finally, conference organizers held an event that gave local citizens a chance to question the nature, goals, and meaning of this emerging discipline.

Talk, of course, is no good unless it leads to action. Here, too, the Zurich meeting provided at least limited grounds for optimism. While no one tried to resurrect the idea of using the conferences as a governance body, a few members spoke of creating an entirely new professional society to fill the gap. More importantly, members were clearly making progress on second-generation screening tools.

DULY NOTED

ECONOMIES OF INFECTION | “The larger the pandemic, in terms of proportion of the population infected, the greater the economic impact. For infection rates up to 1 percent of the world’s population, a decrease in global [gross domestic product] of 5 percent could be expected, with an additional loss of 1 percent per additional percentage increase in infection rate. Once a critical infection rate was reached, the cumulative economic disruption would produce a shut-down of the global economy, similar to that seen in the United Kingdom’s agricultural economy following the 2001 outbreak of foot-and-mouth disease. . . .”—The World Health Report 2007: A Safer Future: Global Public Health Security in the 21st Century, August 2007.

WELCOME TO THE MACHINES | “Silicon machines can now play chess better than any protein machines can. Big deal. This calm and reasonable reaction, however, is hard for most people to sustain. They don’t like the idea that their brains are protein machines.”—Daniel C. Dennett, “Higher Games,” Technology Review, September/October 2007.

ENERGY SEPPUKU | “As soon as it is discovered that Japan has diverted [nuclear materials and technologies] to weapons development, Japan will not only see its supplies of nuclear fuel cut off but will also be required to return all the material provided theretofore. This will lead to a virtual suspension of all nuclear activities in Japan, causing serious economic damage to a country in which nuclear energy accounts for a third of total power generation.”—Tetsuya Endo, former vice chairman of the Atomic Energy Commission of Japan, speculating about the international reaction if Japan were to develop nuclear weapons, in a publication of the Association of the Japanese Institutes of Strategic Studies.

THE NUCLEAR SEE | “In the difficult crossroads in which humanity finds itself, the commitment. . . to support the use of peaceful and safe nuclear technology. . . is always more present and urgent.” —Pope Benedict XVI congratulating the International Atomic Energy Agency on its fiftieth anniversary on July 29, 2007.


The most concrete remarks came from industry, when Ralf Wagner, the head of Germany’s Geneart, described his company’s detailed efforts to implement biosecurity screening as simply “the cost that should be involved in doing this.” It is likely that the community will eventually press all companies to follow his lead. A four-university consortium (Berkeley, Northwestern, Duke, and Maryland) is also developing an online portal where synthetic biologists can get advice before embarking on “experiments of concern.”

One of the open research questions for policy makers and bioethicists who study emerging science remains: What is the proper academic role of the “stranger” discipline—such as law, moral philosophy, or political theory—when scholars of these fields participate in the governance of other fields? Are we members, participant observers, or objective critics of scientific communities? Does this change if the research emerges as a matter of public policy and public attention, such as stem cells, neuroscience, or synthetic biology?

The role of profession-based democratic processes is also uncertain. The example of synthetic biologists shows how tentative—and easily derailed—the process remains. Still, democracy is largely a matter of habit. If the current community-based initiatives succeed, members will understand that part of science is taking both personal and collective responsibility for its consequences. At that point, governance will start to look much more natural.

There is nothing like a push to get things started. The National Institutes of Health’s National Science Advisory Board for Biosecurity has told biologists that they need to put their house in order. Anyone who has been to Washington in the past six months can testify to similar, if not blunter, warnings. If synthetic biologists do not take action (and perhaps even if they do), government will not shrink from imposing its own regulations. Synthetic biologists have always wanted to do the right thing, but there is now a new urgency. After four years of discussion and debate, can they implement and enforce biosecurity initiatives as successfully and elegantly as they reengineer life forms?
