Sandberg et al 2008 – Anders Sandberg, James Martin Research Fellow at the Future of Humanity Institute at Oxford University and postdoctoral research assistant for the EU Enhance project; Jason G. Matheny, PhD candidate in Health Policy and Management at the Johns Hopkins Bloomberg School of Public Health, special consultant to the Center for Biosecurity at the University of Pittsburgh Medical Center, and co-founder of New Harvest; Milan M. Ćirković, senior research associate at the Astronomical Observatory of Belgrade and assistant professor of physics at the University of Novi Sad in Serbia and Montenegro (Anders Sandberg, Jason G. Matheny, Milan M. Ćirković, "How Can We Reduce the Risk of Human Extinction?", Bulletin of the Atomic Scientists, http://thebulletin.org/web-edition/features/how-can-we-reduce-the-risk-of-human-extinction)
The risks from anthropogenic hazards appear at present larger than those from natural ones. Although great progress has been made in reducing the number of nuclear weapons in the world, humanity is still threatened by the possibility of a global thermonuclear war and a resulting nuclear winter.

We may face even greater risks from emerging technologies. Advances in synthetic biology might make it possible to engineer pathogens capable of extinction-level pandemics. The knowledge, equipment, and materials needed to engineer pathogens are more accessible than those needed to build nuclear weapons. And unlike other weapons, pathogens are self-replicating, allowing a small arsenal to become exponentially destructive. Pathogens have been implicated in the extinctions of many wild species. Although most pandemics "fade out" by reducing the density of susceptible populations, pathogens with wide host ranges in multiple species can reach even isolated individuals. The intentional or unintentional release of engineered pathogens with high transmissibility, latency, and lethality might be capable of causing human extinction. While such an event seems unlikely today, the likelihood may increase as biotechnologies continue to improve at a rate rivaling Moore's Law.

Farther out in time are technologies that remain theoretical but might be developed this century. Molecular nanotechnology could allow the creation of self-replicating machines capable of destroying the ecosystem. And advances in neuroscience and computation might enable improvements in cognition that accelerate the invention of new weapons. A survey at the Oxford conference found that concerns about human extinction were dominated by fears that new technologies would be misused. These emerging threats are especially challenging as they could become dangerous more quickly than past technologies, outpacing society's ability to control them. As H.G. Wells noted, "Human history becomes more and more a race between education and catastrophe."

Such remote risks may seem academic in a world plagued by immediate problems, such as global poverty, HIV, and climate change. But as intimidating as these problems are, they do not threaten human existence. In discussing the risk of nuclear winter, Carl Sagan emphasized the astronomical toll of human extinction:

"A nuclear war imperils all of our descendants, for as long as there will be humans. Even if the population remains static, with an average lifetime of the order of 100 years, over a typical time period for the biological evolution of a successful species (roughly ten million years), we are talking about some 500 trillion people yet to come. By this criterion, the stakes are one million times greater for extinction than for the more modest nuclear wars that kill 'only' hundreds of millions of people. There are many other possible measures of the potential loss, including culture and science, the evolutionary history of the planet, and the significance of the lives of all of our ancestors who contributed to the future of their descendants. Extinction is the undoing of the human enterprise."

There is a discontinuity between risks that threaten 10 percent or even 99 percent of humanity and those that threaten 100 percent. For disasters killing less than all humanity, there is a good chance that the species could recover. If we value future human generations, then reducing extinction risks should dominate our considerations.
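Sagan states his figures without the intermediate arithmetic. A minimal sketch of the calculation, assuming a static population of roughly 5 billion (approximately the world population when Sagan wrote; the card itself does not state a population figure):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sagan's 500 trillion: a static population times the number of
% 100-year lifetimes that fit into ten million years. The 5 billion
% population is an assumption, not stated in the card.
\[
  \underbrace{5 \times 10^{9}}_{\text{population (assumed)}}
  \times
  \underbrace{\frac{10^{7}\ \text{years}}{10^{2}\ \text{years per lifetime}}}_{10^{5}\ \text{future lifetimes}}
  = 5 \times 10^{14} \approx 500 \text{ trillion people}
\]
% The "one million times greater" comparison then follows as a ratio
% against the "hundreds of millions" killed in a more modest nuclear war:
\[
  \frac{5 \times 10^{14}\ \text{future lives}}
       {5 \times 10^{8}\ \text{lives lost in a major nuclear war}}
  = 10^{6}
\]
\end{document}
```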
Fortunately, most measures to reduce these risks also improve global security against a range of lesser catastrophes, and thus deserve support regardless of how much one worries about extinction. These measures include:
- Removing nuclear weapons from hair-trigger alert and further reducing their numbers;
- Placing safeguards on gene synthesis equipment to prevent synthesis of select pathogens;
- Improving our ability to respond to infectious diseases, including rapid disease surveillance, diagnosis, and control, as well as accelerated drug development;
- Funding research on asteroid detection and deflection, "hot spot" eruptions, methane hydrate deposits, and other catastrophic natural hazards;
- Monitoring developments in key disruptive technologies, such as nanotechnology and computational neuroscience, and developing international policies to reduce the risk of catastrophic accidents.