Oxford’s Global Priorities Project has compiled a list of catastrophes – both natural and self-engineered – that could kill off 10 percent or more of the human population: natural pandemics and extreme conventional warfare are the most important global risks, with artificial intelligence (AI) and natural disasters also posing serious threats.
Researchers at the Global Challenges Foundation have ranked the most dangerous threats facing our civilization according to probability.
The likeliest risks include nuclear war and pandemics (both natural and deliberately engineered), followed by disasters stemming from runaway climate change, geoengineering run amok, and disruptions posed by artificial intelligence.
The report, entitled Global Catastrophic Risks 2016, also included low-probability but high-impact events, such as asteroid impacts and supervolcanic eruptions.
The authors defined Global Catastrophic Risk as a “risk of events or processes that would lead to the deaths of approximately a tenth of the world’s population, or have a comparable impact.”
These dangers are qualitatively distinct from existential risks, which are severe enough to wipe out all human life from the planet.
“The global catastrophic risks in this report can be divided into two categories. Some are ongoing and could potentially occur in any given year,” the paper says.
“Others are emerging and may be very unlikely today but will become significantly more likely in the coming decades.”
The most significant ongoing risks are natural pandemics and nuclear war, whereas the most significant emerging risks are catastrophic climate change and risks stemming from emerging technologies, it states.
To illustrate the risks of natural pandemics, the authors cited the damage caused by the Spanish influenza pandemic of 1918, along with outbreaks of Ebola, SARS, and the H5N1 influenza virus.
The invention of nuclear weapons, the report explains, ushered in a new era of risks created by human action.
“A large nuclear war between major powers would likely kill tens or hundreds of millions in the initial conflict, and perhaps many more if a nuclear winter were to follow,” it forecasts.
Climate change is a well-known anthropogenic risk. Even if the world succeeds in limiting emissions, scientists expect significant climate change to occur. This could bring a host of global challenges including environmental degradation, migration, and the possibility of resource conflict.
While explaining the risks stemming from emerging technologies, the authors stated that “emerging technologies promise significant benefits, but a handful could also create unprecedented risks to civilization and the biosphere alike.”
For example, “splitting the atom brought the promise of clean power, but also led to the nuclear bomb, which has brought humanity to the brink of catastrophe on more than one occasion”.
The burning of fossil fuels has brought huge improvements in human welfare, but unless strong action is taken soon, there is an unacceptable chance that our children and grandchildren will face catastrophic global warming.
Rapid developments in biotechnology could enable scientists to develop new therapies to reduce the global burden of disease and feed a growing population, but might also in the future give malicious groups the capacity to synthesize devastating pathogens.
Biotechnology could enable the creation of pathogens far more damaging than those found in nature, while in the longer run, artificial intelligence could cause massive disruption.
There are also exogenous risks, which arise independently of human activity (whereas even natural epidemics are spread by humans).
Among these are supervolcanic eruptions and threats from asteroids and comets.
“The historical evidence suggests that exogenous global catastrophic risks cannot be too frequent, and may therefore be much less likely than some of the anthropogenic (caused by humans) risks,” the report states.