
Global catastrophic risk

[Image: Artist's impression of a major asteroid impact. An asteroid with an impact strength of a billion atomic bombs may have caused the extinction of the dinosaurs.[1]]

[Figure: Scope/severity grid from Bostrom's paper "Existential Risk Prevention as Global Priority".[8] The scope axis runs from personal through local and global to trans-generational and cosmic; the severity axis runs from imperceptible through endurable to terminal and, at the extreme, hellish. Example entries include: loss of one hair, one's car is stolen, fatal car crash, congestion from one extra vehicle, recession in one country, genocide, loss of one beetle species, global warming by 0.01 K, destruction of the ozone layer, aging (marked with a question mark), and drastic loss of biodiversity. Risks at least global in scope and endurable in severity are marked as global catastrophic risks; risks at least trans-generational in scope and terminal in severity are marked as existential risks.]
A global catastrophic risk is a hypothetical future event with the potential to seriously damage human well-being on a global scale.[2] Some events could destroy or cripple modern civilization. Other, even more severe, events could cause human extinction.[3] These are referred to as existential risks.

Natural disasters, such as supervolcanoes and asteroids, pose such risks if sufficiently powerful. Events caused by humans could also threaten the survival of intelligent life on Earth. Such anthropogenic events could include catastrophic global warming, nuclear war, or bioterrorism.[4] The Future of Humanity Institute believes that human extinction is more likely to result from anthropogenic causes than natural causes.[5][6]

Researchers experience difficulty in studying human extinction directly, since humanity has never been destroyed before.[7] While this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias.

1 Classifications of risk

The philosopher Nick Bostrom classifies risks according to their scope and intensity.[6] He considers risks that are at least "global" in scope and "endurable" in intensity to be global catastrophic risks. Those that are at least "trans-generational" (affecting all future generations) in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant.[9]

Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate. He thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed. The most likely causes of this, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures. For example, if a single mind enhances its powers by merging with a computer, it could dominate human civilization, which could be bad. Bostrom believes that this scenario is most likely, followed by flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or current values. He thinks the most likely cause would be evolution changing moral preference, followed by extraterrestrial invasion.[3]

Similarly, in Catastrophe: Risk and Response, Richard Posner singles out and groups together events that bring about "utter overthrow or ruin" on a global, rather than a local or regional, scale. Posner singles out such events as worthy of special attention on cost-benefit grounds, because they could directly or indirectly jeopardize the survival of the human race as a whole.[10] Posner's events include meteor impacts, runaway global warming, grey goo, bioterrorism, and particle accelerator accidents.

2 Probability of an existential catastrophe

See also: Doomsday argument, a controversial philosophical argument based on the Anthropic principle

Existential risks pose unique challenges to prediction, even more than other long-term events, because of observation selection effects. Unlike with most events, the failure of a complete extinction event to occur in the past is not evidence against their likelihood in the future, because every world that has experienced such an extinction event has no observers; so, regardless of their frequency, no civilization observes existential risks in its history.[7] These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or by directly evaluating the likely impact of new technology.[8]
The following are examples of individuals and institutions that have made probability predictions about existential events. Some risks, such as that from asteroid impact, with a one-in-a-million chance of causing humanity's extinction in the next century,[11] have had their probabilities predicted with considerable precision (though some scholars claim the actual rate of large impacts could be much higher than originally calculated).[12] Similarly, the frequency of volcanic eruptions of sufficient magnitude to cause catastrophic climate change, similar to the Toba eruption, which may have almost caused the extinction of the human race,[13] has been estimated at about 1 in every 50,000 years.[14] The relative danger posed by other threats is much more difficult to calculate. In 2008, a group of experts on different global catastrophic risks at the Global Catastrophic Risk Conference at the University of Oxford suggested a 19% chance of human extinction over the next century. However, the conference report cautions that the methods used to average responses to the informal survey are suspect due to the treatment of non-responses. The probabilities estimated for various causes are summarized below.

Table source: Future of Humanity Institute, 2008.[15]
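To see what such point estimates imply over longer horizons, note that a constant per-period probability p compounds to a cumulative probability of 1 - (1 - p)^n over n independent periods. The following minimal Python sketch (an illustration added here, not part of the cited sources) applies this to the two rates quoted above; the same arithmetic is behind Martin Hellman's remark, in the warfare section below, that any fixed nonzero annual probability of nuclear war becomes inevitable in the long run.

# Minimal sketch: turn the per-period rates quoted above into
# cumulative probabilities, assuming a constant rate and independent
# periods (a simplification, for illustration only).

def cumulative_risk(p_per_period, n_periods):
    # P(at least one event in n periods) = 1 - (1 - p)^n
    return 1.0 - (1.0 - p_per_period) ** n_periods

# Asteroid-caused extinction: about 1 in a million per century.[11]
print(cumulative_risk(1e-6, 100))          # over 100 centuries: ~1e-4
# Toba-scale eruption: about 1 in 50,000 per year.[14]
print(cumulative_risk(1 / 50_000, 10_000)) # over 10,000 years: ~0.18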

There are significant methodological challenges in estimating these risks with precision. Most attention has been given to risks to human civilization over the next 100 years, but forecasting for this length of time is difficult. The types of threats posed by nature may prove relatively constant, though new risks could be discovered. Anthropogenic threats, however, are likely to change dramatically with the development of new technology; while volcanoes have been a threat throughout history, nuclear weapons have only been an issue since the 20th century. Historically, the ability of experts to predict the future over these timescales has proved very limited. Man-made threats such as nuclear war or nanotechnology are harder to predict than natural threats, due to the inherent methodological difficulties in the social sciences. In general, it is hard to estimate the magnitude of the risk from this or other dangers, especially as both international relations and technology can change rapidly.

2.1 Fermi paradox


In 1950, Enrico Fermi, the Italian physicist, wondered why humans had not yet encountered extraterrestrial civilizations. He asked, "Where is everybody?"[16] Given the age of the universe and its vast number of stars, unless the Earth is very atypical, extraterrestrial life should be common. So why was there no evidence of extraterrestrial civilizations? This is known as the Fermi paradox.

One of the many proposed explanations, though not a widely accepted one, for why humans have not yet encountered intelligent life from other planets (aside from the possibility that it does not exist) is the probability of existential catastrophes: other potentially intelligent civilizations may have been wiped out before humans could find them, or before they could find Earth.[7][17][18]

3 Moral importance of existential risk

Some scholars have strongly favored reducing existential risk on the grounds that it greatly benefits future generations. Derek Parfit argues that extinction would be a great loss because our descendants could potentially survive for a billion years before the expansion of the Sun makes the Earth uninhabitable.[19][20] Bostrom argues that there is even greater potential in colonizing space. If future humans colonize space, they may be able to support a very large number of people on other planets, potentially lasting for trillions of years.[9] Therefore, reducing existential risk by even a small amount would have a very significant impact on the expected number of people that will exist in the future.

Little has been written arguing against these positions, but some scholars would disagree. Exponential discounting might make these future benefits much less significant. Gaverick Matheny has argued that such discounting is inappropriate when assessing the value of existential risk reduction.[11]
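The shape of this argument, and of the discounting objection, can be made concrete with a toy calculation. In the sketch below, the population figure, risk reduction, discount rate, and delay are all hypothetical placeholders chosen for illustration; they are not estimates from Bostrom or Matheny.

# Toy expected-value sketch; every number here is a hypothetical
# placeholder, not a figure from the cited authors.

future_people = 1e16      # assumed number of future people if no catastrophe
risk_reduction = 1e-8     # assumed tiny reduction in extinction probability

expected_lives = future_people * risk_reduction
print(expected_lives)     # 1e8: a tiny risk reduction still saves many expected lives

# Exponential discounting at annual rate r values a benefit t years
# away at (1 + r)**-t of its face value, which can erase the result.
r, t = 0.03, 1000         # assumed discount rate and delay in years
print(expected_lives * (1 + r) ** -t)  # ~1.5e-5: effectively negligible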

Some economists have discussed the importance of global catastrophic risks, though not existential risks. Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[4] Richard Posner has argued that we are doing far too little, in general, about small, hard-to-estimate risks of large-scale catastrophes.[21]

Scope insensitivity influences how bad people consider the extinction of the human race to be. For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are as concerned about 200,000 birds getting stuck in oil as they are about 2,000.[22] Similarly, people are often more concerned about threats to individuals than to larger groups.[23]

4 Potential sources of risk

Existential risks, and other risks to civilization, may come from natural or man-made sources. It has been argued that many existential risks are currently unknown.[24]

4.1 Anthropogenic

Some potential existential risks are consequences of man-made technologies.


In 2012, Cambridge University created the Cambridge Project for Existential Risk, which examines threats to humankind caused by developing technologies.[25] The stated aim is to establish within the university a multidisciplinary research centre, the Centre for the Study of Existential Risk, dedicated to the scientific study and mitigation of existential risks of this kind.[25]

The Cambridge Project states that the greatest threats to the human species are man-made: artificial intelligence, global warming, nuclear war and rogue biotechnology.[26]
4.1.1 Artificial intelligence

See also: Technological singularity § Existential risk


It has been suggested that learning computers that rapidly become superintelligent may take unforeseen actions, or that robots would out-compete humanity (one technological singularity scenario).[27] Because of its exceptional scheduling and organizational capability and the range of novel technologies it could develop, it is possible that the first Earth superintelligence to emerge could rapidly become matchless and unrivaled: conceivably it would be able to bring about almost any possible outcome, and be able to foil virtually any attempt that threatened to prevent it achieving its objectives.[28] It could eliminate, wiping out if it chose, any other challenging rival intellects; alternatively it might manipulate or persuade them to change their behavior towards its own interests, or it may merely obstruct their attempts at interference.[28] In Bostrom's book, Superintelligence: Paths, Dangers, Strategies, he defines this as the control problem.[29]

Vernor Vinge has suggested that a moment may come when computers and robots are smarter than humans. He calls this "the Singularity".[30] He suggests that it may be somewhat or possibly very dangerous for humans.[31] This is discussed by a philosophy called Singularitarianism.

Physicist Stephen Hawking, Microsoft founder Bill Gates and SpaceX founder Elon Musk have expressed concerns about the possibility that AI could evolve to the point that humans could not control it, with Hawking theorizing that this could "spell the end of the human race".[32] In 2009, experts attended a conference hosted by the Association for the Advancement of Artificial Intelligence (AAAI) to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard. They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons. They also noted that some computer viruses can evade elimination and have achieved "cockroach intelligence". They noted that self-awareness as depicted in science fiction is probably unlikely, but that there were other potential hazards and pitfalls.[30] Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns.[33][34][35]

Eliezer Yudkowsky believes that risks from artificial intelligence are harder to predict than any other known risks. He also argues that research into artificial intelligence is biased by anthropomorphism. Since people base their judgments of artificial intelligence on their own experience, he claims that they underestimate the potential power of AI. He distinguishes between risks due to technical failure of AI, which means that flawed algorithms prevent the AI from carrying out its intended goals, and philosophical failure, which means that the AI is programmed to realize a flawed ideology.[36]

Some experts and academics have questioned the use of robots for military combat, especially when such robots are given some degree of autonomous functions.[37] There are also concerns about technology which might allow some armed robots to be controlled mainly by other robots.[38] The US Navy has funded a report which indicates that as military robots become more complex, there should be greater attention to implications of their ability to make autonomous decisions.[39][40] One researcher states that autonomous robots might be more humane, as they could make decisions more effectively. However, other experts question this.[41]


On the other hand, a friendly AI could help reduce existential risk by developing technological solutions to threats.[36]

In PBS's Off Book, Gary Marcus asks "what happens if (AIs) decide we are not useful anymore?" Marcus argues that AI cannot, and should not, be banned, and that the sensible thing to do is to start thinking now about AI ethics.[42]

4.1.2 Nanotechnology

See also: Grey goo

Many nanoscale technologies are in development or currently in use.[43] The only one that appears to pose a significant global catastrophic risk is molecular manufacturing, a technique that would make it possible to build complex structures at atomic precision.[44] Molecular manufacturing requires significant advances in nanotechnology, but once achieved it could produce highly advanced products at low costs and in large quantities in nanofactories weighing a kilogram or more.[43][44] When nanofactories gain the ability to produce other nanofactories, production may only be limited by relatively abundant factors such as input materials, energy and software.[43]

Molecular manufacturing could be used to cheaply produce, among many other products, highly advanced, durable weapons.[43] Being equipped with compact computers and motors, these could be increasingly autonomous and have a large range of capabilities.[43]

Phoenix and Treder classify catastrophic risks posed by nanotechnology into three categories: (1) from augmenting the development of other technologies such as AI and biotechnology; (2) by enabling mass-production of potentially dangerous products that cause risk dynamics (such as arms races) depending on how they are used; (3) from uncontrolled self-perpetuating processes with destructive effects. At the same time, nanotechnology may be used to alleviate several other global catastrophic risks.[43]

Several researchers state that the bulk of risk from nanotechnology comes from the potential to lead to war, arms races and destructive global government.[43][45][46] Several reasons have been suggested why the availability of nanotech weaponry may with significant likelihood lead to unstable arms races (compared to e.g. nuclear arms races): (1) a large number of players may be tempted to enter the race since the threshold for doing so is low;[43] (2) the ability to make weapons with molecular manufacturing will be cheap and easy to hide;[43] (3) therefore, lack of insight into the other parties' capabilities can tempt players to arm out of caution or to launch preemptive strikes;[43][47] (4) molecular manufacturing may reduce dependency on international trade,[43] a potential peace-promoting factor;[48] (5) wars of aggression may pose a smaller economic threat to the aggressor, since manufacturing is cheap and humans may not be needed on the battlefield.[43]

Since self-regulation by all state and non-state actors seems hard to achieve,[49] measures to mitigate war-related risks have mainly been proposed in the area of international cooperation.[43][50] International infrastructure may be expanded, giving more sovereignty to the international level. This could help coordinate efforts for arms control.[51] International institutions dedicated specifically to nanotechnology (perhaps analogously to the International Atomic Energy Agency, IAEA) or general arms control may also be designed.[50] One may also jointly make differential technological progress on defensive technologies, a policy that players should usually favour.[43] The Center for Responsible Nanotechnology also suggests some technical restrictions.[52] Improved transparency regarding technological capabilities may be another important facilitator for arms control.[53]

Grey goo is another catastrophic scenario, which was proposed by Eric Drexler in his 1986 book Engines of Creation[54] and has been a theme in mainstream media and fiction.[55][56] This scenario involves tiny self-replicating robots that consume the entire biosphere, using it as a source of energy and building blocks. Nanotech experts including Drexler now discredit the scenario. According to Chris Phoenix, so-called grey goo "could only be the product of a deliberate and difficult engineering process, not an accident".[57]

4.1.3 Biotechnology

Biotechnology can pose a global catastrophic risk in the form of natural pathogens or novel, engineered ones. Such a catastrophe may be brought about by usage in warfare, terrorist attacks or by accident.[58] Terrorist applications of biotechnology have historically been infrequent.[58] To what extent this is due to a lack of capabilities or motivation is not resolved.[58]

Exponential growth has been observed in the biotechnology sector, and Nouri and Chyba predict that this will lead to major increases in biotechnological capabilities in the coming decades.[58] They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control (especially as the technological capabilities are becoming available even to individual users).[58]

Given current development, more risk from novel, engineered pathogens is to be expected in the future.[58] It has been hypothesized that there is an upper bound on the virulence (deadliness) of naturally occurring pathogens.[59] But pathogens may be intentionally or unintentionally genetically modified to change virulence and other characteristics.[58] A group of Australian researchers, for example, unintentionally changed characteristics of the mousepox virus while trying to develop a virus to sterilize rodents.[58] The modified virus became highly lethal even in vaccinated and naturally resistant mice.[46][60] The technological means to genetically modify virus characteristics are likely to become more widely available in the future if not properly regulated.[58]

Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and developing facilities to mitigate disease outbreaks (e.g. better and/or more widely distributed vaccines).[58]

4.1.4 Warfare and mass destruction

Further information: Nuclear holocaust

The scenarios that have been explored most frequently are nuclear warfare and doomsday devices. Although the probability of a nuclear war per year is slim, Professor Martin Hellman has described it as inevitable in the long run; unless the probability approaches zero, inevitably there will come a day when civilization's luck runs out.[61] During the Cuban missile crisis, President Kennedy estimated the odds of nuclear war as being "somewhere between one out of three and even".[62] The United States and Russia have a combined arsenal of 15,315 nuclear weapons,[63] and there are an estimated total of 16,400 nuclear weapons in existence worldwide.[64]

While popular perception sometimes takes nuclear war as "the end of the world", experts assign low probability to human extinction from nuclear war.[65][66] In 1982, Brian Martin estimated that a US-Soviet nuclear exchange might kill 400-450 million directly and maybe several hundred million more through follow-up consequences.[65]

Nuclear war could yield unprecedented human death tolls and habitat destruction. Detonating such a large amount of nuclear weaponry would have a long-term effect on the climate, causing cold weather and reduced sunlight[67] that may generate significant upheaval in advanced civilizations.[68]

4.1.5 Global warming

Main articles: Effects of global warming, Runaway climate change and Avoiding dangerous climate change

Global warming refers to the warming caused by human technology since at least the 19th century. Global warming reflects abnormal variations to the expected climate within the Earth's atmosphere and subsequent effects on other parts of the Earth. Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters. Effects of global warming include loss of biodiversity, stresses to existing food-producing systems, and increased spread of infectious diseases such as malaria.

It has been suggested that runaway global warming (runaway climate change) might cause Earth to become searing hot like Venus. In less extreme scenarios, it could cause the end of civilization as we know it.[69]

Using scenario analysis, the Global Scenario Group (GSG), a coalition of international scientists convened by Paul Raskin, developed a series of possible futures for the world as it enters a Planetary Phase of Civilization. One scenario involves the complete breakdown of civilization as the effects of global warming become more pronounced, competition for scarce resources increases, and the rift between the poor and the wealthy widens. The GSG's other scenarios, such as Policy Reform, Eco-Communalism, and Great Transition, avoid this societal collapse and eventually result in environmental and social sustainability. They claim the outcome is dependent on human choice[70] and the possible formation of a global citizens movement which could influence the trajectory of global development.[71]

4.1.6 Ecological disaster

Main article: Environmental disaster

An ecological disaster, such as world crop failure and collapse of ecosystem services, could be induced by the present trends of overpopulation, economic development,[72] and non-sustainable agriculture. Most of these scenarios involve one or more of the following: Holocene extinction event, scarcity of water that could lead to approximately one half of the Earth's population being without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, or massive water pollution episodes. A very recent threat in this direction is colony collapse disorder,[73] a phenomenon that might foreshadow the imminent extinction[74] of the Western honeybee. As the bee plays a vital role in pollination, its extinction would severely disrupt the food chain.

4.1.7 World population and agricultural crisis

Main articles: Malthusian catastrophe and Human overpopulation

The 20th century saw a rapid increase in human population due to medical developments and a massive increase in agricultural productivity[75] made by the Green Revolution.[76] Between 1950 and 1984, as the Green Revolution transformed agriculture around the globe, world grain production increased by 250%.

The Green Revolution in agriculture helped food production to keep pace with worldwide population growth, or actually enabled population growth. The energy for the Green Revolution was provided by fossil fuels in the form of fertilizers (natural gas), pesticides (oil), and hydrocarbon-fueled irrigation.[77] David Pimentel, professor of ecology and agriculture at Cornell University, and Mario Giampietro, senior researcher at the National Research Institute on Food and Nutrition (INRAN), place in their study Food, Land, Population and the U.S. Economy the maximum U.S. population for a sustainable economy at 200 million. To achieve a sustainable economy and avert disaster, the United States must reduce its population by at least one-third, and world population will have to be reduced by two-thirds, says the study.[78]

The authors of this study believe that the mentioned agricultural crisis will only begin to impact us after 2020, and will not become critical until 2050. Geologist Dale Allen Pfeiffer claims that coming decades could see spiraling food prices without relief and massive starvation on a global level such as never experienced before.[79][80]

Wheat is humanity's third-most-produced cereal. Extant fungal infections such as Ug99[81] (a kind of stem rust) can cause 100% crop losses in most modern varieties. Little or no treatment is possible, and infection spreads on the wind. Should the world's large grain-producing areas become infected, there would be a crisis in wheat availability, leading to price spikes and shortages in other food products.[82]

4.1.8 Experimental technology accident

Further information: Grey goo and Bioterrorism

Nick Bostrom suggested that in the pursuit of knowledge, humanity might inadvertently create a device that could destroy Earth and the Solar System.[83] Investigations in nuclear and high-energy physics could create unusual conditions with catastrophic consequences. For example, scientists worried that the first nuclear test might ignite the atmosphere.[84][85] More recently, others worried that the RHIC[86] or the Large Hadron Collider might start a chain-reaction global disaster involving black holes, strangelets, or false vacuum states. These particular concerns have been refuted,[87][88][89][90] but the general concern remains.

Biotechnology could lead to the creation of a pandemic, chemical warfare could be taken to an extreme, and nanotechnology could lead to grey goo, in which out-of-control self-replicating robots consume all living matter on Earth while building more of themselves; in each case, either deliberately or by accident.[91]

4.2 Non-anthropogenic

4.2.1 Global pandemic

Main article: Pandemic
The death toll for a pandemic is equal to the virulence (deadliness) of the pathogen or pathogens, multiplied by the number of people eventually infected. It has been hypothesised that there is an upper limit to the virulence of naturally evolved pathogens.[59] This is because a pathogen that quickly kills its hosts might not have enough time to spread to new ones, while one that kills its hosts more slowly or not at all will allow carriers more time to spread the infection, and thus likely out-compete a more lethal species or strain.[92] This simple model predicts that if virulence and transmission are not linked in any way, pathogens will evolve towards low virulence and rapid transmission. However, this assumption is not always valid, and in more complex models, where the level of virulence and the rate of transmission are related, high levels of virulence can evolve.[93] The level of virulence that is possible is instead limited by the existence of complex populations of hosts, with different susceptibilities to infection, or by some hosts being geographically isolated.[59] The size of the host population and competition between different strains of pathogens can also alter virulence.[94] Interestingly, a pathogen that only infects humans as a secondary host and usually infects another species (a zoonosis) may have little constraint on its virulence in people, since infection here is an accidental event and its evolution is driven by events in another species.[95]
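The death-toll identity at the start of this section, and the trade-off the simple model describes, can be illustrated with toy numbers. The sketch below is added for illustration only; none of its rates come from the cited epidemiology papers.

# Toy illustration of the points above; all rates are hypothetical.

def death_toll(case_fatality_rate, people_infected):
    # Death toll = virulence (fatality rate) x number eventually infected.
    return case_fatality_rate * people_infected

print(death_toll(0.02, 5e8))   # 2% fatality, 500 million infected: 1e7 deaths

# If virulence and transmission are unlinked, higher lethality simply
# shortens the infectious period and so reduces onward spread.
def secondary_cases(contacts_per_day, infectious_days):
    return contacts_per_day * infectious_days

print(secondary_cases(2, 3))   # highly lethal strain, host dies fast: 6
print(secondary_cases(2, 14))  # milder strain, host lingers: 28, so it out-competes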
There are numerous historical examples of pandemics[96] that have had a devastating effect on a large number of people, which makes the possibility of global pandemic a realistic threat to human civilization.
4.2.2 Climate change
Climate change refers to Earth's natural variations in climate over time. The climate has changed slowly, as during ice ages and the warmer periods when palm trees grew in Antarctica. It has been hypothesized that there was also a period called "snowball Earth" when all the oceans were covered in a layer of ice. These global climatic changes occurred slowly, prior to the rise of human civilization about 10 thousand years ago near the end of the last Major Ice Age, when the climate became more stable. However, abrupt climate change on the decade time scale has occurred regionally. Since civilization originated during a period of stable climate, a natural variation into a new climate regime (colder or hotter) could pose a threat to civilization.
Ice age

Main article: Ice age

In the history of the Earth, twelve ice ages are known to have occurred. More ice ages are possible at intervals of 40,000-100,000 years. An ice age would have a serious impact on civilization because vast areas of land (mainly in North America, Europe, and Asia) could become uninhabitable. It would still be possible to live in the tropical regions, but with possible loss of humidity and water. Currently, the world exists in an interglacial period within a much older glacial event. The last glacial expansion ended about 10,000 years ago, and all civilizations evolved later than this. Scientists do not predict that a natural ice age will occur anytime soon.

4.2.3 Volcanism

Main article: Supervolcano

A geological event such as massive flood basalt, volcanism, or the eruption of a supervolcano[97] could lead to a so-called volcanic winter, similar to a nuclear winter. One such event, the Toba eruption,[98] occurred in Indonesia about 71,500 years ago. According to the Toba catastrophe theory,[99] the event may have reduced human populations to only a few tens of thousands of individuals. Yellowstone Caldera is another such supervolcano, having undergone 142 or more caldera-forming eruptions in the past 17 million years.[100] A massive volcano eruption would produce an extraordinary intake of volcanic dust, toxic and greenhouse gases into the atmosphere, with serious effects on global climate (towards extreme global cooling: volcanic winter in the short term, and ice age in the long term) or global warming (if greenhouse gases were to prevail).

When the supervolcano at Yellowstone last erupted 640,000 years ago, the magma and ash ejected from the caldera covered most of the United States west of the Mississippi river and part of northeastern Mexico.[101] Another such eruption could threaten civilization. Such an eruption could also release large amounts of gases that could alter the balance of the planet's carbon dioxide and cause a runaway greenhouse effect, or enough pyroclastic debris and other material might be thrown into the atmosphere to partially block out the sun and cause a volcanic winter, as happened in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer. Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions of deaths[102] worldwide, due to the failure of the monsoon, resulting in major crop failures causing starvation on a massive scale.[102]

A much more speculative concept is the Verneshot: a hypothetical volcanic eruption caused by the buildup of gas deep underneath a craton. Such an event may be forceful enough to launch an extreme amount of material from the crust and mantle into a sub-orbital trajectory.

4.2.4 Megatsunami

Main article: Megatsunami

Another possibility is a megatsunami. A megatsunami could, for example, destroy the entire East Coast of the United States. The coastal areas of the entire world could also be flooded in case of the collapse of the West Antarctic Ice Sheet.[103] While none of these scenarios are likely to destroy humanity completely, they could regionally threaten civilization. There have been two recent high-fatality tsunamis, after the 2011 Tohoku earthquake and the 2004 Indian Ocean earthquake, although they were not large enough to be considered megatsunamis. A megatsunami could have astronomical origins as well, such as an asteroid impact in an ocean.

4.2.5 Geomagnetic reversal

Main article: Geomagnetic reversal

The magnetic poles of the Earth shifted many times in geologic history. The duration of such a shift is still debated. Theories exist that during such times, the Earth's magnetic field would be weakened or nonexistent, threatening electrical civilization or even several species by allowing radiation from the Sun, especially solar wind, solar flares or cosmic radiation, to reach the surface. These theories have been somewhat discredited, as statistical analysis shows no evidence for a correlation between past reversals and past extinctions.[104][105]

4.2.6 Asteroid impact

Main articles: Asteroid-impact avoidance and Impact event

Several asteroids have collided with Earth in recent geological history. The Chicxulub asteroid, for example, is theorized to have caused the extinction of the non-avian dinosaurs 66 million years ago at the end of the Cretaceous. If such an object struck Earth, it could have a serious impact on civilization. It is even possible that humanity would be completely destroyed. For this to occur, the asteroid would need to be at least 1 km (0.62 mi) in diameter, but probably between 3 and 10 km (2-6 miles).[107] Asteroids with a 1 km diameter have impacted the Earth on average once every 500,000 years.[107] Larger asteroids are less common. Small near-Earth asteroids are regularly observed.

In 1.4 million years, the star Gliese 710 is expected to start causing an increase in the number of meteoroids in the vicinity of Earth when it passes within 1.1 light years of the Sun, perturbing the Oort cloud. Dynamic models by García-Sánchez predict a 5% increase in the rate of impact.[108] Objects perturbed from the Oort cloud take millions of years to reach the inner Solar System.

4.2.7 Extraterrestrial invasion

Main article: Alien invasion

Extraterrestrial life could invade Earth[109] either to exterminate and supplant human life, enslave it under a colonial system, steal the planet's resources, or destroy the planet altogether.

Although evidence of alien life has never been documented, scientists such as Carl Sagan have postulated that the existence of extraterrestrial life is very likely. In 1969, the "Extra-Terrestrial Exposure Law" was added to the United States Code of Federal Regulations (Title 14, Section 1211) in response to the possibility of biological contamination resulting from the U.S. Apollo Space Program. It was removed in 1991.[110] Scientists consider such a scenario technically possible, but unlikely.[111]

4.2.8 Cosmic threats

A number of astronomical threats have been identified. Massive objects, e.g. a star, large planet or black hole, could be catastrophic if a close encounter occurred in the Solar System. In April 2008, it was announced that two simulations of long-term planetary movement, one at Paris Observatory and the other at University of California, Santa Cruz, indicate a 1% chance that Mercury's orbit could be made unstable by Jupiter's gravitational pull sometime during the lifespan of the Sun. Were this to happen, the simulations suggest a collision with Earth could be one of four possible outcomes (the others being Mercury colliding with the Sun, colliding with Venus, or being ejected from the Solar System altogether). If Mercury were to collide with Earth, all life on Earth could be obliterated: an asteroid 15 km wide is believed to have caused the extinction of the non-avian dinosaurs, whereas Mercury is 4,879 km in diameter.[112]

Another threat might come from gamma ray bursts.[113] Both threats are very unlikely in the foreseeable future.[114]

A similar threat is a hypernova, produced when a hypergiant star explodes and then collapses, sending vast amounts of radiation sweeping across hundreds of lightyears. Hypernovas have never been observed; however, a hypernova may have been the cause of the Ordovician-Silurian extinction events. The nearest hypergiant is Eta Carinae, approximately 8,000 light-years distant.[115] The hazards from various astrophysical radiation sources were reviewed in 2011.[116]

If the Solar System were to pass through a dark nebula, a cloud of cosmic dust, severe global climate change would occur.[117]

A solar superstorm, which is a drastic and unusual decrease or increase in the Sun's power output, could have severe consequences for life on Earth. (See solar flare.)

If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning.[118] Such an occurrence is called a vacuum metastability event.

5 Discredited scenarios

The belief that the Mayan civilization's Long Count calendar ended abruptly on December 21, 2012 was a misconception due to the Mayan practice of using only five places in Long Count calendar inscriptions. On some monuments the Mayans calculated dates far into the past and future, but there is no end-of-the-world date. There was a Piktun ending (a cycle of 13 Bak'tuns of 144,000 days each) on December 21, 2012. A Piktun marks the end of a 1,872,000-day, or approximately 5,125-year, period and is a significant event in the Mayan calendar. However, there is no historical or scientific evidence that the Mayans believed it would be a doomsday. Some believe it was just the beginning of another Piktun.[119]

The cataclysmic pole shift hypothesis was formulated in 1872. Revisited repeatedly in the second half of the 20th century, it proposes that the axis of the Earth with respect to the crust could change extremely rapidly, causing massive earthquakes, tsunamis, and damaging local climate changes. The hypothesis is contradicted by the mainstream scientific interpretation of geological data, which indicates that true polar wander does occur, but very slowly over millions of years. Sometimes this hypothesis is confused with the accepted theory of geomagnetic reversal, in which the magnetic poles reverse, but which has no influence on the axial poles or the rotation of the solid earth.

6 Precautions and prevention

Planetary management and respecting planetary boundaries have been proposed as approaches to preventing ecological catastrophes. Within the scope of these approaches, the field of geoengineering encompasses the deliberate large-scale engineering and manipulation of the planetary environment to combat or counteract anthropogenic changes in atmospheric chemistry. Space colonization is a proposed alternative to improve the odds of surviving an extinction scenario.[120] Solutions of this scope may require megascale engineering. Food storage has been proposed globally, but the monetary cost would be high. Furthermore, it would likely contribute to the current millions of deaths per year due to malnutrition. David Denkenberger and Joshua Pearce have proposed in Feeding Everyone No Matter What a variety of alternate foods for global catastrophic risks such as nuclear winter, volcanic winter, asteroid/comet impact, and abrupt climate change.[121] The alternate foods convert fossil fuels or biomass into food. Asteroid deflection has been proposed to reduce this impact risk. Nuclear disarmament has been proposed to reduce the nuclear winter risk.

Some precautions that people are already taking for a cataclysmic event include:

• Some survivalists have stocked survival retreats with multiple-year food supplies.

• The Svalbard Global Seed Vault is a vault buried 400 feet inside a mountain in the Arctic, with over ten tons of seeds from all over the world. 100 million seeds from more than 100 countries were placed inside as a precaution to preserve all the world's crops. A prepared box of rice originating from 104 countries was the first to be deposited in the vault, where it will be kept at -18 °C (0 °F). Thousands more plant species will be added as organizers attempt to get specimens of every agricultural plant in the world. Cary Fowler, executive director of the Global Crop Diversity Trust, said that by preserving as many varieties as possible, the options open to farmers, scientists and governments were maximized. "The opening of the seed vault marks a historic turning point in safeguarding the world's crop diversity," he said. Even if the permafrost starts to melt, the seeds will be safe inside the vault for up to 200 years. Some of the seeds will even be viable for a millennium or more, including barley, which can last 2,000 years, wheat (1,700 years), and sorghum (almost 20,000 years).[122]

6.1 Existential risk reduction organizations

• The Centre for the Study of Existential Risk studies four major technological risks: artificial intelligence, biotechnology, global warming and warfare.

• The Foresight Institute aims to increase the benefits and reduce the risks of nanotechnology.

• The Machine Intelligence Research Institute researches the risks of artificial intelligence.

• The Future of Humanity Institute researches the questions of humanity's long-term future, particularly existential risk.

• The Lifeboat Foundation has a website focusing on global catastrophic risks and futurology.[123]

• The Future of Life Institute aims to support research and initiatives for safeguarding life, considering new technologies and challenges facing humanity.[124]

6.2 Global catastrophic risk reduction organizations

• Global Catastrophic Risk Institute (GCR Institute) - A think tank for all things catastrophic risk

• Millennium Alliance for Humanity & The Biosphere - Growing a global community to advocate for sustainable practices

• X Center - Researching how to reason about unexpected events, both small and large scale

• WHO Global Alert and Response - Monitoring for epidemics

• Connecting Organizations for Regional Disease Surveillance - NTI is focused on reducing the risk from Weapons of Mass Destruction, and containment of damage after the fact

• USAID Emerging Pandemic Threats Program - A US government program seeking to prevent and contain naturally generated pandemics at their source

• Lawrence Livermore National Laboratory Global Security - Researching GCR for the US defense department

• Center for International Security and Cooperation - Focusing on political cooperation to reduce GCR

• World Institute for Nuclear Security - Focused on security and safety training for those who are involved in the nuclear industry

• Bulletin of the Atomic Scientists

7 See also

• 10 Ways to End the World
• Apocalyptic and post-apocalyptic fiction
• Anarcho-primitivism
• Degeneration
• Doomsday Clock
• Eschatology
• Future of the Earth
• Future of the Solar System
• Global Catastrophic Risk Institute
• New tribalism
• Outside Context Problem
• Rare events
• Survivalism
• Timeline of the far future
• Ultimate fate of the universe
• The Sixth Extinction: An Unnatural History (non-fiction book)

8 Notes

[1] Schulte, P. et al. (5 March 2010). "The Chicxulub Asteroid Impact and Mass Extinction at the Cretaceous-Paleogene Boundary". Science 327 (5970): 1214-1218. Bibcode:2010Sci...327.1214S. doi:10.1126/science.1177265. PMID 20203042.
[2] Bostrom, Nick (2008). Global Catastrophic Risks. Oxford University Press. p. 1.
[3] Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9.
[4] Weitzman, Martin (2009). "On modeling and interpreting the economics of catastrophic climate change". The Review of Economics and Statistics 91 (1): 1-19. doi:10.1162/rest.91.1.1.
[5] "Frequently Asked Questions". Existential Risk. Future of Humanity Institute. Retrieved 26 July 2013.
[6] Bostrom, Nick. "Existential Risk Prevention as a Global Priority". Existential Risk. Future of Humanity Institute. Retrieved 23 July 2013.
[7] "Observation Selection Effects and Global Catastrophic Risks", Milan Cirkovic, 2008.
[8] Bostrom, N. (2013). "Existential Risk Prevention as Global Priority". Global Policy 4. doi:10.1111/1758-5899.12002. Retrieved 2014-07-07.
[9] Bostrom, Nick. "Astronomical Waste: The opportunity cost of delayed technological development". Utilitas 15 (3): 308-314. doi:10.1017/s0953820800004076.
[10] Posner, Richard A. (2006). Catastrophe: Risk and Response. Oxford: Oxford University Press. ISBN 978-0195306477. Introduction, "What is Catastrophe?"
[11] Matheny, Jason Gaverick (2007). "Reducing the Risk of Human Extinction". Risk Analysis 27 (5).
[12] Asher, D.J., Bailey, M.E., Emelyanenko, V., and Napier, W.M. (2005). "Earth in the cosmic shooting gallery". The Observatory 125: 319-322.
[13] Ambrose 1998; Rampino & Ambrose 2000, pp. 71, 80.
[14] Rampino, M.R. and Ambrose, S.H. (2002). "Super eruptions as a threat to civilizations on Earth-like planets". Icarus 156: 562-569.
[15] "Global Catastrophic Risks Survey". Technical Report, 2008. Future of Humanity Institute.
[16] Jones, E. M. (March 1, 1985). ""Where is everybody?" An account of Fermi's question". Los Alamos National Laboratory (LANL), United States Department of Energy. Retrieved January 12, 2013.
[17] Ventrudo, Brian (5 June 2009). "So Where Is ET, Anyway?". Universe Today. Retrieved 10 March 2014. "Some believe [the Fermi Paradox] means advanced extraterrestrial societies are rare or nonexistent. Others suggest they must destroy themselves before they move on to the stars."
[18] Vinn, O. (2014). "Potential incompatibility of inherited behavior patterns with civilization". PublishResearch: 1-3. Retrieved 2014-03-05.
[19] Parfit, Derek (1984). Reasons and Persons. Oxford University Press. pp. 453-454.
[20] http://news.bbc.co.uk/2/hi/sci/tech/specials/washington_2000/649913.stm
[21] Posner, Richard (2004). Catastrophe: Risk and Response. Oxford University Press.
[22] Desvousges, W.H., Johnson, F.R., Dunford, R.W., Boyle, K.J., Hudson, S.P., and Wilson, N. (1993). "Measuring natural resource damages with contingent valuation: tests of validity and reliability". In Hausman, J.A. (ed), Contingent Valuation: A Critical Assessment, pp. 91-159 (Amsterdam: North Holland).
[23] Yudkowsky, Eliezer (2008). "Cognitive biases potentially affecting judgments of global risks".
[24] Karnofsky, Holden. "Possible Global Catastrophic Risks". GiveWell Blog. GiveWell. Retrieved 24 July 2013.
[25] "The Cambridge Project for Existential Risk". Cambridge University.
[26] "'Terminator center' to open at Cambridge University". Fox News. 2012-11-26.
[27] Bill Joy, "Why the future doesn't need us". In: Wired magazine. See also technological singularity. Nick Bostrom (2002). "Ethical Issues in Advanced Artificial Intelligence". http://www.nickbostrom.com
[28] Nick Bostrom (2002). "Ethical Issues in Advanced Artificial Intelligence". http://www.nickbostrom.com
[29] Superintelligence: Paths, Dangers, Strategies.
[30] "Scientists Worry Machines May Outsmart Man". By John Markoff, NY Times, July 26, 2009.
[31] "The Coming Technological Singularity: How to Survive in the Post-Human Era". By Vernor Vinge, Department of Mathematical Sciences, San Diego State University, (c) 1993 by Vernor Vinge.

[32] Rawlinson, Kevin. "Microsoft's Bill Gates insists AI is a threat". BBC News. Retrieved 30 January 2015.
[33] "Gaming the Robot Revolution: A military technology expert weighs in on Terminator: Salvation". By P. W. Singer, slate.com, Thursday, May 21, 2009.
[34] "Robot takeover", gyre.org.
[35] robot page, engadget.com.
[36] Yudkowsky, Eliezer. "Artificial Intelligence as a Positive and Negative Factor in Global Risk". Retrieved 26 July 2013.
[37] "Call for debate on killer robots". By Jason Palmer, Science and technology reporter, BBC News, 8/3/09.
[38] "Robot Three-Way Portends Autonomous Future". By David Axe, wired.com, August 13, 2009.
[39] "New Navy-funded Report Warns of War Robots Going 'Terminator'". By Jason Mick (Blog), dailytech.com, February 17, 2009.
[40] "Navy report warns of robot uprising, suggests a strong moral compass". By Joseph L. Flatley, engadget.com, Feb 18th 2009.
[41] "New role for robot warriors; Drones are just part of a bid to automate combat. Can virtual ethics make machines decisionmakers?". By Gregory M. Lamb / Staff writer, Christian Science Monitor, February 17, 2010.
[42] "The Rise of Artificial Intelligence". PBS Off Book. 11 July 2013. Event occurs at 6:29-7:26. Retrieved 24 October 2013. "...what happens if (AIs) decide we are not useful anymore? I think we do need to think about how to build machines that are ethical. The smarter the machines get, the more important that is... [T]here are so many advantages to AI in terms of human health, in terms of education and so forth that I would be reluctant to stop it. But even if I did think we should stop it, I don't think it's possible... if, let's say, the US Government forbade development in kind of the way they did development of new stem cell lines, that would just mean that the research would go offshore, it wouldn't mean it would stop. The more sensible thing to do is start thinking now about these questions... I don't think we can simply ban it."
[43] Chris Phoenix; Mike Treder (2008). "Chapter 21: Nanotechnology as global catastrophic risk". In Bostrom, Nick; Cirkovic, Milan M. Global Catastrophic Risks. Oxford: Oxford University Press. ISBN 978-0-19-857050-9.
[44] "Frequently Asked Questions - Molecular Manufacturing". foresight.org. Retrieved 19 July 2014.
[45] Drexler, Eric. "A Dialog on Dangers". foresight.org. Retrieved 19 July 2014.
[46] Sandberg, Anders. "The five biggest threats to human existence". theconversation.com. Retrieved 13 July 2014.
[47] Drexler, Eric. "ENGINES OF DESTRUCTION (Chapter 11)". e-drexler.com. Retrieved 19 July 2014.
[48] Tomasik, Brian. "Possible Ways to Promote Compromise". foundational-research.org. Retrieved 19 July 2014.
[49] "Dangers of Molecular Manufacturing". crnano.org. Retrieved 19 July 2014.
[50] "The Need for International Control". crnano.org. Retrieved 19 July 2014.
[51] Tomasik, Brian. "International Cooperation vs. AI Arms Race". foundational-research.org. Retrieved 19 July 2014.
[52] "Technical Restrictions May Make Nanotechnology Safer". crnano.org. Retrieved 19 July 2014.
[53] Tomasik, Brian. "Possible Ways to Promote Compromise". foundational-research.org. Retrieved 22 July 2014.
[54] Joseph, Lawrence E. (2007). Apocalypse 2012. New York: Broadway. p. 6. ISBN 978-0-7679-2448-1.
[55] Rincon, Paul (2004-06-09). "Nanotech guru turns back on 'goo'". BBC News. Retrieved 2012-03-30.
[56] Hapgood, Fred (November 1986). "Nanotechnology: Molecular Machines that Mimic Life". Omni. Retrieved 19 July 2014.
[57] "Leading nanotech experts put 'grey goo' in perspective". crnano.org. Retrieved 19 July 2014.
[58] Ali Nouri; Christopher F. Chyba (2008). "Chapter 20: Biotechnology and biosecurity". In Bostrom, Nick; Cirkovic, Milan M. Global Catastrophic Risks. Oxford University Press.
[59] Frank, S.A. (March 1996). "Models of parasite virulence". Q Rev Biol 71 (1): 37-78. doi:10.1086/419267. PMID 8919665.
[60] Jackson, Ronald J.; Ramsay, Alistair J.; Christensen, Carina D.; Beaton, Sandra; Hall, Diana F.; Ramshaw, Ian A. (2001). "Expression of Mouse Interleukin-4 by a Recombinant Ectromelia Virus Suppresses Cytolytic Lymphocyte Responses and Overcomes Genetic Resistance to Mousepox". Journal of Virology 75 (3): 1205-1210. doi:10.1128/jvi.75.3.1205-1210.2001. Retrieved 13 July 2014.
[61] http://www-ee.stanford.edu/~hellman/opinion/inevitability.html
[62] Nuclear Weapons and the Future of Humanity: The Fundamental Questions, by Avner Cohen, Steven Lee. http://books.google.ro/books?id=gYmPp6lZqtMC&printsec=frontcover&hl=ro#v=onepage&q&f=false page 237.
[63] List of states with nuclear weapons
[64] List of states with nuclear weapons#Statistics and force configuration
[65] Martin, Brian (1982). "Critique of nuclear extinction". Journal of Peace Research 19 (4): 287-300. doi:10.1177/002234338201900401. Retrieved 25 October 2014.

[66] Shulman, Carl (5 Nov 2012). "Nuclear winter and human extinction: Q&A with Luke Oman". Overcoming Bias. Retrieved 25 October 2014.
[67] http://climate.envsci.rutgers.edu/pdf/acp-7-1973-2007.pdf
[68] Bostrom 2002, section 4.2.
[69] Isaac M. Held, Brian J. Soden, "Water Vapor Feedback and Global Warming". In: Annu. Rev. Energy Environ 2000. Available online. Page 449.
[70] "World Lines: Pathways, Pivots, and the Global Future". Paul Raskin. 2006. Boston: Tellus Institute.
[71] "Dawn of the Cosmopolitan: The Hope of a Global Citizens Movement". Orion Kriegman. 2006. Boston: Tellus Institute.
[72] Chiarelli, B. (1998). "Overpopulation and the Threat of Ecological Disaster: the Need for Global Bioethics". Mankind Quarterly 39 (2): 225-230.
[73] Evans-Pritchard, Ambrose (6 February 2011). "Einstein was right - honey bee collapse threatens global food security". The Daily Telegraph (London).
[74] Lovgren, Stefan. "Mystery Bee Disappearances Sweeping U.S." National Geographic News. URL accessed March 10, 2007.
[75] "The end of India's green revolution?". BBC News. 2006-05-29. Retrieved 2012-01-31.
[76] "Food First/Institute for Food and Development Policy". Foodfirst.org. 2000-04-08. Retrieved 2012-01-31.
[77] "How peak oil could lead to starvation". Web.archive.org. 2009-05-27. Retrieved 2012-01-31.
[78] "Eating Fossil Fuels". EnergyBulletin.net. 2003-10-02. Retrieved 2012-01-31.
[79] "Agriculture Meets Peak Oil". The Oil Drum: Europe. Europe.theoildrum.com. Retrieved 2012-01-31.
[80] http://www.uncommonthought.com/mtblog/archives/2007/11/08/drawing-momentu.php

[81] "Cereal Disease Laboratory: Ug99 an emerging virulent stem rust race". Ars.usda.gov. Retrieved 2012-01-31.
[82] "Durable Rust Resistance in Wheat". Wheatrust.cornell.edu. Retrieved 2012-01-31.
[83] Bostrom 2002, section 4.8.
[84] Richard Hamming. "Mathematics on a Distant Planet".
[85] "Report LA-602, 'Ignition of the Atmosphere With Nuclear Bombs'" (PDF). Retrieved 2011-10-19.
[86] New Scientist, 28 August 1999: "A Black Hole Ate My Planet".
[87] Konopinski, E. J.; Marvin, C.; Teller, Edward (1946). "Ignition of the Atmosphere with Nuclear Bombs" (PDF) (Declassified February 1973) (LA-602). Los Alamos National Laboratory. Retrieved 23 November 2008.
[88] http://www.aps.org/units/dpf/governance/reports/upload/lhc_saftey_statement.pdf
[89] "Safety at the LHC".
[90] J. Blaizot et al., "Study of Potentially Dangerous Events During Heavy-Ion Collisions at the LHC", CERN library record, CERN Yellow Reports Server (PDF).
[91] Eric Drexler, Engines of Creation, ISBN 0-385-19973-2, available online.
[92] Brown NF, Wickham ME, Coombes BK, Finlay BB (May 2006). "Crossing the Line: Selection and Evolution of Virulence Traits". PLoS Pathogens 2 (5): e42. doi:10.1371/journal.ppat.0020042. PMC 1464392. PMID 16733541.
[93] Ebert D, Bull JJ (January 2003). "Challenging the trade-off model for the evolution of virulence: is virulence management feasible?". Trends Microbiol. 11 (1): 15-20. doi:10.1016/S0966-842X(02)00003-3. PMID 12526850.
[94] André JB, Hochberg ME (July 2005). "Virulence evolution in emerging infectious diseases". Evolution 59 (7): 1406-12. doi:10.1554/05-111. PMID 16153027.
[95] Gandon S (March 2004). "Evolution of multihost parasites". Evolution 58 (3): 455-69. doi:10.1111/j.0014-3820.2004.tb01669.x. PMID 15119430.
[96] "Near Apocalypse Causing Diseases, a Historical Look". postapocalypticsurvival.com. Retrieved 2012-05-05.
[97] Kate Ravilious (2005-04-14). "What a way to go". The Guardian.
[98] 2012 Admin (2008-02-04). "Toba Supervolcano". 2012 Final Fantasy.
[99] Science Reference. "Toba Catastrophe Theory". Science Daily.
[100] Breining, Greg (2007). Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. Voyageur Press. p. 256. ISBN 978-0-7603-2925-2.
[101] Breining, Greg (2007). "Distant Death". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN: Voyageur Press. p. 256. ISBN 978-0-7603-2925-2.
[102] Breining, Greg (2007). "The Next Big Blast". Super Volcano: The Ticking Time Bomb Beneath Yellowstone National Park. St. Paul, MN: Voyageur Press. p. 256. ISBN 978-0-7603-2925-2.
[103] "US West Antarctic Ice Sheet initiative".
[104] Plotnick, Roy E. (1 January 1980). "Relationship between biological extinctions and geomagnetic reversals". Geology 8 (12): 578. Bibcode:1980Geo.....8..578P. doi:10.1130/0091-7613(1980)8<578:RBBEAG>2.0.CO;2.
[105] Glassmeier, Karl-Heinz; Vogt, Joachim (29 May 2010). "Magnetic Polarity Transitions and Biospheric Effects". Space Science Reviews 155 (1-4): 387-410. Bibcode:2010SSRv..155..387G. doi:10.1007/s11214-010-9659-6.
[106] U.S. Congress (Spring 2013). "Threats From Space: a Review of U.S. Government Efforts to Track and Mitigate Asteroids and Meteors (Part I and Part II) - Hearing Before the Committee on Science, Space, and Technology, House of Representatives, One Hundred Thirteenth Congress, First Session". United States Congress (hearings held 19 March 2013 and 10 April 2013). p. 147. Retrieved 3 May 2014.
[107] Bostrom 2002, section 4.10.
[108] García-Sánchez, Joan et al. (February 1999). "Stellar Encounters with the Oort Cloud Based on HIPPARCOS Data". The Astronomical Journal 117 (2): 1042-1055. Bibcode:1999AJ....117.1042G. doi:10.1086/300723.
[109] "Twenty ways the world could end suddenly", Discover Magazine.
[110] Urban Legends Reference Pages: Legal Affairs (E.T. Make Bail).
[111] Bostrom 2002, section 7.2.
[112] Ken Croswell, "Will Mercury Hit Earth Someday?", Skyandtelescope.com, April 24, 2008. Accessed April 26, 2008.
[113] "Explosions in Space May Have Initiated Ancient Extinction on Earth", NASA.
[114] Bostrom 2002, section 4.7.
[115] Wanjek, Christopher (2005-04-06). "Explosions in Space May Have Initiated Ancient Extinction on Earth". NASA.
[116] Melott, A.L. and Thomas, B.C. (2011).
[122] Lewis Smith (2008-02-27). "Doomsday vault for world's seeds is opened under Arctic mountain". London: The Times Online.
[123] "About the Lifeboat Foundation". The Lifeboat Foundation. Retrieved 26 April 2013.
[124] "The Future of Life Institute". Retrieved May 5, 2014.

9 References

• Bostrom, Nick (March 2002). "Existential Risks: Analyzing Human Extinction Scenarios and Related Hazards". Journal of Evolution and Technology 9 (1).
• Corey S. Powell (2000). "Twenty ways the world could end suddenly", Discover Magazine.
• Martin Rees (2004). Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century - On Earth and Beyond. ISBN 0-465-06863-4.
• Jean-Francois Rischard (2003). High Noon: 20 Global Problems, 20 Years to Solve Them. ISBN 0-465-07010-8.
• Edward O. Wilson (2003). The Future of Life. ISBN 0-679-76811-4.

10 Further reading

• Derrick Jensen (2006). Endgame. ISBN 1-58322-730-X.


Astrophysical Ionizing Radiation and the Earth:
A
Brief Review and Census of Intermittent Intense Sources.
Astrobiology 11:
343361.
arXiv:1102.2830.
Bibcode:2011AsBio..11..343M.
doi:10.1089/ast.2010.0603.

Jared Diamond (2005). Collapse: How Societies


Choose to Fail or Succeed. ISBN 0-670-03337-5

[117] Fraser Cain (2003-08-04). Local Galactic Dust is on the


Rise. Universe Today.
[118] Coleman, Sidney;
De Luccia, Frank (1980Gravitational eects on and of vac06-15).
uum decay.
Physical Review D D21 (12):
33053315.
Bibcode:1980PhRvD..21.3305C.
doi:10.1103/PhysRevD.21.3305.

Huesemann, Michael H., and Joyce A. Huesemann


(2011). Technox: Why Technology Wont Save Us
or the Environment, Chapter 6, Sustainability or
Collapse, New Society Publishers, Gabriola Island,
British Columbia, Canada, ISBN 0865717044, 464
pp.
Joel Garreau, Radical Evolution, 2005
John Leslie (1996). The End of the World. ISBN
0-415-14043-9

[119] Apocalypse 2012 - Tall tales that the End of Days is coming in 2012. by Brian Dunning

Martin Rees, Our Final Hour (UK title: Our Final


Century), 2003, ISBN 0-465-06862-6

[120] Mankind must abandon earth or face extinction: Hawking, physorg.com, August 9, 2010, retrieved 2012-01-23

Meadows, Donella H. (1972). The Limits to Growth.


ISBN 0-87663-165-0

[121] D.C. Denkenberger and J. M. Pearce. Feeding Everyone


No Matter What: Managing Food Security After Global
Catastrophe, Elsevier, San Francisco, 2014.

Tainter, Joseph A, (1990). The Collapse of Complex


Societies, Cambridge University Press, Cambridge,
UK.

11 External links

Last Days On Earth (TV documentary), ABC News 2-hour Special Edition of 20/20 on 7 real end-of-the-world scenarios (Wed. Aug 30, 2006)
"What a way to go" from The Guardian. Ten scientists name the biggest danger to Earth and assess the chances of it happening. April 14, 2005.
"Confronting the New Misanthropy", by Frank Furedi in Spiked, April 18, 2006
Ted.com (video) - Stephen Petranek: 10 ways the world could end
Armageddon Online, a collection of doomsday scenarios and daily news
Doomsday Guide, a directory devoted to end-times theories
Top 10 Ways to Destroy Earth
Several potential world-ending scenarios
Countdown to Doomsday, with Today Show host Matt Lauer. SciFi.com (Syfy). 2006.
A website about existential risk by Nick Bostrom.
"Cognitive biases potentially affecting judgment of global risks" - a paper by Eliezer Yudkowsky discussing how various observed cognitive biases hamper our judgement of existential risk.
"Why the future doesn't need us", Wired.com, April 2000 - Bill Joy's influential call to relinquish dangerous technologies.
"Being present in the face of existential threat: The role of trait mindfulness in reducing defensive responses to mortality salience."

12 Text and image sources, contributors, and licenses

12.1 Text

Global catastrophic risk Source: http://en.wikipedia.org/wiki/Global%20catastrophic%20risk?oldid=656931077 Contributors: Bryan Derksen, Roadrunner, Zadcat, Stevertigo, Edward, Nealmcb, Dante Alighieri, Gabbe, IZAK, Slug, William M. Connolley, Emperor,
Jeandr du Toit, Timwi, Dbabbitt, Raul654, Pakaran, Jerzy, Owen, Chuunen Baka, Chealer, Lowellian, YBeayf, Bkell, Millosh, Netje, Timvasquez, Fastssion, Everyking, Curps, Alison, Mboverload, Chameleon, Beland, Jossi, Kuralyov, Icairns, Nickptar, NoPetrol,
Neutrality, Gerrit, Jbshaler, Discospinster, Rich Farmbrough, Leibniz, Amot, Stbalbach, Bender235, RJHall, Mr. Billion, El C,
Cphoenix, Mwanner, Tom, C1k3, Sietse Snel, Bobo192, Harley peters, Walkiped, Viriditas, Cmdrjameson, ZayZayEM, James barton,
I9Q79oL78KiL0QTFHgyc, Roy da Vinci, Manu.m, Hackle577, Tablizer, Slugmaster, Yamla, Hinotori, Tache, Bart133, Coblin, Schaefer, Isaac, SidP, Super-Magician, Knowledge Seeker, Tony Sidaway, Amorymeltzer, Cheyinka, Kitch, Y0u, Bacteria, Yeastbeast, Jeffrey O. Gustafson, Woohookitty, Mindmatrix, Mazca, ^demon, KevinOKeee, Steinbach, GregorB, Teemu Leisti, Rufous, SilhouetteSaloon, Siqbal, Graham87, BD2412, Josh Parris, Drbogdan, Rjwilmsi, Scandum, Marasama, Eyu100, Fred Bradstadt, Fish and karate,
FayssalF, Wragge, Itinerant1, Jaxal1, Losecontrol, Simishag, LeCire, Zotel, Glenn L, Idaltu, CJLL Wright, Elfguy, Measure, Wavelength, Rt66lt, Kwarizmi, Rxnd, Hydrargyrum, Tenebrae, Iamfscked, Gaius Cornelius, Ritchy, NawlinWiki, Keithonearth, Nirvana2013,
Robertvan1, Daanschr, Banes, JessicaX, PonyToast, Epipelagic, Deku-shrub, MrBark, CalgaryWikifan, Jkelly, WAS 4.250, Perspicacious, Arthur Rubin, Xaxafrad, JQF, Wsiegmund, Petri Krohn, GraemeL, CWenger, Shawnc, Katieh5584, Paranoiaagent13, Robertd,
SmackBot, AaronM, Zazaban, KnowledgeOfSelf, McGeddon, Ramdrake, Leki, Pkirlin, Delldot, Hardyplants, William Case Morris, GraemeMcRae, FMB, Gilliam, Portillo, OrionK, Ohnoitsjamie, Betacommand, Carl.bunderson, Squiddy, Jero77, Chris the speller, Kidane,
Miguel Andrade, MovGP0, Gabrinae, WikiPedant, Jahiegel, PeteShanosky, Akhilleus, Nixeagle, JonHarder, Thisisbossi, Ruw1090, TKD,
LouScheer, Auno3, Cybercobra, MureninC, Savidan, John D. Croft, SpacemanAfrica, Paul H., Badgerpatrol, Puball, Doodle77, Metamagician3000, Will Beback, Abi79, Ojophoyimbo, Bcasterline, Anlace, SuperTycoon, Loodog, Gobonobo, Ishmaelblues, Sir Nicholas
de Mimsy-Porpington, JoshuaZ, JorisvS, Woer$, Kar-ma, Kransky, Scetoaux, Milomind, Yurez, Ckatz, Light Mp5, Funnybunny, David
Souther, Hectorian, Stephen B Streater, NEMT, Joseph Solis in Australia, Cro..Scream, J Di, StephenBuxton, Philip ea, Majora4, Ewulp,
Zaphody3k, Winston Spencer, Dia^, Kendroche, Wolfdog, CRGreathouse, Liam Skoda, Runningonbrains, Neelix, Blue403, Ajkgordon,
Shanoman, Cydebot, Fluence, Trasel, Asymptote, GRBerry, Chasingsol, Griyn1987, Dr.enh, Teratornis, Fourthhorseman, Lewisskinner,
Omicronpersei8, Cancun771, Maziotis, Mathpianist93, BetacommandBot, Epbr123, Barticus88, Biruitorul, AlexanderTG, Jimmymags,
Java13690, Bksimonb, PJtP, Dfrg.msc, Mailseth, Boxyisaturtle, Sturm55, Natalie Erin, E30e, The pink panther, Stonemaccas, AnemoneProjectors, Primium mobile, TimVickers, Zero g, Blair Bonnett, LonTheCleaner, Kent Witham, Zidane tribal, JAnDbot, MER-C, Planetary, The Transhumanist, Hello32020, Smulthaup, Angelofdeath275, Magioladitis, VoABot II, Mbarbier, Doug Coldwell, Tedickey, David
Scorpio, Mwalimu59, Indon, Strategest, BatteryIncluded, A3nm, Undaunted, JDavidW, TheRanger, GalaxyDancer, Kheider, DGG, Oroso,
Gwern, Nicholas Weiner, Stephenchou0722, MartinBot, Shoeshinecs, Ghostwo, Ekotekk, Sm8900, R'n'B, MapleTree, AlexanderBres,
Smokizzy, OStewart, RaccoonFox, RockMFR, Manticore, J.delanoy, AstroHurricane001, Rlsheehan, Bogey97, Keithkml, Thaurisil, OttoMkel, McSly, Thomas Larsen, P4k, Mikael Hggstrm, Comp25, Judawi, Skullers, Bobianite, Phirazo, Burkhard.Plache, Jevansen,
Pdcook, Useight, GIBBOUS3, X!, CWii, Murderbike, Jiznoplyer1, Trex21, RingtailedFox, Wrev, Zulroth, Antiarchangel, Stagyar Zil
Doggo, EnterStanman, Murph311, Davehi1, Thepulse2007, 8doodles, Pwnage8, Allouba, Abbail, Weverka, Iscariot Ex Machina, Wiikipedian, Melsaran, MackSalmon, Terrashi, Gsydnor, Bentley4, SkifAlef, Jennythereader, Steve3849, Wingedsubmariner, Knightshield,
Imadeitmyself, Wasted Sapience, Vikrant42, Spinningspark, Northfox, Cosprings, SveinMarvin, Tresiden, GeocideForums, Winchelsea,
Andrewjlockley, Voldemore, Tiptoety, Radon210, Apple 30, AngelOfSadness, Sasawat, Hobartimus, Senor Cuete, Jruderman, Likeminas,
Wonkipedian, Belligero, Turchin, Sindala, Freewayguy, ImageRemovalBot, Mko2okm, Jw2034, Martarius, Hard2Win, ClueBot, Artichoker, Carlos10132, Fyyer, Plastikspork, Lawrence Cohen, Michel47, Tomasdemul, Otolemur crassicaudatus, Phenylalanine, Puchiko,
Awickert, Verwoerd, Eeekster, L.tak, Tnxman307, SchreiberBike, Vanished user gq38pi53aq, Joe N, Thingg, Aitias, Terminator484, Stercorariuscyaneusparvulacataphractus, NJGW, Editor2020, Akmikey82, DumZiBoT, Zombie Hunter Smurf, XLinkBot, Jytdog, Laurips,
Feyrauth, AlexNebraska, JCDenton2052, Christine T, Atoric, NCDane, Addbot, DOI bot, Jdtapaboc, Zellfaze, Brekass, Cst17, Tweshadhar, Glane23, Michaelwuzthere, Fottry55i6, Kyle1278, Patton123, Lightbot, OlEnglish, Robo56, Legobot, Alexander Wideeld, Yobot,
Legobot II, Yngvadottir, Ayrton Prost, Empireheart, Apollofox, Richboy33lb, AnomieBOT, Ssaalltru2, RayvnEQ, Citation bot, Avocats, Clark89, LilHelpa, JimVC3, Capricorn42, Villa88, Nasnema, DownAirStairsConditioner, Gap9551, J04n, Abce2, Colfulus, Realson, Amaury, Spellage, HRIN, Green Cardamom, FrescoBot, Surv1v4l1st, Steve Quinn, Bromley86, Citation bot 1, Dogaru Florin,
Pinethicket, Tom.Reding, Thinking of England, Serols, Jandalhandler, James Doehring, Full-date unlinking bot, UnderHigh, Vrenator,
Diannaa, Tbhotch, Hans3778, RjwilmsiBot, Noommos, Buggie111, WikitanvirBot, SpatialGiant, Slightsmile, Islamuslim, AnthroMimus,
Thecheesykid, Life in General, Illegitimate Barrister, Will9194, Brahma Kumari Pari, Anir1uph, AOC25, Bahudhara, Yiosie2356, Sodmy,
Wiggles007, Insidetheoutsidebox, TurtleMelody, ThePowerofX, K. the Surveyor, Terra Novus, Jhfound`oush jlkdsv njugerhufghew, ClueBot NG, Morgankevinj huggle, Ypnypn, Coastwise, Koornti, Handwerker, Funnyman653, Savetheworldfromislam, Widr, Theonesean,
Helpful Pixie Bot, Pennykohl, Gob Lofa, Bibcode Bot, Jeraphine Gryphon, BG19bot, NewsAndEventsGuy, Starship.paint, Brian Tomasik,
MusikAnimal, Olev Vinn, North911, Harizotoh9, Cac692, Wannabemodel, BattyBot, Curleekate, Lamilo1562, EuroCarGT, Axentoke,
Mysterious Whisper, Mogism, Aaronrulz12, Coolmanjosh, ZX95, Eyesnore, Madreterra, Rolf h nelson, Slavoslavski, Gruekiller, TeddyLiu, GGGudex, Aerolit, Annaproject, YiFeiBot, Ginsuloft, My-2-bits, Lrieber, StevenD99, Monkbot, Philomathtomas, Dsprc, Maidenhead1, ClaudeNelson, Welcome1To1The1Jungle, Jkramar, Polar Mermaid, Ulimit, FloraWilde, SoerenMind, Airsheezy, Elenchus vertejas,
Researcher1010, SherryCa, Julietdeltalima, Kisds, Stonejm9, Thwomps, Sirsurvivealot, KenTancwell, Lear419 and Anonymous: 631

12.2 Images

File:ExtinctDodoBird.jpeg Source: http://upload.wikimedia.org/wikipedia/commons/e/e0/ExtinctDodoBird.jpeg License: Public domain Contributors: ? Original artist: ?
File:Folder_Hexagonal_Icon.svg Source: http://upload.wikimedia.org/wikipedia/en/4/48/Folder_Hexagonal_Icon.svg License: Cc-by-sa-3.0 Contributors: ? Original artist: ?
File:Impact_event.jpg Source: http://upload.wikimedia.org/wikipedia/commons/c/cb/Impact_event.jpg License: Public domain Contributors: Transferred from en.wikipedia; transfer was stated to be made by User:Vojtech.dostal. Original artist: Original uploader was Fredrik at en.wikipedia


File:X-risk-chart-en-01a.svg Source: http://upload.wikimedia.org/wikipedia/commons/6/64/X-risk-chart-en-01a.svg License: CC BY-SA 3.0 Contributors: Own work (Based on X-risk chart.jpg) Original artist: Wrev0

12.3 Content license

Creative Commons Attribution-Share Alike 3.0
