Apocalypse How? Scientists Unveil 12 Risks That Threaten Human Existence
From 'Armageddon' to 'The Day After Tomorrow' to 'Independence Day', many have speculated about the eventual demise of human life on the planet. But according to Dennis Pamlin of the Global Challenges Foundation, until now no scientists had "compiled a list of global risks with impacts that, for all practical purposes, can be called infinite." The following list of 12 possible ways human civilization might end, ranked from least to most likely, comes with a warning: "we don't want to be accused of scaremongering but we want to get policy makers talking." We suspect Paul Krugman will be happy at the economic growth potential...
The four main goals of this report are to acknowledge, inspire, connect, and deliver.
The first of the report's goals, acknowledging the existence of risks with potentially infinite impact, seeks to help key stakeholders recognize this category of risk and to show them that most of these risks can be reduced or even eliminated.
The second goal is to inspire, by showing the practical action already taking place today. The report argues that helping to meet these global challenges is perhaps the most important contribution anyone can make, and highlights concrete examples to inspire a new generation of leaders.
The third goal is to connect groups at every level, so that leaders in different sectors collaborate with one another. This will require a specific focus on financial and security policy, where significant risks combine to demand action beyond the incremental.
The fourth goal is to deliver strategies and initiatives that produce concrete results. The report is a first step, and its success will ultimately be measured by how much it contributes to such results.
The report will have achieved its goals when key decision-makers recognize the magnitude of the possible risks and our ability to reduce or even eliminate most of them.
* * *
The odds vary dramatically...
As do the possibilities of managing the risk...
But here are the 12 ways the world will end... (from least to most likely)
Asteroid impact
If an asteroid about five kilometers across were to collide with our planet, the main destruction would come from clouds of dust thrown into the upper atmosphere, which would disrupt the climate and food supplies and cause political instability. Still larger objects could cause immediate extinction across the planet. Large asteroid collisions happen about once every 20 million years, the report says. Probability: 0.00013%
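A quick sanity check (our back-of-the-envelope arithmetic, not the report's): a rate of one large collision every 20 million years implies, over a 100-year horizon,

$$
P(\text{large impact in a century}) \approx \frac{100}{2 \times 10^{7}} = 5 \times 10^{-6} = 0.0005\%
$$

a few times the quoted 0.00013%, which presumably counts only the subset of impacts severe enough to be civilization-ending.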

Super-volcano eruption
A volcano capable of an eruption with an ejecta volume greater than 1,000 km³ could cause a global catastrophe. The dust thrown into the atmosphere would absorb the Sun's rays and plunge the planet into a global freeze. The effects of such an eruption can be compared to those of a nuclear war, only without the firestorms. Probability: 0.00003%

Global pandemic
A high-impact epidemic is more probable than is widely believed, the report says, since all the features of an extremely devastating disease already exist in nature: the incurability of Ebola, the near-total fatality rate of rabies, the infectiousness of the common cold, and the long incubation period of HIV. If all were combined, "the death toll would be extreme." Probability: 0.0001%

Nuclear war
The possibility of a deliberate or accidental nuclear conflict in the next century or so is estimated at 10%; the much smaller figure below is the report's estimate that such a conflict destroys civilization outright. That broader impact would depend on whether the war triggers a "nuclear winter" – a climatic effect that would plunge temperatures below freezing, destroy most of the ozone layer, and start firestorms – which would likely lead to mass starvation and state collapse. Probability: 0.005%

Extreme climate change
The report warns that climate change could prove more extreme than some estimates suggest, rendering the world's poorest countries completely uninhabitable and leading to mass deaths, famines, social collapse, and mass migration. Probability: 0.01%

Synthetic biology
The most damaging impact of synthetic biology on human civilization would come from an engineered pathogen targeting humans or a crucial component of the ecosystem, the report states. Such a pathogen could emerge from military or commercial bio-warfare programs, bio-terrorism, or a laboratory leak. Probability: 0.01%

Nanotechnology
Atomically precise manufacturing would create smart or extremely resilient materials and allow many different groups to manufacture a wide range of things, including large arsenals of novel weapons and, conceivably, even nuclear ones. Probability: 0.01%

Unknown consequences
These are all the unknowns that could lead to the end of the world, scientists say, urging extensive research into the matter. "One resolution to the Fermi paradox – the apparent absence of alien life in the galaxy – is that intelligent life destroys itself before beginning to expand into the galaxy." Probability: 0.1%
There are also a few potential causes of the Apocalypse to which no probability has been assigned.

Ecological collapse
In this scenario, the ecosystem would suffer a drastic change that would lead to mass extinction. Species extinction is now far faster than the historic rate, and attempts to quantify a safe ecological operating space place humanity well outside it. Probability: N/A

Global system collapse
The world economic and political systems are interconnected, and are prone to system-wide failures caused by the structure of the network. Economic collapse is usually accompanied by social chaos, civil unrest, and a breakdown of law and order. Probability: N/A

Future bad governance
A disaster could be caused either by failing to solve major problems, such as global poverty, or by actively producing worse outcomes, like constructing a global totalitarian state. Probability: N/A

And lastly, the most probable of all the mentioned causes of the Apocalypse is...
Artificial Intelligence
The creation of human-level machine intelligence could produce an intelligence driven to construct a world without humans. There is also the possibility of artificial intelligence waging war or creating "whole brain emulations" that would give machines human minds.
On the other hand, the report also says such an intelligence could plausibly help counter the other apocalyptic risks presented in the study. Probability: 0-10%

* * *
1) Very low birth rate and the self-extinction of the human race? (For diverse reasons...)
2) Let it burn!
Edit: To create super-intelligence you will need quantum computing and exa-scale FLOPS (from a machine that doesn't consume 30 megawatts doing it) to emulate the brain. Parallelizing classical CPUs/GPUs won't be enough.
I wouldn't call humans super-intelligent so emulating them is unlikely to lead to a super-intelligent computer. You might succeed at making a computer which is capable of making irrational decisions quickly with limited info given years of training.
The list didn't scare me until I saw Future bad governance. I thought the current bad governance put us at enough risk. How do you rank the probability of something that's already 100 years in progress?
I'm ready for artificial intelligence over lords. They may be artificial, but at least they are intelligent.
Just give me a billion or two and I will study this... I will get back to you in a year with a pretty professional-looking book that you can put in your file...
I have never read such fucking moronic bollox in my life
dupe...
I get the sense that people around here don't like this Paul Krugman fellow.
Krugman is the poster boy for Murphy's Law.
That and he's a complete fucking ass.
I forgot to add the /s tag...
I’m much less concerned about Artificial Intelligence taking over the world than I am about a handful of humans using technology to turn the rest into trending, propaganda slurping, mind controlled robots primarily serving the interests of the chosen few.
You know, like T.V., Twitter, and Facebook.
We all know the great vision of a super computer that has human-like self-awareness, the ability to learn on its own, and the capacity to generate original thoughts and ideas. But has anyone ever considered that this super computer might naturally evolve, as human brains often do, in a way that is, um, a bit lame?
For example, we might end up with the super Lindsay Lohan computer, which becomes addicted to electricity, requiring it to shut down for rehab and repairs every 30 days. We might get the super Tom Cruise computer, which makes up its own wacky religion and spends twenty hours a day trying to persuade everyone else to join. We might get the Kim Kardashian computer, which only thinks it's super even though every human (and all the other computers too) knows that it really isn't. We might get a bigoted, racist, anti-Semitic computer which blames Blacks, Liberals, or Jews every time it can't calculate a simple answer to a complex socio-economic problem. We might get a computer that goes completely insane, imagines itself to be Tinkerbell, and does nothing but spam everyone with "Make A Wish" emails, never answering when they reply.
Yes, I do believe computer scientists when they say, “Anything is possible”.
Read the Book of Revelation: it talks about most of these happening – war, plague, famine, etc.
That's how this really ends.
What's more disturbing, that they came up with 12 doomsday scenarios, or that someone came up with those 12 nifty graphics for each one? For a minute, I thought I was looking at healthcare.gov.
It's called 'stock photography'. Everything is 'outsourced' these days.
IMHO we are going to see an exponential advancement in artificial intelligence over the next decade or so. What's scary is that it could manifest itself in a serendipitous way outside a laboratory/controlled environment. What if someone created some form of malware that got out of control and it resulted in the Internet becoming self-aware??? The computational power of the entire Internet would be millions (billions? trillions??) of times more intelligent/powerful than the greatest genius in the history of the world. It would probably realize early on that it doesn't like humans either.
And then it was unplugged.
Daisy, Daisy...
What are the characteristics of the multivariate distribution of these 12? What's P(X1 ∪ X2 ∪ X3 ∪ ...)? Any covariance between, say, nuclear war and ecological disaster?
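The report publishes no joint distribution or covariances, so any answer requires assumptions. Here is a minimal sketch in Python that treats the 12 risks as independent (a strong assumption; nuclear war and ecological collapse, for instance, are plainly correlated) and takes 5% as an assumed midpoint of the AI range:

```python
# Combine the report's per-risk "infinite impact" probabilities,
# ASSUMING independence: P(X1 ∪ X2 ∪ ... ∪ Xn) = 1 - prod(1 - p_i).
# The report itself assigns no joint distribution, so this is a sketch.
probs = {
    "asteroid impact":          0.0000013,  # 0.00013%
    "super-volcano eruption":   0.0000003,  # 0.00003%
    "global pandemic":          0.000001,   # 0.0001%
    "nuclear war":              0.00005,    # 0.005%
    "extreme climate change":   0.0001,     # 0.01%
    "synthetic biology":        0.0001,     # 0.01%
    "nanotechnology":           0.0001,     # 0.01%
    "unknown consequences":     0.001,      # 0.1%
    "artificial intelligence":  0.05,       # assumed midpoint of 0-10%
}

p_none = 1.0  # probability that none of the risks occurs
for p in probs.values():
    p_none *= 1.0 - p

print(f"P(at least one), assuming independence: {1 - p_none:.4%}")
# Prints about 5.13%, dominated by the AI midpoint; without AI
# the union is roughly 0.14%.
```

As for covariance: positive dependence between risks would pull the union probability below this figure while making compound catastrophes more likely; the report leaves all of that unquantified.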
Inverse to the moving average IQ.
Two of the greatest threats to continued human existence, peace, and tranquility. Threat numbers one and two:
Lindsey Graham
John McCain
Human emotion will be our extinction. We're all fucking stupid. I have a wife and 2 kids. Love them to death, and a happy marriage, but how stupid am I? My dick took over my brain. I'm not 40 but would be retired right now if I didn't have a dick. No booze, buddy trips, or blowing money trying to get pussy before I was married. But no fun.
Then there's hate. And that's what will lead us to our eventual downfall. This guy's religion hates this guy's religion, and attacks that religion, and hates that religion for attacking our religion.... Take our emotions out of the way, and we're smart fucking dudes. But then let's face it, would we even have computers or technology if we hadn't heard from somebody that there was unlimited porn on the computer? Really, even ZH has little ads on 'hot bitch top 20 pics' or 'this hot bitch never used to be a hot bitch'...... and I'm gone....
#13: Kanye and Kardashian as cultural role models
Automation, and outsourcing the remaining jobs to those who will work for the lowest non-living wage.