Stephen Hawking, the visionary physicist who passed away this week at the age of 76, made five predictions about how and when mankind will face its doom.
In a week when geopolitical tensions between the world's two leading nuclear powers, Russia and the U.S., are worsening, with the UK and U.S. accusing Russia of a nerve-agent attack and imposing sanctions in response, it is a good time to consider and heed Hawking's doomsday warnings, especially about nuclear war.
Below is an interesting article that looks at five of Hawking's doomsday predictions, as collated by Qt.com.au:
WE NEED an exit strategy. Fast.
From global warming to artificial intelligence, Professor Stephen Hawking made a number of terrifying predictions about the apocalyptic threats facing humanity.
The celebrated late scientist said humanity is at a "tipping point", and that our best bet will be to leave Earth completely.
Here are five main factors he said are contributing to the end of the world.
NUCLEAR WAR
In 2007, Prof Hawking fronted a campaign to cancel Trident, Britain's nuclear weapons deterrent.
"Nuclear war remains the greatest danger to the survival of the human race," he said.
"To replace Trident would make it more difficult to get arms reduction, and increase the risk.
"It would also be a complete waste of money because there are no circumstances in which we would use it independently."
Prof Hawking also identified "aggression" as the human trait that will destroy us all.
He warned it could lead to irrational actions, like sparking nuclear war.
"I fear evolution has in-built greed and aggression to the human genome," he told the BBC. "There is no sign of conflict lessening, and the development of militarised technology and weapons of mass destruction could make that disastrous. The best hope for the survival of the human race might be independent colonies in space."
CLIMATE CHANGE
Prof Hawking made it clear he was not a fan of Donald Trump.
He was particularly critical of the US President after Mr Trump vowed to pull out of the Paris Agreement on climate change.
"Climate change is one of the great dangers we face and it's one we can prevent if we act now," he told the BBC. "By denying the evidence for climate change and pulling out of the Paris Climate Agreement, Donald Trump will cause avoidable environmental damage to our beautiful planet, endangering the natural world, for us and our children."
In a Skype talk delivered at the Starmus science and arts festival last year, he made his case more urgent.
"Unlike Donald Trump, who may just have taken the most serious and wrong decision on climate this world has seen, I am arguing for the future of humanity and a long-term strategy to achieve this," Prof Hawking said.
"We have given our planet the disastrous gift of climate change ... When we have reached similar crises there has usually been somewhere else to colonise ... But there is no new world, no utopia around the corner. We are running out of space, and the only places to go to are other worlds."
On ITV'S Good Morning Britain, Prof Hawking was asked to explain Mr Trump's ascendancy to the White House.
"I can't," he responded. "He is a demagogue who seems to appeal to the lowest common denominator."
ARTIFICIAL INTELLIGENCE
In recent years, Prof Hawking had raised the alarm about the potential threat of artificial intelligence.
Speaking at the Web Summit in Lisbon in November, the famous physicist said AI has the potential to be the best or worst thing humanity has ever seen and the scary reality is we just don't know which yet.
"We cannot know if we will be infinitely helped by AI or ignored by it and sidelined, or conceivably destroyed by it," he said.
While AI could be hugely beneficial for reducing poverty, disease and restoring the natural environment, it's impossible to predict "what we might achieve when our own minds are amplified by AI".
"AI could be the worst invention in the history of our civilisation, that brings dangers like powerful autonomous weapons or new ways for the few to oppress the many.
"AI could develop a will of its own, a will that is in conflict with ours and which could destroy us. In short, the rise of powerful AI will be either the best, or the worst thing ever to happen to humanity."
Hawking warned that scientists and global governments needed to focus on maximising AI's benefits for society rather than on pure capability.
"We need to employ effective management in all areas of its development," he said. "We stand on a threshold of a brave new world. It is an exciting, if precarious, place to be and you are the pioneers."
DEATH BY FIREBALL
Overpopulation is going to turn our planet into a red-hot fireball.
Prof Hawking warned the Earth will be reduced to a ball of fire within 600 years, as a growing population's soaring energy consumption overheats the planet.
In a video appearance at the 2017 Tencent WE Summit in Beijing, he said: "By the year 2600, the world's population would be standing shoulder to shoulder, and the electricity consumption would make the Earth glow red-hot."
To save ourselves, he said we must take a leaf out of Star Trek's book and "boldly go where no one has gone before".
He has also warned that over the next 100 years, we need to look to colonise Mars and other planets.
Speaking at the Starmus science festival in Norway last year, he said the Moon and Mars would be the best sites to begin new colonies, and said we could establish outposts on these sites within 30 and 50 years respectively, The Telegraph reported.
ASTEROID STRIKE
Prof Hawking warned that if global warming doesn't wipe us out, an asteroid strike will.
At the Starmus festival, he said it was only a matter of time before the Earth would be destroyed by either an asteroid, soaring temperatures or overpopulation.
"This is not science fiction. It is guaranteed by the laws of physics and probability," he said.
"To stay risks being annihilated.
"Spreading out into space will completely change the future of humanity. It may also determine whether we have any future at all."
Prof Hawking warned that becoming a "cosmic sloth" was not an option.
"Wherever we go we will need to build a civilisation, we will need to take the practical means of establishing a whole new ecosystem that will survive in an environment that we know very little about and we will need to consider transporting several thousands of people, animals, plants, fungi, bacteria and insects."
* * *