
‘AI helped push us to 85 seconds to midnight’ – Doomsday Clock

By Nicola Mawson, Contributing journalist
Johannesburg, 30 Jan 2026
The Doomsday Clock, run by The Bulletin of the Atomic Scientists, is set at 85 seconds to catastrophe. (Image: Nicola Mawson | The Doomsday Clock, Pexels and Freepik)

The rapid development of artificial intelligence (AI) and its weaponisation has helped push the Doomsday Clock to its closest point to catastrophe ever – 85 seconds to midnight.

The Bulletin of the Atomic Scientists, which runs the iconic Doomsday Clock, is warning that the world is even closer to midnight than before, and that’s partially because of AI.

In a recent post, the Chicago-based non-profit stated “it is now 85 seconds to midnight”, with midnight (00:00:00) representing the point at which the world hits irreversible catastrophe.

The Bulletin moved the clock hands closer to midnight because of escalating nuclear threats driven by tensions among nuclear-armed powers like the US, Russia and China; the ongoing effects of climate change; failing global cooperation; and the rapid development of AI.

The new time, it says on its website, is “the closest it has ever been to catastrophe”.

Tick-tock

The Bulletin, often advised by Nobel Laureates, has reset the minute hand on the Doomsday Clock 27 times since its debut in 1947, when it started out at seven minutes to midnight. This latest change moved it four seconds closer to total catastrophe.

Wikipedia notes that the clock has been set backward eight times and forward 19 times. The farthest time from midnight was 17 minutes in 1991, and the closest is 85 seconds in 2026.

When the Doomsday Clock was created, the greatest danger to humanity came from nuclear weapons, in particular from the prospect that the US and the then Soviet Union were headed for a nuclear arms race.

Atomic beginnings

The organisation dates its lineage back to 1945, when Albert Einstein helped found it just after the bombings of Hiroshima and Nagasaki, alongside the Manhattan Project scientists who created those bombs. Listed among its founders is Manhattan Project scientific director J Robert Oppenheimer – who is also known as the father of the atomic bomb.

Einstein also served as chair of The Bulletin’s board of sponsors until his death in 1955.

“The Bulletin began as an emergency action, created by scientists who saw an immediate need for a public reckoning in the aftermath of the atomic bombings of Hiroshima and Nagasaki,” it explains.

Now it is warning about a different type of warfare, as the US, Russia and China incorporate AI across their defence sectors despite the potential dangers of such moves.

The Bulletin specifically points out that the Trump administration has revoked a previous executive order on AI safety, a move it describes as a dangerous prioritisation of innovation over safety.

Innovation trumps safety

Within hours of taking office in January last year, US president Donald Trump revoked former president Joe Biden’s October 2023 order that set a national policy goal for safe and trustworthy AI development. Trump replaced it with Removing Barriers to American Leadership in AI, an order aimed at clearing regulatory obstacles to innovation.

Under that executive order, a national AI plan has been produced, and the White House has directed federal agencies to take specific steps, such as accelerating permitting for data centre infrastructure, addressing bias or what it calls “ideological” influences in government AI systems, and supporting exports of US AI technology.

Trump’s administration has also taken steps to expand underlying digital and physical infrastructure supporting AI, such as data centre growth, and is encouraging public and industry input on federal AI priorities.

“And the AI revolution has the potential to accelerate the existing chaos and dysfunction in the world’s information ecosystem, supercharging mis- and disinformation campaigns and undermining the fact-based public discussions required to address urgent major threats like nuclear war, pandemics and climate change,” the Bulletin warns.

“Our current trajectory is unsustainable. National leaders – particularly those in the United States, Russia and China – must take the lead in finding a path away from the brink. Citizens must insist they do so,” it says.
