AI to render humans 'second most intelligent creations'
Digitally connected users are in the midst of an artificial intelligence (AI) revolution and can easily become targets for automated cyber attacks.
Speaking at ITWeb Security Summit 2023, cyber security specialist and author Mikko Hyppönen, chief research officer at WithSecure, said any connected smart device today should be considered vulnerable.
“If it’s programmable, it’s hackable,” said Hyppönen.
He noted that the advent of AI and widespread use of intelligent chatbot offerings like OpenAI’s ChatGPT triggered a race between humans and bots to see who gains artificial general intelligence (AGI) first.
AGI refers to machine intelligence on a par with human intelligence; if the bots win, he said, they will be able to perform any intellectual task a human can – and do it quicker and better.
For Hyppönen, humans must win the race in order to leverage AI’s capabilities while preventing domination by AI and retaining control over it.
“That’s exactly what OpenAI wants to achieve with its chat systems. It wants humans to achieve AGI first, ahead of the bots.”
He explained that the rationale behind this is that supreme human intelligence can be used to discover solutions to global socio-economic and political crises – including finding a cure for cancer, for example, or preventing more climate-related catastrophes.
But there is a flip side, as Hyppönen warned that should this source of powerful intelligence land in the wrong hands – such as terrorist organisations, power-hungry moguls or cyber criminal masterminds – the consequences could be catastrophic.
“We cannot control our creations, just like we cannot control our kids…our creations will ultimately become more powerful,” he said. “Eventually, mankind will emerge as the second most intelligent creation.”
In his book “If it’s smart, it’s vulnerable”, Hyppönen writes: “Artificial intelligence that can program is an interesting thought. Being program code itself, it could be made to examine and improve its own operation: AI could code a better version of itself, which would in turn code a better version. Quite soon, we wouldn’t be able to grasp the basics of how such AI works.”
He added: “Creating supreme intelligence in our own biosphere might sound like a fundamental evolutionary mistake. If we become the second most intelligent creatures on the planet, our entire existence may be in the balance. We would be no match for superior artificial intelligence. It is almost too easy to fool ourselves with the thought that, should AI become too intelligent, we could simply switch it off. But supreme artificial intelligence could anticipate our every move and ensure its survival in ways we couldn’t even imagine.”
One of these ways could be through full-scale cyber warfare. But this is unlikely, said Hyppönen.
The concern with cyber weapons is that, unlike conventional weaponry, there are no deterrents.
He cited nuclear capability as an example of an effective deterrent. “That’s why they have military parades to show off and showcase their military might and their weaponry. Nations won’t attack other nations with nuclear capability. But with cyber weapons, there are no deterrents.”
Cyber warfare is one of several dimensions in modern warfare and it’s a theme Hyppönen covers in his book, along with cyber espionage and the impact on governments and corporates.
Hyppönen advised governments on the continent to get rid of legacy technology as quickly and efficiently as possible. “Complexity is the enemy of security…running multiple old or outdated technologies creates complexity and vulnerability in systems, especially within the public sector.”
Public sector organisations are looking to digitally transform and utilise AI to strengthen service delivery.
The influence of AI in cyber security remains an issue that many sectors are grappling with.
This week, ITWeb reported on the launch of an AI industry body, the South African Artificial Intelligence Association. It was established to promote the advancement of responsible AI, and unite practitioners across multiple sectors, including government, academia and start-ups.