Dolphin pitch could hack Siri

By Lauren Kate Rawlins, ITWeb digital and innovation contributor.
Johannesburg, 08 Sept 2017
VPAs could be hacked using the same high-pitched frequencies that dolphins emit.

Researchers have found that virtual personal assistants (VPAs) can be activated using high-pitched sounds inaudible to humans, which could open new attack opportunities for hackers.

VPAs like Apple's Siri, Samsung's Bixby, Amazon's Alexa and Microsoft's Cortana could be vulnerable to what researchers have dubbed the DolphinAttack.

Voice commands are becoming an increasingly popular way for people to interact with handsets, cars and home speakers to control Internet of things devices, set reminders and check the weather, among other things.

VPAs are always listening for an activation phrase like 'OK Google' or 'Hey Siri'. A report from the Department of Electrical Engineering at Princeton University has detailed how this phrase does not have to be audible to humans for the smartphone, car or home speaker to respond.

In a real-world experiment, the researchers demonstrated that inaudible voice commands can attack an Android phone and an Amazon Echo device with high success rates at a range of two to three metres.

A separate report by Zhejiang University in China showed how modulating voice commands onto ultrasonic carriers, at frequencies above 20kHz and therefore beyond the range of human hearing, could successfully 'wake up' a voice-controlled device.
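To illustrate the modulation step the Zhejiang team describes, here is a minimal sketch in Python, assuming a mono voice-command recording and a speaker capable of ultrasonic playback; the file names, the 30kHz carrier and the 192kHz output rate are illustrative assumptions, not values taken from the report.

```python
# Minimal sketch: amplitude-modulate a recorded voice command onto an
# ultrasonic carrier. Assumes a mono input file; names and frequencies
# are illustrative, not the researchers' exact parameters.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 30_000   # above the ~20kHz ceiling of human hearing
OUT_RATE = 192_000    # playback rate high enough to represent the carrier

rate, voice = wavfile.read("command.wav")   # hypothetical recorded command
voice = voice.astype(np.float64)
voice /= np.max(np.abs(voice))              # normalise to [-1, 1]

# Upsample the baseband command to the ultrasonic playback rate.
n_out = int(len(voice) * OUT_RATE / rate)
t = np.arange(n_out) / OUT_RATE
voice_up = np.interp(t, np.arange(len(voice)) / rate, voice)

# Classic amplitude modulation: nonlinearity in the target device's
# microphone later recovers the audible envelope from the carrier.
carrier = np.cos(2 * np.pi * CARRIER_HZ * t)
modulated = (1.0 + voice_up) * carrier / 2.0

wavfile.write("inaudible.wav", OUT_RATE, (modulated * 32767).astype(np.int16))
```

The key point is that the emitted signal contains energy only around the ultrasonic carrier, so a person nearby hears nothing, while the microphone's own nonlinear response demodulates it back into an ordinary voice command.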

"We [have] validated DolphinAttack on popular speech recognition systems, including Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Alexa," states the report.

"By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to airplane mode, and even manipulating the navigation system in an Audi automobile."

Attackers could also use the technique to open a malicious Web site that automatically downloads malware onto the device, or to send a fake e-mail or text message to a contact.

The researchers proposed hardware and software defences in the report, which adds that manufacturers should redesign voice-controllable systems to be resilient to inaudible voice command attacks.
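As a hypothetical illustration of what a software defence might look like (not the classifier the researchers themselves propose), the sketch below flags incoming commands whose spectra look unusually band-limited, since audio demodulated from an ultrasonic carrier can lack the higher-frequency content of live speech. The 6kHz cutoff and 1% threshold are assumptions chosen for illustration.

```python
# Hypothetical software check: flag recordings with suspiciously little
# spectral energy above a cutoff typical of live speech. Cutoff and
# threshold values are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

def looks_band_limited(path, cutoff_hz=6_000, min_ratio=0.01):
    """Return True if almost no energy lies above cutoff_hz."""
    rate, audio = wavfile.read(path)            # assumes a mono WAV file
    audio = audio.astype(np.float64)
    spectrum = np.abs(np.fft.rfft(audio)) ** 2  # power spectrum
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / rate)
    high = spectrum[freqs >= cutoff_hz].sum()
    total = spectrum.sum()
    return total > 0 and high / total < min_ratio

# Hypothetical gatekeeper: drop commands that fail the check.
if looks_band_limited("incoming_command.wav"):
    print("Rejecting command: spectrum atypical of live speech.")
```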

Google told the BBC it was investigating the claims made in the research.
