A FRENCH GOVERNMENT security outfit has found a way to take control of a person’s Siri voice assistant, or the Google equivalent, from 16ft away and, no, we do not mean by shouting.
Reports, some of which are rather shrill, suggest that this is a real problem. Wired said that the hack has been developed by French government outfit ANSSI, and uses radio waves to trigger what would otherwise have been a human instruction.
The obvious risks are that victims might be tricked into downloading some crap from somewhere, or conned into calling up a premium-rate phone line.
There are some mitigating conditions. The would-be hacker requires the victim to have a headset with earbuds and a microphone plugged in – these are relatively common – and uses the headphone wire and jack as an antenna, sending radio pulses that the virtual assistants interpret as spoken commands.
The industry has a more considered opinion and reckons that this could turn out bad for the Internet of Things (IoT).
“This has been a very interesting year for software-defined radio hacks. We have seen hacks ranging from turning RAM chips into radios broadcasting air-gapped data to pita-sized radios stealing encryption keys to opening garage doors with a child’s toy,” said Craig Young, a security researcher at Tripwire.
Siri may be your personal assistant. But your voice is not the only one she listens to. As a group of French researchers have discovered, Siri also helpfully obeys the orders of any hacker who talks to her—even, in some cases, one who’s silently transmitting those commands via radio from as far as 16 feet away.
A pair of researchers at ANSSI, a French government agency devoted to information security, have shown that they can use radio waves to silently trigger voice commands on any Android phone or iPhone that has Google Now or Siri enabled, if it also has a pair of headphones with a microphone plugged into its jack. Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone. Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, send the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.
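The chain described above – modulate a voice command onto a radio carrier, let the headphone cord pick it up, let front-end non-linearities recover the audio – can be illustrated with a toy simulation. This is a sketch of the AM principle only, with scaled-down frequencies and a synthetic tone standing in for speech; it is not the researchers' actual tooling:

```python
import numpy as np

# Toy illustration of AM command injection (not the ANSSI setup):
# a "voice" waveform is amplitude-modulated onto a carrier, and an
# envelope detector (rectify + low-pass) recovers it, as a phone's
# audio front end might do via incidental non-linearities.

fs = 1_000_000          # simulation sample rate, Hz (toy value)
fc = 103_000            # carrier, scaled down so it fits below fs/2
t = np.arange(0, 0.01, 1 / fs)

# Stand-in for the voice command (a real attack would use recorded
# speech such as "OK Google ...").
voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)

# Classic AM: carrier scaled by (1 + m * voice), modulation depth m.
m = 0.8
carrier = np.cos(2 * np.pi * fc * t)
transmitted = (1 + m * voice) * carrier

# Envelope detection: rectify, then low-pass to strip the carrier.
rectified = np.abs(transmitted)
kernel = np.ones(50) / 50              # crude moving-average low-pass
recovered = np.convolve(rectified, kernel, mode="same")
```

After the filter, the dominant AC component of `recovered` sits back at the 1 kHz "voice" frequency, which is all a demodulating front end needs to hand the assistant an intelligible command.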
“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the researchers write in their paper.
Here’s a video showing the attack in action: In the demo, the researchers commandeer Google Now via radio on an Android smartphone and force the phone’s browser to visit the ANSSI website.
The researchers’ silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don’t have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user’s voice. iPhones have Siri enabled from the lockscreen by default, but the new version of Siri for the iPhone 6s verifies the owner’s voice just as Google Now does.
Most spy movies (at least the ones worth their salt) include a few scenes that depict nerds in a van listening in on conversations remotely and causing the victims’ phones to do things like turn themselves, or their cameras, on. We have been led to believe that it takes an entire van of equipment and one or two MIT-level hackers to pull this off. Turns out all it takes is about $2,300, some know-how, and an unsuspecting target with a set of microphone-equipped headphones attached to their phone.
The French Government’s information security research group ANSSI has been investigating this and published a paper with its findings. Unfortunately, that paper is behind a paywall. The short version: the researchers use a transmitter to induce a current in the headphone wires, which the phone treats as audio from its microphone. We think this is a really cool proof-of-concept.
Comments:
I expect the guys send an AM signal that gets detected by the audio input circuitry. Quite easy in theory.
Since audio signals through those jacks are single-ended, SNR is not going to matter. The ESD diodes could act as rectifiers for the RF signals captured by the headphone cable acting as an antenna.
From the paper: “It was also observed that the minimal field required around the target was in the range of 25–30 V/m at 103 MHz, which is close to the limit accepted for human safety”
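A quick back-of-the-envelope check shows why a frequency near 100 MHz suits a headphone cord as an antenna (the cord length below is a typical value, not a figure from the paper):

```python
# Rough check on the 103 MHz figure quoted from the paper.
c = 299_792_458          # speed of light, m/s
f = 103e6                # frequency reported in the paper, Hz

wavelength = c / f       # roughly 2.9 m
quarter_wave = wavelength / 4

# A headphone cord is on the order of 1 m long, i.e. comparable to a
# quarter-wavelength at this frequency, so it couples to the field
# reasonably efficiently.
```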
This attack is only relevant if you are wearing earphones that make a connection to the mic portion of the jack, and you need to have the voice-assistant feature enabled in the first place.
Interesting. In the past I have had to test audio equipment for compliance, and one test involved broadcasting an AM signal, modulated by a 1 kHz sine, at the device in a test chamber. We hooked the audio stage output to a spectrum analyser, and we were not meant to see a 1 kHz signal at all. If we did, it would mean the amplifier circuit was demodulating the AM, which would have been a fail. It passed. Tempest testing works along the same sort of lines.
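That compliance test can be mimicked numerically: pass an AM signal through a purely linear stage and through one with a small quadratic non-linearity, then look for the 1 kHz modulation at the output. This is a toy model with scaled-down frequencies, not the actual chamber setup:

```python
import numpy as np

# Toy model of the AM-immunity test described above: a carrier is
# AM-modulated with a 1 kHz sine; a perfectly linear stage shows
# nothing at 1 kHz, while any even-order non-linearity (e.g. a
# junction acting as a detector) demodulates the envelope.

fs = 1_000_000
t = np.arange(0, 0.02, 1 / fs)
carrier = np.cos(2 * np.pi * 100_000 * t)          # scaled-down carrier
am = (1 + 0.8 * np.sin(2 * np.pi * 1_000 * t)) * carrier

def power_at(signal, freq):
    """Magnitude of the FFT bin nearest `freq`."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    return spec[np.argmin(np.abs(freqs - freq))]

linear_out = 2.0 * am                   # ideal amplifier: gain only
nonlinear_out = 2.0 * am + 0.5 * am**2  # stage with quadratic distortion
```

Only the stage with the quadratic term shows energy at 1 kHz, which is exactly what the spectrum analyser in the chamber test screens for.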
3 Comments
Tomi Engdahl says:
Hackers claim to enable remote control of Siri and Google Now
Could be bad, depending on the circumstances
http://www.theinquirer.net/inquirer/news/2430691/hackers-claim-to-enable-remote-control-of-siri-and-google-now
Tomi Engdahl says:
Hackers Can Silently Control Siri From 16 Feet Away
http://www.wired.com/2015/10/this-radio-trick-silently-hacks-siri-from-16-feet-away/
Tomi Engdahl says:
RF Attack Controls Nearby Smartphones Via “Okay Google” And “Hey Siri”
http://hackaday.com/2015/10/17/rf-attack-controls-nearby-smartphones-via-okay-google-and-hey-siri/