Laser light commands could enable hackers to hijack Google Home and Amazon Echo personal assistants
The researchers claim that virtually any device featuring a microphone and voice-control software could be hijacked
Voice-controlled digital assistants, such as Google Home and Amazon Echo, can be hijacked from hundreds of feet away using a laser pointer.
That's according to cybersecurity researcher Takeshi Sugawara, who developed the technique for taking over smart speakers together with researchers from the University of Michigan.
The researchers carried out a series of demonstrations at Tokyo's University of Electro-Communications and the University of Michigan, showing that lasers can be used to send commands to smart speakers without any spoken instruction.
According to Sugawara, the trick can be used to carry out a variety of malicious activities, such as making online purchases, opening garage doors and remotely starting or stopping vehicles.
In one experiment, the researchers used a 60-milliwatt laser to send 'light commands' to 16 different voice-controlled devices, including smartphones and smart speakers, from a distance of 164 feet. Almost all of the devices registered the commands at that range.
The researchers said they were able to control an iPhone from about 33 feet away, while two Android smartphones were susceptible only from within about 16 feet.
In another experiment, the researchers aimed a five-milliwatt laser beam, similar to those in inexpensive consumer laser pointers, at targets about 361 feet away. While the low-intensity beam failed against most devices at that distance, it was still able to control a first-generation Echo Plus and a Google Home.
Based on their findings, the researchers claim that virtually any device featuring a microphone and voice-control software could be hijacked with a laser beam whose intensity is modulated by a voice signal: the device's microphone responds to the fluctuating light as if it were sound.
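As a rough illustration of that principle, the sketch below maps a voice waveform onto laser power levels around a constant bias. This is not the researchers' actual tooling; the constants, function and the test tone standing in for a recorded command are all hypothetical.

```python
# Conceptual sketch of amplitude-modulating a laser with a voice signal.
# Illustration of the principle only; all names and values are hypothetical.
import numpy as np

SAMPLE_RATE = 44_100   # audio sample rate in Hz
BIAS_MW = 30.0         # DC bias: the laser idles at this power (mW)
DEPTH_MW = 25.0        # modulation depth around the bias (mW)

def voice_to_intensity(voice: np.ndarray) -> np.ndarray:
    """Map a voice waveform in [-1, 1] to laser power levels in mW.

    The target's MEMS microphone responds to the modulated light much
    as it would to sound pressure, so the command encoded in the
    waveform is 'heard' by the voice assistant.
    """
    voice = np.clip(voice, -1.0, 1.0)
    return BIAS_MW + DEPTH_MW * voice   # power always stays positive

# Example: a 440 Hz test tone standing in for a recorded voice command.
t = np.arange(0, 1.0, 1.0 / SAMPLE_RATE)
tone = 0.8 * np.sin(2 * np.pi * 440 * t)
intensity = voice_to_intensity(tone)    # would feed a laser-driver DAC
```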
"The implications of injecting unauthorized voice commands vary in severity based on the type of commands that can be executed through voice," the researchers state on a website dedicated to the Light Commands vulnerability.
"As an example, in our paper we show how an attacker can use light-injected voice commands to unlock the victim's smart-lock protected home doors, or even locate, unlock and start various vehicles."
A Google spokesperson told Wired that the company was reviewing the research and was ready to take appropriate steps to protect the security of its products.
An Amazon spokesperson said they were willing to "engage with the authors to understand more about their work."