Voice commands in Siri, Alexa or Google Assistant can be hacked with… a laser

Scientists have discovered a new way to trigger voice commands in Siri, Alexa and Google Assistant. With a laser beam, an attacker can target a device from dozens of meters away.

Voice commands have been under scrutiny by security specialists for some time. Beyond the obvious problem of microphones collecting data without the user’s knowledge, the trouble is that a voice command can be given by anyone. This makes the feature susceptible to manipulation – for example, researchers have already forced activation using ultrasound. Worryingly, that attack works even though the devices have special filters for such frequencies. Now it turns out that no sound at all is needed to trigger a voice command.

Siri, Alexa and Google Assistant can be activated with a laser

That assistants such as Siri, Alexa and Google Assistant can be woken and controlled with a laser was discovered by researchers at the University of Michigan and the University of Electro-Communications in Tokyo.

The attack is possible because modern MEMS microphones react not only to air vibrations but also to a beam of light. Takeshi Sugawara noticed that pointing a laser at such a microphone makes it register sound at a certain frequency. Importantly, the effect did not depend on the light’s wavelength, only on its intensity.
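The principle can be sketched as simple amplitude modulation: the voice command is encoded as variations in laser intensity around a constant bias, and the MEMS microphone converts those intensity changes back into an audio signal. The following Python sketch is purely illustrative, with assumed bias and modulation values; the real attack drives a laser diode through a dedicated current driver.

```python
import numpy as np

SAMPLE_RATE = 16_000   # Hz, a typical rate for voice audio (assumption)
BIAS_MA = 200.0        # DC bias current keeping the laser on (assumed value)
SWING_MA = 150.0       # peak modulation depth around the bias (assumed value)

def audio_to_drive_current(audio: np.ndarray) -> np.ndarray:
    """Map a [-1, 1] audio waveform to a laser drive current in mA.

    The MEMS microphone responds to light *intensity*, so the voice
    signal is encoded as intensity variation: i(t) = bias + swing * s(t).
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_MA + SWING_MA * audio

# Example: modulate with a 440 Hz test tone instead of a recorded command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
tone = 0.8 * np.sin(2 * np.pi * 440 * t)
current = audio_to_drive_current(tone)
```

Because the audio rides on a DC bias, the drive current never goes negative, so the laser stays on and only its brightness varies in step with the voice signal.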

The researchers realized that this effect makes it possible to manipulate devices even from a long distance. The required equipment is not prohibitively expensive: a laser pointer for a few dollars, a laser diode driver (about $340), an audio amplifier (about $30) and a telephoto lens (about $200). The attacker also needs to know the specifications of the target device, which is only a small hurdle, since all the data can be found on the internet. Depending on the device being taken over, the laser must be calibrated differently.

The researchers managed to attack a device in the building next door, 75 m away; the windows of both buildings did not block the laser. The greatest distance they achieved was 110 m, limited only by the length of the corridor at their disposal.

Extremely dangerous attacks

Such attacks can become extremely dangerous, because with voice commands an attacker can, for example, open a garage door to enable a burglary, control other devices in the home, shop online, start vehicles remotely, or open door locks. The danger is therefore real and serious.

How can such an attack be avoided? It is a good idea to place voice-controlled devices at home where they cannot be seen from outside. A smartphone may be harder to attack this way because of its mobility, but it is still possible, so it should not be left in plain sight. We can also hope that manufacturers will improve voice recognition so that a device can be activated only by the user assigned to it.
