Hackers Can Use Lasers to Send Voice Commands to Phones, Smart Speakers

Hackers can use lasers to send voice commands to Siri, Alexa, Google Assistant and other voice assistants present on phones, tablets and smart speakers, researchers have demonstrated.

A group of researchers from the University of Michigan in the United States and the University of Electro-Communications in Japan has shown that a laser pointed at the microphone of a device can be used to issue commands that get processed by the voice assistant present on the device.

The attack method, dubbed Light Commands, relies on the fact that MEMS (micro-electro-mechanical systems) microphones, which are used in many modern devices, react to light aimed directly at them.

Microphones are designed to convert sound waves into electrical audio signals, but researchers noticed that they react to a light source, such as a laser, similarly to how they react to sound waves. This can be used to “inject” sound into a microphone by modulating the intensity of the laser light.

In attacks aimed at voice assistants, the attacker can transmit a voice command attached to a laser beam to the targeted device’s microphone, which demodulates the signal. The device will process the resulting signal as it would a voice command sent by the user.
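For illustration, the following is a minimal sketch of the signal-processing idea, assuming a recorded command stored in a WAV file and a hypothetical laser driver that accepts per-sample intensity values in the range 0 to 1; it is not the researchers' actual tooling, and the function names are assumptions.

```python
# Sketch of the amplitude-modulation idea behind Light Commands.
# Assumptions (not the researchers' tooling): a recorded command in
# "command.wav" and a hypothetical laser driver that accepts per-sample
# intensity values in [0, 1] at the audio sample rate.

import numpy as np
from scipy.io import wavfile

def audio_to_intensity(path, modulation_depth=0.8):
    """Map an audio waveform onto a laser intensity envelope.

    The MEMS microphone responds to variations in incident light power,
    so encoding the audio as intensity variations around a constant bias
    lets the microphone effectively "demodulate" it back into a signal
    that the voice assistant processes as speech.
    """
    rate, samples = wavfile.read(path)
    samples = samples.astype(np.float64)
    peak = np.max(np.abs(samples))
    if peak > 0:
        samples /= peak                         # normalize to [-1, 1]
    bias = 0.5                                  # constant (DC) laser power
    intensity = bias * (1.0 + modulation_depth * samples)
    return rate, np.clip(intensity, 0.0, 1.0)

# rate, envelope = audio_to_intensity("command.wav")
# A laser driver would then turn 'envelope' into drive current at 'rate'.
```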

According to the researchers, an attacker could use the Light Commands attack to instruct voice assistants to execute various commands aimed at smart systems, including controlling home light switches, opening garage doors, unlocking and starting vehicles, opening locks, and making online purchases on behalf of the victim. Attacks can work against many devices, including phones and tablets running Siri or Google Assistant, as well as the Amazon Echo smart speaker and Facebook’s Portal smart display, both of which use the Alexa assistant.

The researchers have published a paper detailing their findings and they have set up a website that summarizes the Light Commands attack, including through some demo videos.

The attack has been tested on over a dozen devices from Apple, Google, Amazon, Facebook and Samsung. Depending on the device, the attack can be launched over distances of at least 110 m (360 ft) — the testing distance was limited by the length of a corridor available to the researchers — and it can work even through glass windows.

The experts say an attack can be launched with as little as $600 worth of commercially available equipment, including a simple laser pointer, a laser driver to modulate the signal, an audio amplifier for playing back recorded commands, and a telephoto lens to focus the laser.

A telescope or binoculars can be used to point the laser at the microphone of a device that is located at a large distance from the attacker’s setup. Alternatively, over smaller distances, an attacker could use a laser flashlight, which has a larger spot size, making it easier to aim at the targeted device’s microphone.

To make attacks stealthier, hackers could use invisible lasers and instruct the targeted device to lower its speaker volume so that the victim does not hear the voice assistant confirming the execution of a command.

According to the researchers, the most effective mitigations against such attacks would be adding an extra layer of authentication, using physical barriers to block direct light beams from reaching the microphone, and using multiple microphones, since a legitimate voice command would produce a signal at more than one microphone.
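As a rough illustration of the multi-microphone idea, the sketch below (an assumed check, not the researchers' implementation; thresholds are illustrative) flags a command whose energy is concentrated on a single microphone of an array, as a laser spot illuminates only one microphone while genuine speech reaches all of them.

```python
# Hedged sketch of the multi-microphone mitigation idea. The ratio
# threshold and structure are illustrative assumptions.

import numpy as np

def looks_like_light_injection(channels, ratio_threshold=10.0):
    """Return True if one microphone carries nearly all of the signal energy.

    channels: array of shape (num_mics, num_samples), one row per microphone.
    """
    energies = np.sum(np.asarray(channels, dtype=np.float64) ** 2, axis=1)
    energies = np.sort(energies)[::-1]          # loudest microphone first
    if len(energies) < 2:
        return False                            # cannot compare with a single mic
    if energies[1] == 0.0:
        return energies[0] > 0.0                # all energy on one microphone
    return (energies[0] / energies[1]) > ratio_threshold
```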
