Voice assistants hackable by laser

03/11/2020

The more I think about it, the more I realize I’m a “house purist.” I like my home to be as free as possible from “extra” electronics of any kind. Sure, I have a laptop, a smartphone, and a rebuilt iPad from 10 years ago, and I’m connected to the Internet, but I don’t have cable TV and quite frankly, I refuse to put a voice assistant of any kind in my home … ever! It creeps me out that some random person could be listening, inserting themselves into my daily existence at any time. 

Recent research backs up my no-voice-assistant decision: 75 percent of U.S. households will be at risk of being hacked via voice assistants by 2025, and inaudible, invisible commands can be injected into voice-controlled devices simply by shining a laser at the device; no spoken words needed. 

Researchers at the University of Michigan and the University of Electro-Communications found that light can be converted into sound signals inside a microphone. This means that a remote attacker standing several meters away from a device, most of which are embedded with a common MEMS microphone, can inject arbitrary audio signals into the target microphone by aiming a modulated laser at the microphone’s aperture, covertly triggering the production of an acoustic pressure wave. Basically, the microphone responds to the laser light as if it were sound. 
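The core idea can be sketched in a few lines: the attacker amplitude-modulates the laser’s intensity with the waveform of a voice command, and the microphone picks the modulation up as audio. The parameters below (sample rate, tone frequency, modulation depth) are illustrative assumptions, not values from the research:

```python
import numpy as np

# Hypothetical sketch of the injection principle: modulate laser
# intensity with an audio waveform so a MEMS microphone "hears" it.

fs = 16_000                      # assumed audio sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)   # 10 ms of signal

# Stand-in for a recorded voice command (a pure 440 Hz tone here).
command = np.sin(2 * np.pi * 440 * t)

# Amplitude modulation: intensity = bias + depth * command.
# The bias keeps intensity non-negative, since light intensity
# cannot dip below zero the way a sound wave can.
bias, depth = 0.5, 0.4
laser_intensity = bias + depth * command

# Sanity check: the drive signal is physically valid.
assert laser_intensity.min() >= 0.0
```

The modulated intensity is what the laser driver would emit; the microphone’s diaphragm (or its photosensitive electronics) converts those intensity swings back into an electrical signal resembling the original command.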

So, what devices are vulnerable to this attack, now known as LightCommands? The researchers demonstrated the attack on many commercially available voice-controllable systems that use Siri, Portal, Google Assistant and Alexa. They successfully injected LightCommands at distances of more than 100 meters, even through clear glass windows. 

The researchers concluded that third-party hardware controlled through voice assistants, such as locks and cars, can also be compromised by LightCommands attacks, and they believe that the heat generated by lasers can be another effective way to inject false signals into sensors. 

If you still choose to have voice assistants in your home, at this point the only protections against LightCommands attacks are to: 

  1. Keep all voice assistants out of sight from the outside by physically blocking them from view through windows; and 
  2. Because LightCommands lets attackers inject commands as if they were a legitimate user, avoid giving voice assistants access to every connected IoT device/sensor you have invited into your life, since hackers can hijack any smart system attached to them.