Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems

A new THaW paper was published at USENIX Security last week. It shows how a laser aimed at a smart speaker's microphone from as far as 110 meters away can inject audio commands that the device accepts as coming from a legitimate user. The paper also proposes techniques for mitigating this vulnerability.
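For readers curious about the basic signal-injection idea, the short Python sketch below illustrates, in rough terms, how an audio command could be encoded as amplitude modulation of a laser's optical power. This is a toy illustration only, not the authors' attack code; the sample rate, bias power, and modulation depth are made-up values chosen for readability.

import numpy as np

# Toy sketch (not from the paper): amplitude-modulate a laser's optical
# power with an audio waveform. A microphone illuminated by the modulated
# light can respond to it as if it were a sound signal.

fs = 16_000                      # sample rate in Hz (hypothetical)
t = np.arange(0, 1.0, 1 / fs)    # one second of samples

# Stand-in for a recorded voice command; here just a 440 Hz tone.
command = 0.5 * np.sin(2 * np.pi * 440 * t)

p_dc = 50e-3                     # laser DC bias power in watts (hypothetical)
depth = 0.8                      # modulation depth (hypothetical)

# Optical power over time: DC bias plus audio-driven modulation,
# clipped at zero because optical power cannot be negative.
laser_power = np.clip(p_dc * (1 + depth * command), 0, None)

print(f"power range: {laser_power.min()*1e3:.1f} mW to {laser_power.max()*1e3:.1f} mW")

The paper itself covers the physical details of how the modulated light is transduced by MEMS microphones; see the link below.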

Takeshi Sugawara, Benjamin Cyr, Sara Rampazzi, Daniel Genkin, and Kevin Fu. Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems. In Proceedings of the USENIX Security Symposium (USENIX Security), pages 2631–2648, August 2020. USENIX Association.

Paper and video presentation at https://www.usenix.org/conference/usenixsecurity20/presentation/sugawara 

With a Laser, Researchers Say They Can Hack Alexa, Google Home or Siri

The New York Times just posted a story about this startling new research from Kevin Fu and his group, funded in part by THaW. The research team has posted more details here.