
11 Investigates: Hacking Your Home

A smart speaker can be hacked with a laser pointer and some know-how, researchers at the University of Michigan demonstrate.

ANN ARBOR, Mich. — From high atop the 167-foot tall Lurie Tower, a blue laser pierces through the night sky of the University of Michigan’s North Campus.

It cuts through Daniel Genkin’s office window, dancing over the speaker of a Google Home device on the windowsill.

The beam triggers the device, causing its lights to flash. Then a robotic voice simply states: “Opening.”

An “attacker” about two football fields away had taken over the device and made it execute a command.

Researcher Sara Rampazzi excitedly pumps her fist.

Rampazzi, Genkin, Benjamin Cyr, and Kevin Fu partnered with Takeshi Sugawara, a Tokyo university researcher, on a paper that highlighted the ability of light beams to hijack voice-activated assistants.

Titled “Light Commands,” the paper pointed out that while much attention has gone to improving the capabilities of voice-controllable systems, far less is known about protecting them against hardware and software attacks.

Credit: WTOL
Researcher Sara Rampazzi adjusts a voice-controlled assistant as it is hit with a laser beam from a nearby bell tower.

One recent night, Genkin and Cyr set up a laptop and a laser driver in the tower. Cyr’s command, “open garage door,” was recorded on the laptop and encoded onto the laser beam using the driver. The beam was then fired through a telescopic lens so it could travel several hundred yards. Using her phone, Rampazzi helped Genkin and Cyr aim the light.

Once the command-coded beam hit the device, the speaker’s diaphragm reacted the same way as it would to a voice. The device heard, “Open the garage door.”
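The underlying trick is amplitude modulation: the laser driver varies the beam’s intensity in step with the recorded audio, and the microphone’s diaphragm converts that varying light back into a voice-shaped electrical signal. The Python sketch below illustrates that idea in an idealized form; the function names, bias, and depth values are hypothetical stand-ins, not the researchers’ actual tooling.

```python
import numpy as np

FS = 16_000  # sample rate in Hz

def modulate(audio, bias=0.6, depth=0.3):
    """Map normalized audio ([-1, 1]) onto laser intensity.

    The driver holds the diode at a DC operating point (bias) and swings
    the optical power around it in step with the audio waveform.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return np.clip(bias + depth * audio, 0.0, 1.0)

def microphone_response(intensity, bias=0.6, depth=0.3):
    """Idealized microphone: the diaphragm tracks light intensity much as it
    would sound pressure, so removing the DC bias recovers the audio shape."""
    return (intensity - bias) / depth

# Stand-in for the recorded "open garage door" command: a one-second test tone.
t = np.arange(FS) / FS
command_audio = 0.8 * np.sin(2 * np.pi * 440 * t)

beam = modulate(command_audio)
heard = microphone_response(beam)

# In this idealized model, the device "hears" the command essentially intact.
assert np.allclose(heard, command_audio, atol=1e-6)
```

In practice there is noise, clipping, and the optics of aiming the beam, but the model captures why the speaker cannot tell light-induced diaphragm motion from sound-induced motion.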

The process has been replicated over and over with Amazon, Apple, and Google devices.

“This surprised me, and I immediately thought about the consequences,” Rampazzi said, recalling the first time the team pulled off the “hijacking.”

Those consequences depend on what the consumer has connected to the device. Cyr said they were able to use a simple command to add a laser pointer to an Amazon shopping cart.

“The worst-case scenario is the ability to steal cars, unlock doors, open garage doors, do voice purchasing,” Genkin said.

Credit: WTOL
Daniel Genkin, right, and Benjamin Cyr prepare to “hijack” a voice-controlled assistant from the Lurie Tower on the campus of the University of Michigan.

The researchers have been in touch with the makers of the voice-controlled assistants. 11 Investigates reached out to Amazon to get a comment on the research. A spokesman sent the following statement:

“Customer trust is our top priority and we take customer security and the security of our products seriously. We are reviewing this research and continue to engage with the authors to understand more about their work.”

The company added that there have been no reports of devices being maliciously hijacked and that there are simple steps customers can take to protect themselves.

Cyr agreed.

“From a consumer standpoint, the idea would be to make sure that your device is not within line of sight of anywhere obvious outside,” Cyr said. “Put it behind a TV or something like that, or even move it to an inside wall, away from a window. You want to do what you can to break the line of sight with the microphone port on the device.”

But the long-term goal of the researchers is for manufacturers to build speakers impervious to attack from a light beam.

“The thing I was most surprised about (by the study) is that it actually worked,” Genkin said. “There is so much we don’t know about hardware security.”

RELATED: FBI warns your smart TV could be spying on you

RELATED: Disney Plus user accounts already found on hacking sites
