• Researchers found a way of hijacking voice assistants from the major tech firms using cheap lasers.
  • They discovered that shining even cheap laser pointers at the microphones in smart speakers and some smartphone models can cause the devices to interpret the light as sound.
  • The research team demonstrated how they were able to “speak” to smart speakers and smartphones running Google’s Assistant, Amazon’s Alexa, and Apple’s Siri using the lasers, even getting them to perform tasks like opening a garage door.
  • Smart speakers, which don’t require extra authentication, were particularly vulnerable to this kind of attack. Researchers tested popular models from all the major tech firms.
  • Google and Amazon told Business Insider they are reviewing the research for its security implications. Apple declined to comment. Facebook, which uses Amazon’s Alexa in its Portal speaker, did not immediately respond.

Turns out laser pointers are good for more than just confusing cats.

A team of researchers from Tokyo’s University of Electro-Communications and the University of Michigan has discovered that you can “hijack” voice-enabled devices by shining a laser at them.

The team found that microphones in some of the most popular smart speakers and smartphones on the market interpreted the bright light of the laser as sound.

They wrote: “Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio.”
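
For illustration only, here is a minimal Python sketch of the trick the researchers describe: a recorded audio waveform is used to modulate the current driving a laser, so the beam's brightness rises and falls in step with the "voice" command. The current values, names, and hardware assumptions below are hypothetical and are not taken from the team's actual setup.

```python
import numpy as np

SAMPLE_RATE = 44_100           # audio samples per second
BIAS_CURRENT_MA = 200.0        # hypothetical DC operating current for the laser diode (mA)
MODULATION_DEPTH_MA = 150.0    # hypothetical peak swing added on top of the bias (mA)

def audio_to_laser_current(audio: np.ndarray) -> np.ndarray:
    """Map a normalized audio waveform (-1..1) onto a laser drive current.

    A laser driver turns current into light intensity, so the beam brightens
    and dims with the recorded command. A microphone struck by that beam can
    produce an electrical signal resembling the original audio.
    """
    audio = np.clip(audio, -1.0, 1.0)
    return BIAS_CURRENT_MA + MODULATION_DEPTH_MA * audio

if __name__ == "__main__":
    # A 1 kHz test tone stands in for a recorded voice command.
    t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
    tone = 0.5 * np.sin(2 * np.pi * 1_000 * t)
    drive_current = audio_to_laser_current(tone)
    print(drive_current[:5])   # first few drive-current samples, in mA
```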

The team tested popular smart speaker models from all the major tech firms as well as some smartphones that variously run Google's Assistant, Amazon's Alexa, and Apple's Siri.

Photo: Amazon Echo speakers were among the devices the researchers found they could hack. Source: Shutterstock

Their list of devices included the Google Home, various Amazon Echo models, the Apple HomePod, and Facebook's Portal speaker, which runs Alexa. They also tested an iPhone XR, a Samsung Galaxy S9, and a Google Pixel 2.

The team found all were vulnerable to the attack, to varying degrees. They were able to hijack the tablets, phones, and speakers from some distance - and through windows. They hijacked a Google Home speaker from 110 meters away, for example.

Some of the devices were less vulnerable than others, as noted by Wired and in the team's paper. Some Android smartphones, the iPhone, and the iPad require additional authentication or a "wake word" from the user before carrying out certain actions. A hijacker would need to recreate a person saying a wake command like "Hey Siri" or "Ok Google" to wake up an assistant before they could carry out an attack.

But smart speakers don't have this extra layer of authentication.

The researchers used reasonably affordable laser pointers, ranging from $13.99 to $17.99, to carry out the attacks. To give the speakers specific instructions, however, the laser pointer had to be paired with a $27.99 sound amplifier and a $339 laser driver, a device used to control the intensity of the beam.

Here's a video of the team hacking a Google Home device to open a garage door using a cheap laser pointer:

In their paper the researchers warned the laser attack could also be used to unlock smartphone-connected front doors, shop online, or find and unlock cars, such as Teslas, connected to a victim's Google account.

A Google spokeswoman told Business Insider: "We are closely reviewing this research paper. Protecting our users is paramount, and we're always looking at ways to improve the security of our devices."

Amazon is also taking a closer look at the security of its devices following the paper's publication. "Customer trust is our top priority and we take customer security and the security of our products seriously. We are reviewing this research and continue to engage with the authors to understand more about their work," an Amazon spokeswoman said.

Apple declined to comment when contacted by Business Insider, and Facebook was not immediately available for comment.

The researchers noted that they haven't found any evidence to suggest this hack has been used in the real world. You can read the researchers' full paper here.