Voice Assistants Can Be Hacked Too, and You Should Protect Yourself

Voice assistants are extremely convenient, from ordering food to running internet searches and more. You no longer have to tap through your phone or make a call; your voice assistant does all the work. Millions of people in the United States use voice assistants like Siri or Alexa to order food, buy household goods, and even book travel.

But are these voice assistants secure? Are you exposing yourself to risk if a hacker can gain control of your voice assistant and make purchases without your knowledge? Let's explore. There is growing evidence, both theoretical and practical, that hackers can get into your voice assistants. Certain parts of the voice assistant ecosystem, particularly smart speakers, are far more vulnerable than you might expect.

Examples of Hacked Voice Assistants

We've seen that the Amazon Echo's voice recognition feature can be tricked into giving an unauthorized user access to the device, which then lets them make any credit card purchase they wish. There have also been many news stories about Alexa or Siri putting through orders placed by children for hundreds of dollars' worth of toys and games.

Those incidents are relatively harmless, though. More concerning are recent findings from researchers who have shown that hackers don't even need to be near your voice assistant to take control of it. Researchers in both the United States and China have demonstrated ways to send silent commands to a voice assistant, getting it to dial a phone number or open a website without the user ever hearing a thing. These are known as "dolphin attacks," and the technique could be used to unlock your doors, wire money, or make online purchases.

Even more recently, researchers discovered a way to access all the major voice assistants, including Siri, Alexa, and Google Home, by shining laser pointers at the devices from hundreds of feet away. As Frederic Keanes, a tech blogger at Writinity and Last Minute Writing, explains, "these researchers, from Japan and the United States, explained that attackers can use these voice commands powered by these laser beams to unlock a target's front door that was previously protected by a smart lock, open their garage doors, go on a shopping spree on online stores using the target's credit cards stored on their device, and they can even locate a vehicle, unlock, and start that vehicle if it has been connected to the voice assistant device".

With the laser beam technique, attackers need a physical line of sight from a few hundred feet away, so keeping your voice assistant out of view from outside your home offers some protection. Bear in mind, however, that laser beams can pass through glass windows. As for the dolphin attacks mentioned earlier, lab tests originally showed them working only when the attackers were in close proximity to the voice assistant. Last year, however, American researchers managed a successful attack from 25 feet away. That isn't a big risk yet, but it seems only a matter of time before more sophisticated technology enables attacks from much farther away.

How to Protect Yourself

Until now, voice-command attacks have been largely theoretical or isolated incidents. What's important to understand, though, explains Martha Wood, an IT expert at Draft Beyond and Research Papers UK, "is that fraudsters and hackers can adapt really rapidly to new technologies and are constantly seeking ways to improve their tactics. Although manufacturers are working on improving the security of their devices, you should still be taking protective measures yourself."

That means using strong passwords for all your devices, keeping your phone locked whenever you're not using it, and using two-factor authentication or a PIN to protect any voice assistant tasks involving your personal or financial data, health records, or home security. Alternatively, you can simply avoid linking that information to devices with voice commands enabled.


Professional writer Ashley Halsey works for Lucky Assignments and Gum Essays. She is involved in tech and cybersecurity projects and writes articles to help average people help themselves. She is keen to expose the risks of new technologies while allowing people to still benefit from them.
