You Should Mute Your Smart Speaker's Mic More Often


Does your voice assistant often seem a little too eager to chime in? A recent study by Ruhr University Bochum and the Max Planck Institute for Security and Privacy found over 1,000 words and phrases that Alexa, Siri, Google Assistant and Microsoft Cortana frequently misidentified as activation commands (also known as “wake words”). Here are a few examples, via Ars Technica’s reporting on the study:

Alexa: “unacceptable,” “election” and “a letter”

Google Home: “OK, cool” and “Okay, who is reading”

Siri: “a city” and “hey jerry”

Microsoft Cortana: “Montana”

According to the study, these false positives are common and easy to trigger by accident, which makes them a major privacy concern.
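Why do these near-misses happen? Real wake-word detectors compare acoustic features, not spellings, but a rough string-similarity score can serve as a toy stand-in to show how phrases like “a letter” land close to “Alexa.” The sketch below is purely illustrative; the use of Python’s difflib and the idea of a single similarity threshold are assumptions for demonstration, not how any actual detector works.

```python
# Toy illustration only: real wake-word detection compares sounds, not text.
# Plain string similarity stands in here for "sounds alike".
from difflib import SequenceMatcher

WAKE_WORDS = ["alexa", "ok google", "hey siri", "cortana"]
OVERHEARD = ["a letter", "election", "ok cool", "a city", "hey jerry", "montana"]

def similarity(a: str, b: str) -> float:
    # Ratio of matching characters (0.0 to 1.0); a crude stand-in
    # for the phonetic distance an acoustic model would compute.
    return SequenceMatcher(None, a, b).ratio()

for phrase in OVERHEARD:
    wake, score = max(((w, similarity(phrase, w)) for w in WAKE_WORDS),
                      key=lambda pair: pair[1])
    # A detector tuned to be forgiving (a low trigger threshold) catches
    # more real commands, but also more of these accidental near-misses.
    print(f"{phrase!r:12} closest to {wake!r:12} score={score:.2f}")
```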

Alexa, what’s the problem?

Voice assistants are always “listening” for an activation command. While they’re not necessarily recording, they’re clearly on alert. Once the AI recognizes a command, whether through a smart speaker or your phone’s mic, it records any subsequent audio it “hears” and sends it to a remote server, where algorithms work out what’s being asked. This is where the privacy concerns come in: even if the captured audio doesn’t activate anything server-side, it may still be recorded, saved and later listened to by employees working to refine the assistant’s speech recognition, checking whether a command was missed or misinterpreted.
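To make that flow concrete, here is a minimal sketch of the pipeline described above: listen locally, trigger on a wake word, then record and upload. Every name in it (the stub functions, the endpoint URL) is a hypothetical placeholder, not any vendor’s real API; the point is only to show where audio leaves your device.

```python
"""Minimal sketch of the wake-word pipeline. Everything is illustrative:
the audio capture is stubbed out with random bytes, and the endpoint is
a placeholder, not a real voice-assistant service."""

import random

UPLOAD_ENDPOINT = "https://voice.example.com/process"  # hypothetical

def capture_audio_frame() -> bytes:
    # Stand-in for reading a short buffer from the microphone.
    return bytes(random.getrandbits(8) for _ in range(320))

def detect_wake_word(frame: bytes) -> bool:
    # Stand-in for the on-device model. Real detectors trade off missed
    # activations against false positives like "election" -> "Alexa".
    return random.random() < 0.001

def record_until_silence() -> bytes:
    # Everything said after the (possibly mistaken) trigger is captured...
    return b"".join(capture_audio_frame() for _ in range(50))

def assistant_loop(max_frames: int = 10_000) -> None:
    for _ in range(max_frames):
        if detect_wake_word(capture_audio_frame()):
            clip = record_until_silence()
            # ...and leaves the device for server-side processing, where it
            # may also be stored and reviewed by humans.
            print(f"uploading {len(clip)} bytes to {UPLOAD_ENDPOINT}")

if __name__ == "__main__":
    assistant_loop()
```

Note that until the trigger fires, nothing in this loop leaves the device; the privacy exposure begins the moment a false positive starts the recording.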

This isn’t speculation; it’s how these “machine learning” systems actually get better: humans manually help the machines learn. They’re not autonomous beings. That practice has repeatedly led to privacy breaches, public backlash and legal trouble. Google is constantly under fire for how it uses personal data for advertising, and Amazon has repeatedly leaked or mishandled its users’ video and audio recordings. Apple has the “best” data privacy policies overall, but its contractors have been caught listening to and transcribing audio that Siri captured by accident.

The point is: if Alexa, Siri and Google Assistant are being activated accidentally, more of your personal interactions will be recorded and potentially accessed by outsiders, and who knows what they’re doing with that data. While each of these companies lets you manage and delete audio after it’s recorded, you should also take precautionary measures to make sure your smart devices are only listening when you want them to.

Tips for preventing mistaken voice assistant activations

[Ars Technica]