Hackers can control Siri, Alexa, and Google Assistant, scientists claim

UC Berkeley researchers found vulnerabilities in Siri and Alexa that allow hackers to embed stealthy commands in audio recordings

Siri and Alexa vulnerabilities illustrated

Researchers at UC Berkeley have shown[1] that Siri and Alexa are vulnerable to exploitation. Popular voice assistants can be fed secret commands undetectable to the human ear, allowing hackers to steal money from the device owner's account, unlock smart locks, make purchases online, place phone calls, and initiate other unauthorized activities while posing as the victim.

Studies conducted by UC Berkeley researchers since 2016 revealed how silent commands could be injected into audio recordings.[2] The researchers constructed audio adversarial examples against speech-to-text transcription neural networks: small perturbations that can be added to an arbitrary waveform. These alterations introduce faint digital noise that a human ear cannot make out, yet smartphones and smart speakers transcribe it as a command.
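To make the technique concrete, the following Python sketch shows how such an adversarial perturbation can be constructed in principle: gradient descent adjusts a small additive noise until a differentiable speech-to-text model transcribes a hidden command. The toy SpeechToText network, the loudness budget epsilon, and the encoded target command are illustrative stand-ins, not the researchers' actual model or code.

import torch
import torch.nn.functional as F

# Stand-in for any differentiable speech-to-text network; NOT the model
# used in the study, just enough structure to demonstrate the idea.
class SpeechToText(torch.nn.Module):
    def __init__(self, vocab_size=29):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Conv1d(1, 64, kernel_size=11, stride=4),
            torch.nn.ReLU(),
            torch.nn.Conv1d(64, vocab_size, kernel_size=11, stride=4),
        )

    def forward(self, wave):                      # wave: (batch, samples)
        logits = self.net(wave.unsqueeze(1))      # (batch, vocab, frames)
        return logits.permute(2, 0, 1).log_softmax(-1)  # CTC expects (frames, batch, vocab)

model = SpeechToText().eval()
waveform = torch.randn(1, 16000)     # stand-in for one second of benign audio
target = torch.tensor([[8, 5, 25]])  # hypothetical encoding of the hidden command

# Optimize a small additive perturbation so the model transcribes `target`
# while an amplitude bound keeps the added noise quiet.
delta = torch.zeros_like(waveform, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=1e-3)
epsilon = 0.01                       # loudness budget for the perturbation

for step in range(1000):
    log_probs = model(waveform + delta)
    loss = F.ctc_loss(
        log_probs, target,
        input_lengths=torch.tensor([log_probs.size(0)]),
        target_lengths=torch.tensor([target.size(1)]),
    )
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():            # project back into the quiet range
        delta.clamp_(-epsilon, epsilon)

adversarial_audio = waveform + delta.detach()

In a real attack the perturbation would be tuned against the production transcription model, which is what makes the result barely audible to people yet perfectly legible to the assistant.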

This kind of Siri and Alexa vulnerability enables hackers to activate the artificial intelligence (AI) systems on smartphones and smart speakers without their owners ever learning about the backdoor access.

Hackers haven't yet launched DolphinAttack-style exploits against Siri and Alexa

Luckily, neither Siri, Alexa, nor Google Assistant has been exploited by hackers in the wild, at least not yet. However, the detected vulnerability is extremely dangerous, as it would allow criminals to carry out all kinds of nefarious activities.

Siri,[3] for example, is a more powerful assistant than most people realize; users can learn more about its capabilities on the Hey Siri page available on the iPhone. It can download apps from the App Store, turn on Airplane Mode, use HomeKit to adjust lights or lock the door, read notifications, make phone calls, check email, and so on. The consequences of an attack through Siri could therefore be devastating.

According to Nicholas Carlini, one of the UC Berkeley researchers, Siri and Alexa vulnerabilities might initially be exploited to redirect web browsers to illegal websites (gambling, pornography) or to place calls to premium-rate telephone numbers. In his report, he also added:

We wanted to see if we could make it even more stealthy […] I assume that the malicious people already employ people to do what I do.

Companies are aware of the issue

Researchers began analyzing Siri's and Alexa's vulnerability to the injection of secret commands back in 2016. The results have been shared with the companies developing voice assistants, including Amazon, Apple, and Google.

Although the companies do not elaborate on the issue, it is clear that they are taking measures to secure their smart devices. Amazon refuses to disclose specific security measures, but according to the Checkmarx report,[4] the company has changed Alexa's settings so that it shuts down any uncommon session in which the microphone receives commands for longer than usual. The company also plans to tighten its policies in the future.

Apple is making improvements to Siri as well. According to the company, the HomePod[5] will be supplemented with multiple security features; for example, to unlock a door using the smart speaker or to access sensitive data, users will be asked to unlock their iPhone first.

All in all, we expect companies to secure their voice assistants before these devices reach mainstream usage.

About the author
Alice Woods

Alice Woods is the News Editor at 2-spyware and likes to teach users about virus prevention. She has been sharing her knowledge and research data with 2-spyware readers since 2014.


References