
Audio Virus is Coming?

Written by ClickHelp Team
Last Updated on May 18th, 2018
Read Time: 1 minute


According to recent research, it is possible to hack voice-activated assistants like Apple's Siri and Amazon's Alexa. These systems respond to hidden commands that humans cannot hear. In 2016, researchers from the University of California, Berkeley, and Georgetown University showed that such commands could be embedded in YouTube videos, radio programs, and even in the white noise that many people like to listen to during the workday. This carries a real threat: through the speech recognition system, an attacker could make your phone place a call, open websites, make a purchase, or even unlock the door of your smart home.

Last year, researchers at Princeton University and China's Zhejiang University showed that voice-activated systems such as Google Now can be triggered by commands transmitted at inaudible ultrasonic frequencies. The Chinese researchers called the technique DolphinAttack.

But special hardware isn't even necessary to trigger these systems; ordinary TV audio can do it, so be careful while watching TV shows. For example, the South Park episode "White People Renovating Houses" triggers Amazon Alexa and Google Home.

One person mentioned that he had to unplug the phone because “this @SouthPark episode has set my @amazon Alexa off about 15 times so far.”

Another example is a Burger King ad that deliberately asked, "O.K., Google, what is the Whopper burger?", prompting Google's voice-activated system to read information about the Whopper from its Wikipedia page aloud.

Of course, this is funny now, but it could become a serious problem. According to Juniper Research, more than half of all households will have at least one smart speaker by 2021. And Berkeley researchers have published a paper describing how commands can be hidden inside music or spoken text.

Speaking of spoken text: as you know, technical writers create many kinds of content, including audio content and video documentation with sound. Sometimes a company pays a third party to produce audio versions of parts of its documentation. In the wrong hands, this can cause serious problems: hidden commands in an audio file produced by a third party could make your smartphone place an unwanted purchase, or even send confidential company data to an attacker. It is quite possible that anti-audio-virus programs will appear in the future to detect such harmful sounds. Who knows! For now, be careful with any audio content you receive from a third party. One workaround is to produce this content in-house rather than outsourcing it.
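As a toy illustration only (not a real defense, and not how any actual "anti-audio-virus" product works), one crude heuristic for flagging suspicious third-party audio is to check how much of a file's energy sits above the audible range, since attacks like DolphinAttack rely on near-ultrasonic frequencies. The function below is a hypothetical sketch using NumPy; the 18 kHz cutoff is an assumption, and real hidden-command attacks embedded in audible music would evade a check this simple.

```python
import numpy as np

def ultrasonic_energy_ratio(samples, sample_rate, cutoff_hz=18000):
    """Return the fraction of spectral energy at or above cutoff_hz.

    A high ratio in supposedly ordinary speech or music is suspicious.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2          # power spectrum
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    total = spectrum.sum()
    if total == 0:
        return 0.0
    return float(spectrum[freqs >= cutoff_hz].sum() / total)

# Demo: an audible 440 Hz tone vs. the same tone with a 20 kHz component mixed in.
sr = 48000
t = np.arange(sr) / sr                     # one second of audio
clean = np.sin(2 * np.pi * 440 * t)
tainted = clean + 0.5 * np.sin(2 * np.pi * 20000 * t)

print(ultrasonic_energy_ratio(clean, sr))    # close to 0
print(ultrasonic_energy_ratio(tainted, sr))  # clearly above 0
```

In practice you would run such a check over short windows of a real recording (loaded with a library such as `soundfile`), but the principle is the same: audio meant for humans should have almost no energy near the ultrasonic band.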

Take care!

Good luck with your technical writing!
ClickHelp Team