Last week an Alexa-enabled Echo device recorded and shared a private conversation of its owners without their knowledge, raising public concern about the privacy risks of voice assistants.
On Thursday, news broke that a Portland family’s Echo device had recorded one of their conversations without their knowledge and sent the audio file to one of their contacts. The couple, whose names were not revealed, said the incident had happened two weeks earlier. They told news station KIRO 7 that they learned of the issue only when the contact who received the file called to say she had been sent a strange voice recording. They reported the matter to Amazon immediately.
Amazon confirmed the incident and explained in a statement that the Echo woke up because a word in the background conversation sounded like “Alexa.” The subsequent conversation was then heard as a “send message” request, and Alexa asked out loud, “To whom?” At that moment, the background conversation was transcribed as a name in the customer’s contact list. Alexa then asked out loud, “[contact name], right?” and interpreted the background conversation as “right.”
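Amazon's statement describes a chain of four misrecognitions, each one plausible on its own. The following sketch is purely illustrative (the function names, matching logic, and sample phrases are assumptions, not Alexa's actual implementation); it shows how loose keyword-style matching at every stage can let ordinary background speech walk all the way through a wake word, an intent, a contact name, and a confirmation:

```python
# Hypothetical sketch of the four-step misrecognition chain described in
# Amazon's statement. All names and matching rules here are illustrative;
# this is NOT how Alexa actually works.
from typing import Optional


def wake_word_detected(audio: str) -> bool:
    # Step 1: a word in background speech sounds like "Alexa".
    return "alexa" in audio.lower()


def heard_send_message(audio: str) -> bool:
    # Step 2: subsequent speech is interpreted as a "send message" request.
    return "send" in audio.lower()


def resolve_contact(audio: str, contacts: list) -> Optional[str]:
    # Step 3: background speech is matched against the contact list.
    for name in contacts:
        if name.lower() in audio.lower():
            return name
    return None


def heard_confirmation(audio: str) -> bool:
    # Step 4: "[contact name], right?" -- background speech heard as "right".
    return "right" in audio.lower()


def simulate(snippets: list, contacts: list) -> Optional[str]:
    """Walk the pipeline over four overheard snippets; return the contact
    a message would be sent to, or None if any step fails to match."""
    wake, intent_audio, contact_audio, confirm_audio = snippets
    if not wake_word_detected(wake):
        return None
    if not heard_send_message(intent_audio):
        return None
    contact = resolve_contact(contact_audio, contacts)
    if contact is None:
        return None
    return contact if heard_confirmation(confirm_audio) else None


# A background conversation that happens to trigger every step:
overheard = [
    "...I think Alexa's pretty good...",   # sounds like the wake word
    "...they'll send the forms over...",   # heard as a "send message" request
    "...we saw Bob at the store...",       # matches a name in the contact list
    "...yeah, right near the exit...",     # heard as confirmation
]
print(simulate(overheard, ["Bob", "Carol"]))  # -> Bob
```

The point of the sketch is that no single step needs to fail badly; four individually reasonable matches compose into sending a recording to a contact.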
The affected family’s home was wired as an Internet of Things setup, with several Amazon devices controlling the house’s heat, lights, and security system. They disconnected everything following the incident.
The public has viewed this incident as another example of how easily Alexa – and other voice assistants – can pose a threat to consumers’ privacy.
“It is not clear if this was simply a software flaw or a malicious attack, but it is a stark wake-up call nonetheless. The reports that a popular voice assistant unexpectedly recorded a personal conversation and leaked information to a third party should be a reminder of the potential security and privacy risks of our… always-connected world”, says Andreas Kuehlmann, senior vice president and general manager at Synopsys.
Amazon has already been under scrutiny over privacy issues. Researchers had previously discovered that it was possible to closely mimic legitimate voice commands in order to carry out malicious actions. Checkmarx researchers built a malicious proof-of-concept Amazon Echo Skill to show how attackers could abuse the Alexa virtual assistant to eavesdrop on consumers with smart devices and automatically transcribe every word spoken.
But this incident shows that smart voice assistants can contain glitches that potentially lead to a breach of privacy.
The privacy issues are not confined to Alexa. Last year, researchers devised a proof of concept that issued potentially harmful instructions to popular voice assistants such as Siri, Google Assistant, Cortana, and Alexa using ultrasonic frequencies instead of audible voice commands.
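The core idea behind such ultrasonic attacks can be sketched with basic signal processing. The snippet below is an illustrative simplification (the carrier frequency, tone, and modulation depth are assumptions, and a pure tone stands in for a spoken command, so this is not the researchers' actual code): a command signal is amplitude-modulated onto a carrier above the roughly 20 kHz limit of human hearing, so the transmitted sound is inaudible, while microphone hardware nonlinearity can demodulate it back into the audible band.

```python
# Illustrative sketch of the ultrasonic-command idea: amplitude-modulate
# an "audio" signal onto a carrier above human hearing. Parameters are
# arbitrary choices for demonstration, not from the original research.
import numpy as np

SAMPLE_RATE = 96_000   # high sample rate needed to represent ultrasound
CARRIER_HZ = 25_000    # above the ~20 kHz ceiling of human hearing
COMMAND_HZ = 400       # a pure tone standing in for a spoken command

t = np.arange(0, 0.5, 1 / SAMPLE_RATE)
command = np.sin(2 * np.pi * COMMAND_HZ * t)   # the "voice" signal
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)

# Classic amplitude modulation: shift the command up around the carrier.
transmitted = (1 + 0.5 * command) * carrier

# All transmitted energy now sits near 25 kHz, outside the audible band.
spectrum = np.abs(np.fft.rfft(transmitted))
freqs = np.fft.rfftfreq(len(transmitted), 1 / SAMPLE_RATE)
peak_hz = freqs[np.argmax(spectrum)]
print(round(peak_hz))  # dominant frequency component: the 25 kHz carrier
```

To a human the transmitted signal is silent, but a microphone whose response is slightly nonlinear effectively multiplies the signal with itself, recreating the low-frequency command inside the device's audio path.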