
Your smart speakers could be SPYING on you, especially now that you’re working from home

Franz Walker


3-27-20


As people shift to working from home under measures to stem the global coronavirus outbreak, a new security threat has reared its ugly head. Lawyers are now warning that smart speakers, such as those powered by Amazon’s Alexa, could be eavesdropping on confidential client meetings held at home.

U.K. law firm Mishcon de Reya has issued advice to staff to mute or shut down listening devices such as Amazon’s Echo or Google’s voice assistant when discussing client matters at home. The firm also suggested that staff keep such devices away from their workspace.

The warning from Mishcon covers any sort of video- or voice-enabled device, including the aforementioned smart speakers from Amazon and Google. Joe Hancock, who heads Mishcon de Reya’s cybersecurity efforts, said that video products such as Ring (also owned by Amazon), baby monitors and even closed-circuit TV are a concern as well.

“Perhaps we’re being slightly paranoid, but we need to have a lot of trust in these organizations and these devices,” said Hancock. “We’d rather not take those risks.”

He added that the firm is worried about these devices being compromised, particularly cheap knock-offs.

Smart speakers pose a security risk

Law firms are currently facing the challenge of creating work-from-home arrangements for specific job functions while maintaining security. Alongside confidential discussions, critical documents and communications also need to be secured. This mirrors the situation faced by banks on Wall Street, where some traders are now being asked to work from alternative locations that banks keep on standby for disaster recovery, rather than from home, to maintain confidentiality.

Smart speakers have already become notorious for activating in error, making unintended purchases or sending snippets of audio to Amazon or Google. With an installed base that Consumer Intelligence Research Partners puts at 76 million units and growing, the devices have come under increasing scrutiny from cybersecurity experts.

Devices can start recording by accident

For their part, Amazon and Google say their devices are designed to record and store audio only after they detect a keyword that wakes them up, and that instances of inadvertent activation are rare. However, a recent study by Northeastern University and Imperial College London found that such accidental activations can happen between 1.5 and 19 times a day.

“Anyone who has used voice assistants knows that they accidentally wake up and record when the ‘wake word’ isn’t spoken – for example, ‘seriously’ sounds like the wake word ‘Siri’ and often causes Apple’s Siri-enabled devices to start listening,” stated the study.

“There are many other anecdotal reports of everyday words in normal conversation being mistaken for wake words,” continued the report. “Our team has been conducting research to go beyond anecdotes through the use of repeatable, controlled experiments that shed light on what causes voice assistants to mistakenly wake up and record.”
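The failure mode the researchers describe is easier to see with a toy model. Below is a minimal, hypothetical Python sketch of threshold-based wake-word spotting; the wake word, the string-similarity “score” and the cutoff are all illustrative assumptions, not any vendor’s actual detection pipeline. It simply shows how a phrase that merely resembles the wake word can clear the activation threshold and trigger a recording.

from difflib import SequenceMatcher

WAKE_WORD = "siri"           # illustrative wake word
ACTIVATION_THRESHOLD = 0.45  # assumed, deliberately loose confidence cutoff

def wake_score(heard: str) -> float:
    # Crude stand-in for an acoustic model's confidence score:
    # plain string similarity between what was heard and the wake word.
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio()

def should_activate(heard: str) -> bool:
    # The device "wakes up" (and would begin recording) whenever the
    # score crosses the cutoff, even if the wake word was never spoken.
    return wake_score(heard) >= ACTIVATION_THRESHOLD

for phrase in ["siri", "seriously", "serious", "good morning"]:
    print(f"{phrase!r}: score={wake_score(phrase):.2f}, "
          f"activates={should_activate(phrase)}")

Real assistants score audio with trained acoustic models rather than text similarity, but the underlying problem is the same: near-matches that land above the cutoff wake the device and start it recording.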

Companies are listening in

One of the more concerning things about these devices is that the companies behind them are actively listening in. Last year, Amazon admitted not only that Alexa saved recorded audio even after users deleted it, but also that its employees were actively listening to some of those recordings.

“This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone,” Amazon said in a statement after the fact came to light.

“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow.”

Despite this, Amazon does not explicitly state in its terms and conditions that employees review customer recordings. That said, Alexa’s privacy settings do offer users the chance to opt out of helping the firm “develop new features.”

Sources include:

SeattleTimes.com

Independent.co.uk 1

Independent.co.uk 2

https://www.naturalnews.com/2020-03-27-smart-speakers-are-spying-on-millions-of-americans.html