Digital Aides Are Recording Our Bedtime Fun?


Turns out, all our fears are true.

When people first get a digital assistant like Siri, Cortana, or Alexa, they often worry about how much the machine is listening. The assistant is supposed to wake only when you call its name, but there's a lingering concern that these devices are listening at all times, even in our most intimate moments.


And now a British news outlet is taking aim at Amazon because it believes Alexa is doing just that. Apparently, Amazon has recordings of many Alexa owners in the heat of passion. Again, Alexa is meant to activate only when it hears its wake word or a command from the user. The Echo speakers that run the assistant then record what users say to them.

Unfortunately, the product isn't perfect. There have reportedly been several cases where the device mistook a stray sound for a command and started recording. As you can imagine, this can create all kinds of awkward situations.

The Sun reports that an English-speaking Amazon team in Bucharest, Romania monitors thousands of Alexa recordings. In those recordings, employees have heard family arguments, discussions of health concerns, and sexual encounters.


“It’s been said that couples having sex and even what sounded like a sex attack have been heard by staff,” said one employee to the news source.

“There were times when I heard couples arguing at home and another when kids were trying to teach Alexa to swear. We were told to focus on Alexa commands but it was impossible not to hear other things going on.”

After the report surfaced, politicians and privacy advocacy groups voiced their concerns.

“These devices are being used to invade people’s privacy,” said Tory MP Andrew Rosindell.


“We need stronger laws to protect the public from devices which are always on and always listening. People with these devices are right to feel creeped out and concerned,” added Privacy International.

Unfortunately, the problem isn't unique to Amazon: Siri also records requests for review and suffers from accidental activations. In addition, similar listening offices operate in the U.S., Costa Rica, and India.

That said, Apple says that only one percent of UK voice-activation recordings were listened to by staff. While that might sound reassuring, it means the other 99 percent of recordings, private conversations about health, finances, family drama, and sexual adventures, are still sitting in "the cloud" somewhere.


In response to these claims of privacy breaches, Amazon says it is committed to protecting its customers.

“We take the security and privacy of our customers’ information seriously. We label a fraction of one percent (0.2%) of customer interactions in order to improve the customer experience,” Amazon said in a statement.

“For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

“We have strict technical and operational safeguards in place to protect customer privacy, and have a zero tolerance policy for the abuse of our system.

“Data associates do not receive information that can identify customers, access to internal tools is highly controlled and customers can delete voice recordings associated with their account at any time.”

Source: The Sun
