Apple Apologizes For Listening In On Your Sex Lives


Apple is now apologizing for eavesdropping on customers’ sex lives.

Last month, we shared with you the story of digital assistants listening in on their users. While Siri, Cortana, and Alexa are designed to listen for a specific wake command, reports were coming in that these products were mistaking background sounds for commands and recording them. Even worse, many of those recordings were then reviewed by special teams assigned to listen to the audio.


These reports came after a former Apple contractor exposed the practice to The Guardian.

“The sound of a zip, Siri often hears as a trigger,” the contractor said. “You can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”


At the time, Apple said that only one percent of UK voice activation recordings were reviewed by its staff. While that might sound acceptable, it still amounts to hundreds of recordings heard by those teams, and the company retains hundreds more in its files.


This past Wednesday, Apple released a statement addressing the issue, apologizing for employing third-party contractors to listen to audio recorded by the Siri voice assistant.

“We realize we haven’t been fully living up to our high ideals, and for that we apologize,” Apple said.

“We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place,” the company said. “Those who choose to participate will be able to opt out at any time.”

Apple has also announced that it is suspending the practice. Going forward, the company will not keep audio recordings for grading Siri’s performance unless customers choose to participate, and those who do can opt out at any time.
