Apple has agreed to pay $95 million to settle a long-running class action lawsuit that accused the company of illegally intercepting customers’ conversations through its Siri virtual assistant and sharing snippets of those conversations with human reviewers.
The suit was originally filed in 2019 after a whistleblower told The Guardian that third-party contractors Apple hired to review Siri’s responses sometimes heard private interactions, ranging from patients talking to doctors to people having sex or buying drugs. While Apple claimed that Siri only activated its listening mode after detecting its wake word, “Hey Siri,” The Guardian reported that the assistant mistakenly turned itself on and began recording conversations in response to similar-sounding words and even the sound of zippers.
The lead plaintiff in the class action lawsuit, Fumiko Lopez, alleged that Apple devices improperly recorded her daughter, a minor, mentioning brand names like Olive Garden and Air Jordans, and then served the daughter advertisements for those brands in Apple’s Safari browser. Other named plaintiffs alleged that their Siri-enabled devices entered listening mode without them saying “Hey Siri” while they were having intimate conversations in their bedrooms or were talking with their doctors.
In their suit, the plaintiffs characterized the privacy invasions as particularly egregious given that a core component of Apple’s marketing strategy in recent years has been to frame its devices as privacy-friendly. For example, an Apple billboard at the 2019 Consumer Electronics Show read “What happens on your iPhone, stays on your iPhone,” according to the lawsuit.
The proposed settlement, filed in California federal district court on Tuesday, covers people who owned Siri-enabled devices from September 17, 2014, to December 31, 2024, and whose private communications were recorded by an unintended Siri activation. Payout amounts will be determined by how many Apple devices a class member owned that improperly activated a listening session.
Apple also agreed to confirm that it has permanently deleted recordings collected by Siri before October 2019 and to publish a web page explaining how customers can opt in to its Improve Siri feature, which allows the company to share and listen to audio recordings for quality control.
Apple did not immediately respond to a request for comment.
Shortly after The Guardian’s report, Apple temporarily suspended all human grading of Siri responses and acknowledged that “we haven’t been fully living up to our high ideals.” The company said it would resume human grading after releasing software updates, that graders would going forward receive computer-generated transcripts of conversations rather than the audio itself, and that only Apple employees, not third-party contractors, would conduct the grading.