French prosecutors have launched a high-profile criminal investigation into Apple Inc.’s Siri voice assistant, thrusting the tech giant into the center of Europe’s intensifying debate over digital privacy and artificial intelligence. The probe, which was officially opened by the Paris public prosecutor’s office on October 6, 2025, follows a formal complaint alleging that Apple unlawfully collected and analyzed users’ conversations through Siri without obtaining proper consent.
The complaint was filed by the French human rights organization Ligue des Droits de l’Homme (LDH), with substantial support from Thomas Le Bonniec, a former Apple subcontractor who worked on Siri’s audio analysis team in Ireland. Le Bonniec’s testimony paints a troubling picture: he alleges that contractors routinely listened to and graded snippets of Siri recordings that captured highly sensitive conversations. In his words, “contractors accessed sensitive user data, including fragments of private medical discussions, personal relationships, and financial exchanges, all without explicit user permission.”
The investigation marks the first time French authorities have opened a formal criminal inquiry into a major U.S. technology company’s handling of voice assistant data. Prosecutors are examining whether Apple’s data handling practices violated the European Union’s General Data Protection Regulation (GDPR) and French data protection law. According to Reuters, the Paris prosecutor’s office confirmed that the case has been assigned to the Office for Combating Cybercrime, France’s specialized cybercrime police unit, underscoring how seriously authorities are treating the allegations.
At the heart of the complaint is the claim that Siri’s “always-on” listening led to accidental activations, so that users’ conversations were recorded without their knowledge. Critics argue these recordings were then reviewed by human contractors for quality control and improvement, a process known as “grading.” While Apple has stated that such reviews are anonymized and used solely to enhance Siri’s performance, privacy advocates contend that the anonymization may have been insufficient: once combined with metadata or device information, recordings could in principle be traced back to individuals, undermining the very privacy protections Apple claims to uphold.
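To make the re-identification concern concrete, the sketch below walks through the kind of join privacy advocates worry about. Everything in it is hypothetical: the data, field names, and “device registry” are invented for illustration and do not describe Apple’s actual systems.

```python
# Hypothetical illustration: re-linking "anonymized" voice snippets to
# accounts by joining them with device metadata. Data, field names, and
# the join key are invented for demonstration; this is not Apple's pipeline.

# "Anonymized" review queue: no names, but each snippet keeps a device ID
# and a timestamp so graders can assess context.
anonymized_snippets = [
    {"snippet_id": "s1", "device_id": "D-4821", "timestamp": "2025-03-02T09:14"},
    {"snippet_id": "s2", "device_id": "D-7730", "timestamp": "2025-03-02T10:02"},
]

# Separate metadata store linking devices to accounts (e.g., for support
# or billing). On its own, it contains no audio.
device_registry = {
    "D-4821": {"account": "user_188", "city": "Lyon"},
    "D-7730": {"account": "user_904", "city": "Paris"},
}

# A simple join on device_id undoes the "anonymization": each snippet is
# now tied to an account and a coarse location.
for snippet in anonymized_snippets:
    owner = device_registry.get(snippet["device_id"])
    if owner:
        print(f'{snippet["snippet_id"]} -> {owner["account"]} ({owner["city"]})')
```

The point advocates make is that stripping names from audio is not enough if identifiers that can be joined against other records travel along with it.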
The implications could be far-reaching. If French investigators determine that Apple failed to obtain informed consent or that its subcontracting arrangements breached confidentiality standards, the company could face criminal penalties under France’s penal code, as well as administrative sanctions from the national data protection authority, CNIL. Legal experts suggest that the case could set a significant precedent for how voice data is treated and the responsibilities that technology firms bear when operating within the EU.
Apple, for its part, has vehemently denied any wrongdoing. The company insists that Siri does not record or share user audio without consent and points to a series of privacy reforms implemented after a similar controversy erupted in 2019. That year, whistleblower revelations exposed that contractors had been listening to Siri interactions, prompting Apple to temporarily suspend the program. In the aftermath, Apple made data collection opt-in, introduced explicit consent controls, and committed to handling all reviews internally under tighter controls. In a statement provided to Benzinga, Apple said, “We have strengthened Siri’s privacy protections in 2019 and again in 2025, and voice recordings are never sold to advertisers or shared with marketers.”
Despite these assurances, skepticism lingers among regulators and privacy advocates. Bloomberg reports that critics question whether recordings captured through accidental activations, triggered by phrases that merely resemble “Hey Siri,” can be said to rest on informed consent. The limited transparency of Apple’s privacy disclosures has also drawn renewed scrutiny. LDH’s complaint specifically accuses Apple of collecting, recording, and processing voice data from Siri interactions in violation of French privacy law and the GDPR, arguing that users were not adequately informed about the extent of data collection or how their information would be used.
This isn’t the first time Apple has faced legal trouble over Siri’s privacy practices. In January 2025, the company agreed to a $95 million settlement in the United States to resolve a lawsuit alleging that Siri had inadvertently recorded private conversations, which sometimes resulted in targeted advertising. Eligible users began receiving compensation emails labeled “Lopez Voice Assistant Class Action Settlement” in May. These incidents have fueled mounting criticism of Apple’s approach to privacy, especially as the company has struggled to keep pace with rivals in the rapidly evolving field of generative AI.
Indeed, Apple’s handling of Siri has come under fire from both investors and competitors. Some investors have called Siri “an embarrassment,” citing its perceived lag behind Google Assistant and Amazon’s Alexa. In August 2025, Google even mocked Siri’s slow upgrades in a Pixel 10 ad, urging users to “change your phone.” By September, reports from Bloomberg’s Mark Gurman indicated that Apple was racing to overhaul Siri, aiming to make it more chatbot-like and competitive in the age of artificial intelligence.
The French investigation arrives at a time of heightened regulatory pressure on Big Tech. France has positioned itself at the forefront of digital privacy enforcement, pursuing cases against Meta, Google, and Amazon under the GDPR, and implementing its own tax framework for digital services. The Office for Combating Cybercrime will now scrutinize Siri’s data storage, access controls, and the adequacy of contractual arrangements with third-party graders. Investigators are also expected to assess whether data could be cross-referenced with device identifiers or location logs, potentially increasing the risk of re-identification.
The outcome of the probe could have ripple effects far beyond Apple. As the BBC and other outlets have noted, voice assistants depend on massive datasets of human speech to refine their algorithms. However, the risk of accidental recordings and inadvertent surveillance has led privacy advocates to call for stronger safeguards and clearer user disclosures. Even if the investigation does not result in a conviction, the proceedings may reignite public skepticism about the true extent of corporate access to personal data amid the current AI boom.
Market watchers are paying close attention to the case’s impact on Apple’s financial performance. Apple shares slipped 0.52% on October 6, 2025, and edged down another 0.35% in after-hours trading, according to Benzinga Pro. Analysts suggest that, should Apple be found guilty, the company could face substantial fines and be compelled to overhaul its data pipelines, potentially delaying or altering features in upcoming products like the iPhone 17 or enhanced Apple Intelligence integrations.
Looking ahead, the French investigation may accelerate broader EU-wide audits of voice technologies and push companies toward privacy-by-design approaches, such as federated learning models that keep data on-device. For Apple and its peers, the message is clear: in an era of intelligent assistants, transparency and user control are not just marketing slogans—they are legal imperatives that could shape the future of digital innovation.
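For readers unfamiliar with the term, the sketch below gives a minimal illustration of the federated-learning idea referenced above: each simulated “device” trains on data it never shares, and only model updates are averaged centrally. The toy data and linear model are inventions for demonstration, not a description of how Apple trains Siri.

```python
# Minimal federated-averaging sketch: raw data stays "on device"; only
# model updates are shared with the server. Toy linear-regression example.
import numpy as np

rng = np.random.default_rng(0)

# Each simulated "device" holds its own private data, never sent anywhere.
true_w = np.array([0.5, -1.0, 2.0])
devices = []
for _ in range(5):
    X = rng.normal(size=(20, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=20)
    devices.append((X, y))

w = np.zeros(3)          # global model held by the server
lr, rounds = 0.1, 50

for _ in range(rounds):
    local_models = []
    for X, y in devices:
        # One local gradient step, computed entirely "on device".
        grad = X.T @ (X @ w - y) / len(y)
        local_models.append(w - lr * grad)
    # The server averages the updated models; it never sees X or y.
    w = np.asarray(local_models).mean(axis=0)

print("learned weights:", np.round(w, 2))  # close to [0.5, -1.0, 2.0]
```

The design choice is the one regulators are pushing toward: the raw material, whether audio or, as here, toy training data, never has to leave the user’s device for the shared model to improve.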
As the world waits for the outcome, Apple’s handling of the Siri investigation will serve as a crucial test of its commitment to privacy in an increasingly connected age, watched closely by regulators, competitors, and consumers alike.