Technology
23 October 2025

Apple Removes Dating Apps Amid Growing Privacy Concerns

State attorneys general, tech giants, and the United Nations are taking action as privacy breaches, app removals, and new neurotechnology risks highlight the urgent need for stronger data protections.

In a year marked by mounting concerns over digital privacy and the rapid evolution of technology, a flurry of enforcement actions, high-profile app removals, and fresh calls for global standards have brought the issue of personal data rights into sharp focus. In the absence of comprehensive federal data privacy legislation in the United States, state attorneys general, tech giants, and international bodies are each stepping up in their own way to address the growing risks and ethical dilemmas posed by new digital platforms and emerging neurotechnologies.

According to a landmark report published by the Electronic Privacy Information Center (EPIC) on October 22, 2025, state attorneys general have played a pivotal role in safeguarding consumer privacy over the past five years. The report reveals that these officials brought or settled more than 1,200 consumer privacy cases between 2019 and 2024, a figure that underscores their vital, if sometimes under-resourced, position in the digital privacy landscape. The cases span a broad spectrum: 78 data privacy cases, 564 data breach incidents, 117 platform governance actions—most relating to children’s online safety—and 145 cases involving unwanted calls.

“Because State AGs are determined to protect consumers from privacy harms, we see them creatively pursuing privacy-related enforcement despite limited resources and enforcement authorities that have not always kept pace with newer technologies,” Chris Frascella, EPIC counsel and report co-author, said in a prepared statement, as cited by The Record. The majority of these enforcement actions have been brought under state consumer protection laws, a trend that may soon shift as more states enact their own comprehensive data privacy statutes.

While state authorities have been busy, the private sector has also been forced to reckon with the implications of weak privacy protections. A case in point: the removal of the popular Tea and TeaOnHer apps from Apple’s iOS App Store on October 21, 2025. These apps, which gained viral attention over the summer—Tea for its anonymous reviews of men by women, and TeaOnHer as a male-focused counterpart—became lightning rods for debate around privacy, digital safety, and the responsibilities of tech platforms.

Apple’s decision to pull both apps followed a cascade of user complaints and negative reviews, including allegations that minors’ personal information had been posted. According to Business Insider, an Apple spokesperson explained that the removals were due to the apps failing to meet requirements around content moderation and user privacy. Notably, the company had “communicated repeatedly” with the app developers, but persistent issues led to the final removal. “We’ve worked closely with Apple through 20+ rounds of feedback, implementing every safety feature they requested and removing thousands of inappropriate posts daily,” Xavier Lampkin, developer of TeaOnHer, told Business Insider, expressing disappointment with the outcome.

The privacy concerns surrounding these apps were not hypothetical. In late July, Tea suffered a data breach that exposed approximately 72,000 images, including selfies and driver’s licenses used for identity verification, as well as some users’ direct messages. Lawsuits quickly followed. Scott Cole, lead attorney on one of the cases, told Business Insider, “I don’t think Tea intended to violate people’s rights; they were just sloppy.” TeaOnHer, for its part, experienced a brief but serious API documentation exposure in August, which allowed a security researcher to access some user posts. Lampkin maintained that the issue was fixed within an hour and characterized the API’s openness as a “safety and transparency feature.”

Despite these incidents, as of October 22, some users reported that the apps continued to function on devices where they had already been downloaded, highlighting the persistent challenges of digital enforcement. Meanwhile, copycat apps such as Tea On Her & Him—Overheard quickly filled the void, with that app even topping the lifestyle charts on Apple’s App Store. As of this writing, Google has not indicated whether it will take similar action regarding these apps on its Play Store.

The struggle to keep up with privacy threats is not limited to the U.S. or to consumer apps. On the global stage, the United Nations is sounding the alarm about the next frontier: neurotechnologies and the sensitive neurodata they generate. On October 22, 2025, Dr. Ana Brian Nougrères, the UN Special Rapporteur on the right to privacy, presented a report to the 58th session of the Human Rights Council in Geneva. The report called for the development of a model law on neurotechnologies and neurodata, warning that current legal frameworks are woefully inadequate to protect individuals’ mental privacy in the face of rapidly advancing brain-computer interfaces and data-gathering tools.

Dr. Nougrères’ report outlined four key recommendations for states: develop specific regulatory frameworks for neurotechnologies and neurodata; incorporate established privacy principles into national law; promote ethical practices in the use of neurotechnologies; and educate the public to ensure informed consent. She emphasized the unprecedented sensitivity of neurodata, which is “directly related to cognitive state and reflects unique personal experiences and emotions.”

“Neurotechnologies are tools or devices that record or alter brain activity and generate neurodata that not only allow us to identify a person, but also provide an unprecedented depth of understanding of their individuality,” Dr. Nougrères stated. While she acknowledged the potential mental health benefits of such technologies, she voiced concern that neurodata could “manipulate people’s brains, leading to a violation of privacy in one’s own thoughts and decision-making.”

Most countries have yet to legislate neuro-rights or establish protections for neurodata. Chile, notably, became the first country to pass a neuro-rights law in 2021, setting a precedent that privacy advocates hope others will soon follow.

The convergence of these stories—the determined efforts of state attorneys general, the high-profile failures of app-based privacy protections, and the UN’s urgent call for new global standards—underscores how the privacy debate has entered a new era. The digital and biological frontiers are colliding, and the need for clear, enforceable, and forward-looking privacy protections has never been more pressing. As technology continues to outpace regulation, the choices made by lawmakers, tech companies, and international bodies in the coming years will shape not just how our data is handled, but the very contours of personal autonomy in the digital age.