Technology
16 November 2025

Tech Giants Challenge California Over Social Media Law

Meta, TikTok, and YouTube file lawsuits to block a new California statute restricting personalized feeds for minors, igniting a high-stakes First Amendment battle.

On November 13, 2025, a new chapter in the ongoing battle between tech giants and government regulators began as Meta, TikTok, YouTube, and the trade group NetChoice filed lawsuits against the State of California. Their target: California’s Protecting Our Kids from Social Media Addiction Act, a law designed to restrict personalized social media feeds for minors aged 13 to 17. The law, which supporters argue is necessary to protect young people from addictive design features, has ignited a fierce constitutional debate over free speech, parental rights, and the responsibilities of Big Tech.

According to filings in the US District Court for the Northern District of California, the companies claim that the law amounts to a content-based restriction on speech, violating the First Amendment. The plaintiffs—Meta Platforms Inc., TikTok Inc., and NetChoice (which represents Meta, Snap Inc., X Corp., and other technology companies)—are no strangers to legal wrangling over the regulation of online platforms. Their complaints, filed on November 13, 2025, echo a broader national campaign by NetChoice to challenge state laws governing social media and e-commerce.

The central issue is whether California can legally restrict the way social media companies curate and deliver content to minors. The law bars platforms from serving personalized algorithmic feeds to users under 18 without parental consent, and it imposes additional requirements such as hiding “like” counts from minor users. The state’s Attorney General’s Office has staunchly defended the statute. In a statement, the office said, “Companies have blatantly shown us that they are willing to use addictive design features, including algorithmic feeds and notifications at all hours of the day and night, to target children and teens, solely to increase their profits.” The AG’s office further argued, “This law is about protecting speech.”

This isn’t the first time California’s efforts to regulate social media have come under legal fire. NetChoice previously succeeded in blocking the Age Appropriate Design Code, another California regulation, after the Ninth Circuit Court of Appeals agreed that it likely violated the First Amendment. But this time, the legal landscape appears more complicated. The Ninth Circuit recently held that an existing challenge to the “addictive-feed protections” of the new law is likely to fail—a preliminary win for the state. However, the court also ruled that some provisions, such as the prohibition on displaying “like” counts to minors, are likely unconstitutional.

During oral arguments in the NetChoice case, Judge Ryan D. Nelson drew a striking analogy, comparing the use of addictive social media algorithms to tobacco products. He remarked that these algorithms “might be actually worse than a carcinogen.” Such comparisons underscore the growing concern among lawmakers, parents, and public health advocates about the potential harms of social media on young users.

Meta’s legal argument hinges on the First Amendment. The company claims that its algorithmic curation of content—even for teens—constitutes protected expressive activity. In its complaint, Meta cited the US Supreme Court’s recent decision in Moody v. NetChoice LLC, which recognized that the curation and dissemination of third-party expression by platforms is protected by the First Amendment. Meta argues, “Subject to its content moderation policies, its services ‘allow[] users to gain access to information and communicate with one another about it on any subject that might come to mind.’” The company insists that the law’s restrictions on personalized feeds “unconstitutionally restrict Meta from curating and disseminating to teens aged 13–17 third-party expression—a form of expression that the US Supreme Court recently held, in Moody v. NetChoice LLC, is protected by the First Amendment.”

Meta further contends that California’s law improperly dictates how the company organizes user-generated speech, likening it to the government telling a library how to order its books. The company’s complaint reads, “Just as the state can’t dictate how a library orders its books, it also can’t dictate how Meta organizes fully protected user-generated speech.” Meta asserts that the law fails strict scrutiny, calling it a content-based and speaker-based restriction on its editorial activity that does not promote a legitimate government interest. The company also points out that parents already have tools at their disposal to supervise their children’s online activity, suggesting that the law is unnecessary and overbroad.

Other plaintiffs echoed these arguments. YouTube and Google, for example, claimed in their joint complaint that the law “burdens their protected right to express their view as to the content that will be relevant, valuable, and appropriate for each particular user, and burdens the rights of minors to access speech and discover content without the permission of a parent.” They argue that the parental consent provision imposes unconstitutional burdens not only on the platforms but also on minors’ ability to access information freely.

The legal teams representing the tech companies are among the country’s most prominent law firms: Covington & Burling LLP for Meta, O’Melveny & Myers LLP for TikTok, and Cooley LLP for Google and YouTube. The cases—Meta Platforms Inc. v. Bonta, TikTok Inc. v. Bonta, and NetChoice v. Bonta—are all being heard in the Northern District of California.

California’s Attorney General, meanwhile, has pointed to the state’s preliminary win in the Ninth Circuit as evidence that the law is on solid legal ground—at least in part. The AG’s office emphasized that the law is aimed at protecting children and teens from the potentially harmful effects of social media addiction, not at censoring speech. The office stated, “The US Court of Appeals for the Ninth Circuit held that the existing challenge to the ‘addictive-feed protections’ is likely to fail.”

Yet, the legal battle is far from over. TikTok’s complaint notes that neither the Ninth Circuit nor the district court has determined whether the law’s personalized-feed provisions are unconstitutional as applied to any particular online platform. Meta’s complaint explicitly states that it filed its suit to address this unresolved issue, underscoring the high stakes for both the tech industry and state regulators.

The outcome of these cases could set a national precedent, influencing how far states can go in regulating the design and delivery of social media content to minors. As NetChoice continues its litigation campaign across the country, the eyes of the tech world—and many concerned parents—are fixed on California. Will the courts side with the platforms’ claims of free speech and editorial discretion, or will they uphold the state’s effort to protect young users from what some see as the digital equivalent of a public health crisis?

As the legal saga unfolds, one thing is certain: the tension between innovation, free expression, and the welfare of children online isn’t going away. The coming months will reveal whether California’s bold experiment in social media regulation survives constitutional scrutiny—or becomes just another footnote in the ever-evolving story of the internet age.