08 November 2025

Denmark Moves To Ban Social Media For Children Under 15

The Danish government plans sweeping new rules to restrict social media use for children, citing risks to young people's mental health and the tech industry's unwillingness to invest in their safety.

Denmark is poised to become the first country in the European Union to enact a sweeping ban on social media access for children under 15, a move that has already ignited debate across the continent and beyond. The Danish government’s announcement on November 7, 2025, marks a significant escalation in efforts to shield young people from what officials describe as the overwhelming risks posed by digital platforms and their algorithms.

The proposal, led by Denmark’s Ministry of Digitalization, would set a national age limit of 15 for social media use. However, there’s a nuance: parents will be able to grant access to certain platforms for children as young as 13, but only after a specific assessment. The details of how this assessment will work—and which platforms the rules will target—are still being hammered out. For now, the list of affected apps is expected to include some of the most popular among EU youth, such as TikTok, Instagram, Snapchat, Discord, Roblox, Fortnite, Twitch, and YouTube.

“The amount of time they spend online—the amount of violence, self-harm that they are exposed to online—is simply too great a risk for our children,” said Caroline Stage, Denmark’s Minister for Digital Affairs, in an interview with The Associated Press. She added, “They [tech giants] have an absurd amount of money available, but they’re simply not willing to invest in the safety of our children, invest in the safety of all of us.”

This move isn’t coming out of nowhere. According to Stage, a staggering 94% of Danish children under 13 already have profiles on at least one social media platform—and more than half of those under 10 do as well. A February 2025 analysis by the Danish Competition and Consumer Authority found that children in Nordic countries spend an average of 2 hours and 40 minutes per day on social media. Those figures have raised alarm bells for parents, educators, and lawmakers alike, especially as studies and anecdotal reports point to rising rates of anxiety, depression, and disrupted sleep among children and teens.

Prime Minister Mette Frederiksen, who has championed the initiative, cited this growing mental health crisis as a driving force. In a recent speech to Parliament, Frederiksen argued that mobile phones and social networks were “stealing our children’s time.” She pointed to data showing that never before have so many young people suffered from anxiety and depression, and noted that many children are struggling with reading and concentration. “On screens, they see things no child or young person should see,” Frederiksen said.

The government’s plan is far from an overnight fix. Lawmakers from across the political spectrum—right, left, and center—have signaled support, but the legislation will likely take months to pass. Stage emphasized, “I can assure you that Denmark will hurry, but we won’t do it too quickly, because we need to make sure that the regulation is right and that there are no loopholes for the tech giants to go through.”

Enforcement is a major question mark. Denmark plans to leverage its national electronic ID system—used by nearly all citizens over 13—and is developing an age-verification app. The idea is to require tech platforms to implement robust age checks. If companies fail to comply, Denmark could seek penalties through the European Union, with potential fines of up to 6% of a company’s global annual turnover. Several other EU countries are also piloting age-verification technologies, and the European Commission rolled out a prototype age-verification app in July as part of broader efforts to protect minors online.

Pressure on Big Tech is mounting, not just in Denmark but around the world. The EU’s Digital Services Act, which took effect two years ago, already forbids children under 13 from holding accounts on major social media and video-sharing platforms. Australia set a precedent in December 2024 by banning social media for anyone under 16, with fines of up to 50 million Australian dollars ($33 million) for platforms that fail to enforce the rule. In the United States, a patchwork of state laws has emerged, with some states enacting outright bans for those under 13 and others requiring parental consent or restricting access during certain hours.

Despite these efforts, critics argue that outright bans may not be the silver bullet policymakers hope for. Many point out that most platforms already have age limits—typically 13—yet children routinely bypass them, often with help from parents. In Denmark, 94% of seventh-graders reportedly have social media profiles before their 13th birthday. An Australian government report from 2025 found that 80% of kids under 13 regularly sidestep age restrictions on platforms like YouTube, TikTok, and Snapchat, and more than a third have their own accounts, sometimes set up with parental assistance.

Some experts and lawmakers across the EU—including those in Greece, Italy, and Spain—are advocating for mandatory, verifiable age checks as a more effective solution. France and Norway are reportedly considering their own age-dependent bans. As the debate unfolds, Denmark’s bold move could trigger a domino effect, with other countries watching closely to see if the policy succeeds—or if it simply pushes children to find new ways around the rules.

The Danish government’s initiative has drawn broad public support. Last year, 50,000 citizens signed a petition calling for a ban on TikTok, Snapchat, and Instagram for children. Lawmakers argue that the business models of tech giants exert too much pressure on young users, exposing them to harmful content and commercial interests. “Children and young people have their sleep disrupted, lose their peace and concentration, and experience increasing pressure from digital relationships where adults are not always present,” the Ministry of Digitalization said in a statement. “This is a development that no parent, teacher or educator can stop alone.”

Tech companies, for their part, say they are working to improve safety. TikTok noted in a statement that it has “more than 50 preset safety features for teen accounts, as well as age-appropriate experiences and tools for guardians such as Family Pairing.” The company added, “We look forward to working constructively on solutions that apply consistently across the industry.” Meta, parent company of Instagram and Facebook, did not immediately respond to requests for comment.

Lawmakers in Denmark are clear-eyed about the challenges ahead. “We’ve given the tech giants so many chances to stand up and to do something about what is happening on their platforms. They haven’t done it,” Stage said. “So now we will take over the steering wheel and make sure that our children’s futures are safe.”

As the legislative process unfolds, Denmark’s move is already sparking conversations across Europe and beyond about how to balance the promise of digital technology with the need to protect the youngest users. Whether other countries will follow suit—and whether such bans can be effectively enforced—remains to be seen. But for now, Denmark is drawing a line in the sand and signaling that, at least for its children, the digital wild west may soon have new sheriffs in town.