A global wave of legislative action is targeting social media algorithms, drawing direct parallels to the historical regulation of the tobacco industry. Nations such as Australia, France, and Denmark are now implementing strict age limits and algorithmic restrictions to protect minors from platforms designed for maximum engagement.
According to a report by elmostrador.cl, the movement illustrates the Overton window effect, by which once-radical ideas about digital regulation become mainstream policy. Australia became the first nation to ban social media access for children under 16 in December 2025, imposing heavy fines on non-compliant platforms.
France and Denmark have established a minimum age of 15, while Spain is moving toward a limit of 16. The European Union is also advancing the Digital Services Act, which mandates the removal of addictive algorithms and prohibits targeted advertising to children.
Legal precedents and health risks
Recent court rulings in the United States have established legal accountability for tech giants. In March 2026, a Los Angeles jury found Meta and Google negligent in their platform designs, ordering $6 million in damages. A day prior, a New Mexico jury ordered Meta to pay $37 million for allowing child sexual exploitation.
Data from international public health organizations highlight the urgency of these measures. The World Health Organization reports that 11% of young people now show symptoms of problematic social media use, a figure that has doubled in the last six years. Research indicates that adolescents spending more than three hours daily on these platforms face double the risk of depression and anxiety.
Furthermore, 46% of teenagers report that social media negatively impacts their body image. The outlet reports that 95% of adolescents aged 13 to 17 use at least one social network, with one-third using them "almost constantly."
Critics argue that these platforms rely on sophisticated engineering to maximize time spent in-app, through features such as infinite scroll and dopamine-driven notifications. The report notes that internal documents and whistleblower testimonies from former employees have confirmed that these features are optimized using behavioral data.
While the technology provides benefits—with 67% of adolescents reporting feeling more supported through social networks—the focus of new legislation is on preventing algorithmic optimization for profit at the expense of child welfare. Proposed solutions include mandatory age verification, bans on targeted advertising for minors, and restricting recommendation algorithms on child accounts.