Norway, Indonesia, and Australia Are Banning Social Media for Under-16s. Meta Is Threatening to Leave States That Try It.
What happened
New Mexico opened the second phase of its landmark child safety trial against Meta on May 4, seeking additional damages beyond the $375 million jury verdict from March plus a court order forcing Meta to redesign Facebook and Instagram for users under 18. The demanded changes include mandatory age verification, rolling back end-to-end encryption for minors, algorithmic redesign, 90-hour monthly usage caps, and a five-year court-appointed child safety monitor. Meta warned it may quit New Mexico rather than comply, calling the demands 'technically impractical.' The trial runs through May 22. In parallel, Norway announced draft legislation to ban social media accounts for under-16s. Australia's age verification mandate took effect in December 2025. Indonesia's ban launched in March 2026. Over 40 US state attorneys general have filed similar lawsuits against Meta.
Meta's exit threat is not a bluff, but it is also not a defense: it confirms the company believes complying with child safety rules is incompatible with its business model.
The Hidden Bet
Meta's platform exit threat would cause New Mexico to back down from structural demands
New Mexico AG Torrez compared this to Big Tobacco. Courts in the 1990s did not accept tobacco companies threatening to stop selling cigarettes as a reason to drop safety requirements. Leaving a state market is not a constitutional defense against a public nuisance finding
Age verification is too technically difficult to enforce
Australia implemented a functional age verification mandate at the national level in December 2025. The EU is launching a standardized verification app in mid-2026. The technical difficulty argument is real but shrinking rapidly as countries build the infrastructure
Social media bans for children will push teens to safer alternatives
Meta itself makes this argument, and it is partially right. Australia's early implementation data shows VPN usage among teens increased. Without coordinated enforcement across platforms, banning Meta in Norway or New Mexico may concentrate young users on less regulated alternatives without eliminating the underlying harm
The Real Disagreement
The genuine fork is whether the harm is in the platform's existence for minors or in how it is designed. Norway, Indonesia, and Australia have chosen the access restriction approach: under-16s simply cannot have accounts. New Mexico is choosing the design regulation approach: Meta can operate but must redesign features that cause harm. These are not compatible strategies. Access restriction is enforceable at scale but easy to circumvent. Design regulation requires ongoing judicial oversight of a global product's technical architecture, which no court has ever successfully maintained. I would lean toward design regulation as the more durable tool, but only if paired with age verification enforcement that has real teeth. Norway's approach without the verification mechanism is mostly symbolic.
What No One Is Saying
Meta's internal documents showed employees calculated that Zuckerberg's 2019 encryption decision would affect detection of 7.5 million child sexual abuse material cases. That is not a design flaw. That is a documented, deliberate choice with a quantified cost in child harm, made by a named executive. No one at Meta has faced criminal liability for that decision. The civil trial is about money and remedies. The criminal question, whether knowingly enabling child exploitation at scale constitutes a crime, has not been asked in court.
Who Pays
Teenagers in New Mexico
Immediate if Meta exits the state; trial concludes May 22
If Meta exits the state rather than comply, 2.1 million New Mexico residents lose access to Instagram and Facebook. The harm falls on young people who use these platforms for social connection, not Meta's shareholders
Meta shareholders
Ongoing erosion; structural damages from the New Mexico trial could range from hundreds of millions to billions
More than 40 state AG lawsuits, EU Digital Services Act enforcement, and potential federal legislation are converging at once. Meta's own investor disclosures warn of material financial impact. Each new country's ban shrinks the addressable teen demographic globally
Children in countries with poorly enforced bans
Implementation gap of 12-24 months between law passage and functional enforcement
Symbolic bans without verification infrastructure (likely Norway, Indonesia at launch) create paper compliance. The harm does not decrease; it just becomes legally invisible because the accounts technically violate a rule nobody enforces
Scenarios
Judge orders structural remedies, Meta complies
Chief Judge Biedscheid finds public nuisance and orders a subset of New Mexico's demands (age verification, algorithm audit, child safety monitor). Meta complies to avoid a precedent in 40 other states. The New Mexico model becomes the template for a national standard.
Signal: Meta does not appeal the public nuisance finding; begins implementing age verification within 60 days of the ruling
Meta exits, political crisis forces federal action
Meta follows through on the exit threat. Instagram and Facebook go dark for New Mexico users. Congressional pressure for a federal preemption law intensifies, either to override state child safety rules (industry preference) or to codify them nationally (advocacy preference).
Signal: Meta announces a New Mexico service termination date within 30 days of any adverse ruling
Global age verification infrastructure normalizes
EU's standardized age verification app launches successfully in mid-2026. Australia's system is extended to other platforms. International pressure on Meta to adopt a single global verification standard becomes commercially easier than defending 40 different legal regimes.
Signal: Meta announces adoption of the EU age verification standard for all EU users before the New Mexico trial concludes
What Would Change This
If the New Mexico judge finds that applying 'public nuisance' law to a digital platform is constitutionally unsupported under the First Amendment, the entire design-regulation strategy collapses and the only viable tool is age-based access restriction. That ruling would accelerate the Norway/Australia approach globally.
Related
The World Is Copying Australia's Teen Social Media Ban. Australia's Is Already Failing.
The Kids App Ban That Big Tech Will Actually Win
The Social Media Ban That Already Failed
The EU Charged Meta With Letting Children Onto Instagram. Meta Said It Uses Self-Declared Birthdays.