Meta Platforms Inc. is considering a complete withdrawal from New Mexico, a state of 2.1 million people, escalating a high-stakes legal battle over child safety regulations that could set a precedent for social media nationwide.

Following a $375 million jury verdict, Meta is threatening to pull Facebook and Instagram from New Mexico rather than comply with proposed child safety measures, arguing the demands are technically impossible and would force a statewide service shutdown for all users.
“While it is not in Meta’s interests to do so, if a workable solution to Attorney General Torrez’s demands is not reached, we may have no choice but to remove access to its platforms for users in New Mexico entirely,” a Meta spokesperson said in a statement.
The standoff follows a Santa Fe jury's decision last month that found Meta knowingly harmed children's mental health, resulting in the $375 million penalty. Now, in a second phase, New Mexico Attorney General Raúl Torrez is seeking court-ordered reforms including 99% accuracy in age verification for users under 13 and mandatory parental association for child accounts, which Meta claims are unfeasible.
The conflict in New Mexico is the first to reach trial among more than 40 similar state-led lawsuits, making it a critical test case. A withdrawal would cut off services for 2.1 million residents and could embolden other states to pursue aggressive regulations, while compliance could force Meta to build a new, costly standard for child safety that could become a blueprint for the rest of the country.
New Mexico's demands for injunctive relief go beyond financial penalties, aiming to fundamentally alter how Meta's platforms operate for younger users. Torrez is asking the court to mandate permanent bans for predatory adults, safer recommendation algorithms, and the appointment of an independent, court-supervised child safety monitor. "This would be a truly historic moment for a district court to order those kinds of measures and to have Meta create a new standard for child safety," Torrez said.
Meta, in a court filing, countered that the state's requirements are impractical and disregard the realities of the internet. The company specifically highlighted the demand for 99% accuracy in verifying that child users are at least 13 years old. “As a practical matter, this requirement effectively requires Meta to shut down its services — for all users in the state — or else comply with impossible obligations,” the company stated.
The legal battle is a focal point in a broader, national reckoning over the impact of social media on youth mental health. More than 40 state attorneys general have filed lawsuits against Meta, alleging the company contributes to a mental health crisis among young people. While most of those cases are being pursued in federal court, New Mexico's has advanced the furthest, making its outcome particularly significant.
Torrez dismissed Meta's threat as a "stalling tactic," expressing doubt that the company would abandon a market, even a small one, while facing dozens of similar legal challenges nationwide. “I highly doubt that they’re going to be willing and able to turn the lights off for their product all over the country,” he said. This situation draws comparisons to Meta's 2023 decision to block news content in Canada in response to a new law requiring payments to publishers, a move that drew criticism during a wildfire crisis.