Meta Platforms, the parent company of Facebook, Instagram, and WhatsApp, is facing serious allegations that it prioritized growth over the safety of young users.
New Mexico Attorney General Raúl Torrez filed a lawsuit against Meta, its subsidiaries, and CEO Mark Zuckerberg, accusing the company of creating unsafe environments for children on its social media platforms.
Allegations of inadequate protection for minors
This lawsuit brings to light significant concerns about the safety of minors in the digital realm.
In the lawsuit, Mr. Torrez alleges that Meta fails to remove Child Sexual Abuse Material (CSAM) effectively and enables adults to contact and solicit underage users.
He also contends that the addictive design of Meta’s platforms harms children and teenagers by negatively impacting their mental health, self-worth, and physical safety.
These allegations point to a significant lapse in the company's responsibility toward its younger users.
Meta sued by 33 states for impact on youth mental health
Meta was previously the subject of a separate lawsuit, filed in October 2023 by 33 states, accusing it of contributing to the youth mental health crisis.
In this context, Mr. Torrez said, “Mr. Zuckerberg and other Meta executives are aware of the serious harm their products can pose to young users, and yet they have failed to make sufficient changes to their platforms that would prevent the sexual exploitation of children.”
He also criticized Meta’s executives for failing to self-regulate effectively despite assurances to Congress and the public.
Investigation exposes Meta’s knowledge of inappropriate user content
Much of the evidence for the lawsuit was gathered through an undercover investigation that used decoy accounts posing as children aged 14 and younger.
The recently unredacted documents from this investigation reveal that Meta employees were aware of the inappropriate content shared between adults and minors.
An internal email from 2017 showed resistance within Meta to scanning Messenger for harmful content, with employees citing the competitive disadvantage it would create.
Meta platforms linked to child harassment and grooming issues
Further, a 2021 presentation estimated that at least 100,000 children faced sexual harassment daily on Meta’s messaging platforms.
Additionally, a 2020 document revealed that Instagram lacked safety features present on Facebook, leading to higher rates of grooming.
In March 2021, Instagram took steps to restrict messaging between adults and minors.
Meta’s defense and alleged commitment to safety
In response, Meta said it wants teens to have safe online experiences and noted that it has spent a decade working on these issues.
The company emphasized using sophisticated technology, hiring child safety experts, and cooperating with law enforcement.
According to Meta, the complaint mischaracterizes its efforts, using selective quotes and cherry-picked documents.
Reporting obligations and future safety measures
Under U.S. law, Meta is required to report instances of CSAM to the CyberTipline run by the National Center for Missing & Exploited Children (NCMEC). The company has submitted millions of reports, accounting for about 85 percent of all reports NCMEC received in 2022.
Additionally, Meta announced plans to restrict content that teenagers can see on its platforms, aiming to provide a more age-appropriate experience.
Mark Zuckerberg, along with other social media CEOs, is scheduled to testify before the U.S. Senate at the end of January on child safety issues.
This upcoming testimony underscores the growing concern and regulatory scrutiny regarding the safety of children on social media platforms.