A newly unredacted version of the multi-state lawsuit against Meta, formerly known as Facebook, alleges a troubling pattern of deception and minimization in how the company handles children under 13 on its platforms. The lawsuit, filed last month, accuses Meta of violating the Children’s Online Privacy Protection Act (COPPA) and highlights discrepancies between the company’s public statements and internal practices.
Key Takeaway
A recently unredacted lawsuit against Meta alleges that the company knowingly hosts children under 13 on its platforms in violation of COPPA. The filing contrasts Meta’s public statements with its internal practices, raising concerns about how the company handles underage users and about its platforms’ impact on children’s mental health.
COPPA Violations
The lawsuit alleges that Meta neither obtains nor attempts to obtain verifiable parental consent before collecting the personal information of children on Instagram and Facebook, despite knowing that millions of kids under 13 use its platforms. Internal documents obtained by the attorneys general of 42 states provide evidence of Meta’s knowledge of underage users and its failure to enforce its own age restrictions.
According to the lawsuit, Meta has internally tracked and documented under-13 users on its platforms for years, contrary to its public claims that users under 13 are not welcome. CEO Mark Zuckerberg reportedly received a report in 2018 estimating that approximately 4 million people under 13 were on Instagram in 2015. Meta also possesses data from 2020 indicating that a significant percentage of children aged 6–12 had used Facebook.
Ignoring Reports of Underage Accounts
The lawsuit also highlights Meta’s alleged negligence in handling reports of underage accounts. It claims that reports of accounts belonging to users under 13 are automatically ignored when the account has no photos associated with it, allowing Meta to continue collecting the child’s personal information. Of the 402,000 reports of under-13 accounts in 2021, fewer than 164,000 resulted in the account being disabled. Moreover, disabling an account on one platform does not extend to associated accounts on Meta’s other platforms.
Complexity of Age Verification
Meta has described verifying the age of people online as “a complex industry challenge.” The lawsuit argues, however, that the company could do more to verify parental consent and enforce stricter age restrictions to protect children.
Implications for Children’s Mental Health
One of the central concerns raised in the lawsuit is the impact of Meta’s practices on children’s mental health. It alleges that Meta’s platforms contribute to poor body image and bullying, and that the company has failed to take appropriate measures to address these issues.