News

Meta Faces New Questions In Europe About Child Safety Risks On Instagram

Introduction

Meta, the parent company of Instagram, has received another formal request for information from European Union regulators focusing on child safety concerns on the social network. The request was made under the Digital Services Act (DSA), a recently implemented online rulebook that obliges large platforms to tackle illegal content and protect minors. The European Commission has expressed concern about Meta’s response to child protection issues, particularly the sharing of self-generated child sexual abuse material (SG-CSAM) on Instagram.

Key Takeaway

Meta faces further questions from European regulators about its efforts to address child safety risks on Instagram, including the sharing of self-generated child sexual abuse material (SG-CSAM). The company has been warned of potential heavy sanctions if it fails to take swift action.

The Growing Concern

The European Commission’s latest request comes in the wake of a report by the Wall Street Journal (WSJ) that highlighted Instagram’s struggle to combat CSAM. The report revealed that Instagram’s algorithms were connecting accounts involved in making, buying, and trading underage-sex content. In response to the WSJ’s exposé, the EU warned Meta that it risks severe sanctions if it does not promptly address child protection issues.

Continued Challenges

A subsequent report by the WSJ claims that Meta has not effectively rectified the identified issues. Although the company established a child safety task force, it reportedly failed to prevent its own systems from promoting accounts involved in producing and sharing underage-sex content. Despite removing hashtags associated with pedophilia, Meta’s recommendation systems continue to suggest new ones with minor variations. The company’s track record in removing problem accounts and user groups has also been inconsistent.

Potential Consequences

Meta’s inadequate performance in combating the sharing of illegal CSAM/SG-CSAM and addressing associated child safety risks could prove costly in the EU. Under the DSA, the European Commission has the authority to impose fines of up to 6% of a company’s global annual turnover for breaching the regulation’s rules. In addition to financial penalties, Meta may also face reputational damage if EU regulators continue to question its approach to safeguarding minors.

Request for Information

The European Commission has asked Meta to provide additional information on the measures it has taken to comply with its obligations to protect minors, specifically addressing the circulation of SG-CSAM on Instagram. The request also seeks details about Instagram’s recommender system and the amplification of potentially harmful content. Meta has until December 22 to submit the requested child safety data. Failure to comply with such requests for information (RFIs) can itself result in sanctions under the DSA.

Conclusion

Meta’s handling of child safety risks on Instagram is under scrutiny in Europe, with regulators seeking more information on its efforts to combat illegal content, particularly self-generated child sexual abuse material. The company’s actions and responses to these concerns will be closely monitored, as it faces potentially significant penalties and reputational damage if found to be non-compliant with the DSA.
