EU Warns Meta Over Illegal Content And Disinformation Targeting Israel-Hamas War


The European Union has expressed concern about the circulation of illegal content and disinformation related to the Israel-Hamas war on social media platforms. In an urgent letter addressed to Mark Zuckerberg, founder and CEO of Meta, the parent company of Facebook and Instagram, the EU's internal market commissioner emphasized the need for strict compliance with the Digital Services Act (DSA). The EU has given Meta 24 hours to respond to these concerns.

Key Takeaway

The European Union has warned Meta, the parent company of Facebook and Instagram, about the dissemination of illegal content and disinformation regarding the Israel-Hamas war. Meta has been given 24 hours to respond to the EU’s concerns. The EU emphasizes compliance with the Digital Services Act and the need for effective measures to mitigate the spread of such content.

Increasing Concerns

The EU’s internal market commissioner, Thierry Breton, has published letters to both Elon Musk, owner of X (formerly Twitter), and Mark Zuckerberg, raising concerns about the dissemination of illegal content and potentially harmful disinformation. Following the recent terrorist attacks by Hamas against Israel, there has been a surge in illegal content and disinformation on certain platforms within the EU. The EU calls on Meta to remain vigilant, ensure compliance with the DSA, act promptly on notices of illegal content, and take effective measures to mitigate its spread.

Meta’s Response

Meta had not responded to Breton’s warning at the time of writing. However, the company has established a special operations center to monitor and respond to the situation. Meta says its teams are working to keep its platforms safe, remove content that violates its policies or local law, and collaborate with fact-checkers to limit the spread of misinformation.

Concerns about Disinformation in Elections

Breton’s letter to Zuckerberg also addresses concerns about the spread of disinformation targeting European elections. While Meta has taken steps to strengthen its mitigation measures, reports have surfaced of deepfakes and manipulated content circulating on its platforms. The EU urges Meta to take seriously the risk that fake and manipulated images and claims could be amplified, especially with elections approaching in several European countries.

DSA and the Obligations for Larger Platforms

The Digital Services Act imposes extensive obligations and governance controls on larger platforms, including Meta-owned Facebook and Instagram. These platforms are expected to proactively identify and mitigate systemic risks, such as political disinformation, in addition to acting promptly on reports of illegal content. Breaches of the DSA can result in fines of up to 6% of global annual turnover.

Concerns about Deepfake Disinformation

Political deepfakes, which have become easier and cheaper to produce thanks to advances in generative AI, are a particular concern for the EU. The bloc is taking steps to address the issue and will be engaging with AI giant OpenAI to discuss potential safeguards. The EU recognizes the role social media platforms play in disseminating deepfake disinformation and aims to tackle it as part of its wider effort to combat misinformation.
