In a recent interview at the Code Conference, Yoel Roth, the former head of trust and safety at Twitter, warned that X, the platform formerly known as Twitter, is on a collision course with the European Union’s (EU) Digital Services Act (DSA). Roth pointed to X’s decision to withdraw from the EU’s Code of Practice on Disinformation and to Commissioner Věra Jourová’s criticism of the platform as the worst offender for spreading disinformation. A collision with EU regulators is now widely seen as inevitable.
Roth also flagged X’s cuts to staff dedicated to election integrity as a sign of a waning commitment to safety and trust. Compliance with the DSA is crucial: failure to address disinformation and safety issues could bring heavy penalties and ultimately cost X access to the EU market.
X’s Departure from EU Code of Practice Raises Eyebrows
X’s withdrawal from the EU’s Code of Practice on Disinformation in May raised concerns about the platform’s commitment to combating systemic threats such as disinformation. As a very large online platform (VLOP) under the DSA, X is legally obligated to address these threats. The EU has made clear that adherence to the voluntary Disinformation Code will be a factor in evaluating whether larger platforms comply with the legally binding DSA.
Penalties and Consequences for Non-Compliance
The DSA provides for penalties of up to 6% of global annual turnover for confirmed breaches of the online governance regime. Beyond financial sanctions, the European Commission can also block the services of platforms that repeatedly fail to comply with the rulebook. X could therefore lose access to the EU market altogether if it fails to address safety issues and tackle political disinformation effectively.
Concerns About Election Integrity and Safety
Roth raised concerns about the drastic reduction in X’s election integrity team, which reportedly lost half its members, even as the platform publicizes its efforts to combat threats to elections. The departure of key staff casts doubt on X’s ability to tackle political disinformation and safeguard the integrity of elections.
Abandoning the Rule of Law
Roth expressed disappointment with X’s decision-making under Elon Musk’s leadership and with what he described as the platform’s departure from the rule of law: the abandonment of the operating principles and policies that once enforced trust and safety. His personal experience with death threats that remain on X, even after his departure, raises questions about the company’s commitment to improving safety.
The Need for Transparency and Independent Research
Roth criticized X’s lack of transparency about the prevalence of hate speech, abuse, and disinformation on the platform. He questioned the reliability of studies released by X and called for rigorous peer review of the platform’s data. X’s restrictions on data access for independent researchers also conflict with the EU’s demands for public-interest research into algorithmic effects.