EU Policy. X urged to answer content moderation questions

Workers install lighting on an "X" sign atop the company headquarters, formerly known as Twitter, in San Francisco. Copyright Noah Berger/Copyright 2023 The AP. All rights reserved.
By Cynthia Kroet

The platform has cut its language coverage within the bloc by four languages.


Online platform X has received a request for additional information about its decision to reduce its content moderators by 20% compared to October last year, the European Commission said today (8 May).

The move is the next step in an ongoing investigation, opened by the Commission in December 2023, into X's handling of risk management, content moderation, dark patterns, advertising transparency and data access for researchers under the EU's Digital Services Act (DSA).

The Commission said today that it wants to know more about why the platform reduced its linguistic coverage within the bloc from 11 EU languages to seven, figures the platform revealed in its second transparency report, a document required under the DSA.

X removed moderators for Bulgarian, Croatian, Latvian and Polish coverage, leaving language experts for Dutch, English, French, German, Italian, Portuguese and Spanish, the latest report said. 

When the first reports were submitted in October, platforms came under scrutiny over the low number of content moderators they employed for some of the smaller EU member states.

Facebook has a single employee reviewing Maltese content and three for Estonian, and says that much of the process is automated. By comparison, TikTok, which has fewer monthly users, has six people reviewing Estonian content and none for Maltese.

Penalties

The EU executive's request also touches on the risk assessment conducted by X in relation to the implementation of generative AI tools in the EU, the Commission said. X must provide the information on content moderation and generative AI by 17 May, and answer the remaining questions – on the dissemination of illegal content and the protection of fundamental rights – by 27 May.

If X fails to reply, the Commission could formally require the information by decision. Under the DSA, the EU executive can also impose fines for incorrect, incomplete or misleading information, and failure to reply by the deadline can lead to periodic penalty payments.

Under the DSA, which came fully into force in February, platforms with more than 45 million average monthly users in the EU are required to follow strict rules, such as transparency requirements and the protection of minors online.

Besides the X probe, the Commission has started investigations and preliminary probes into Meta, TikTok and AliExpress.
