Ireland’s media regulator, Coimisiún na Meán, has launched a formal investigation into Elon Musk’s social media platform, X (formerly Twitter), to determine whether the company is complying with European Union (EU) law regarding content moderation and user appeals. The probe focuses on whether X provides users with a clear, accessible, and effective way to challenge decisions made about content they report, and whether the platform adequately informs users about the outcomes of those reports.

The Digital Services Act and User Rights

The investigation stems from concerns that X is violating the EU’s Digital Services Act (DSA), a landmark regulation designed to hold large online platforms accountable for illegal and harmful content. The DSA mandates that platforms like X establish robust internal complaint-handling systems, allowing users to report issues and appeal moderation decisions. The core principle is transparency and user control over content. Without such mechanisms, regulators argue, platforms can operate with impunity, potentially amplifying misinformation or failing to protect vulnerable users.

Key Areas of Investigation

Coimisiún na Meán will specifically examine three crucial aspects of X’s operations:

  1. Appeal Mechanisms: Does X provide a straightforward process for users to challenge content moderation decisions, particularly when content seemingly violates the platform’s own terms of service?
  2. Reporting Feedback: Are users informed about the outcome of their reports, and are they given explanations for why content was or wasn’t removed?
  3. Accessibility of Complaints: Is the complaints-handling system easy to find and use, or is it buried in complex menus or legal jargon?

Potential Consequences for X

If the investigation finds X in violation of the DSA, the platform could face substantial financial penalties. The regulator has the authority to impose fines of up to 6% of X’s global annual turnover, which could amount to hundreds of millions of euros. The DSA is designed to be a serious deterrent.

Beyond fines, X could also be forced to enter into a legally binding agreement with Coimisiún na Meán to address the identified compliance issues. This could involve overhauling its content moderation processes, increasing transparency, and improving user support.

Previous Scrutiny of X

This is not the first time Coimisiún na Meán has scrutinized X’s operations. In June, the regulator requested detailed information about how the platform protects children from harmful content. The DSA places a particularly high burden on platforms to safeguard minors.

Earlier in 2024, another investigation was launched to assess whether X and other major platforms make it easy for users to report illegal content and provide clear contact points for complaints.

Last year, the European Commission preliminarily found X in breach of the DSA regarding digital advertising, data access, and manipulative “dark patterns” designed to influence user behavior without informed consent.

Broader Implications

The Irish investigation into X is part of a wider EU effort to enforce the DSA and hold large online platforms accountable. The DSA’s goal is to create a safer and more transparent online environment for users. The EU believes platforms have a responsibility to protect their users from harmful content and ensure fair treatment.

The outcome of this investigation will likely set a precedent for how other platforms operating in the EU must comply with the DSA. If X is found in violation, it could trigger similar enforcement actions against other companies that fail to meet the law’s standards.

Ultimately, the investigation underscores the growing regulatory pressure on social media platforms to balance free speech with user safety and transparency. The DSA represents a significant shift in how online content is governed, and X, like other major platforms, must adapt to the new rules or face serious consequences.