EU Charges Elon Musk’s X for Breaching Digital Services Act


European regulators have charged Elon Musk’s X with breaching the Digital Services Act (DSA), accusing the platform of misleading users among other violations. This marks the first time the European Commission has issued preliminary findings under the DSA.

The European Commission’s investigation found that X has failed to comply with key transparency requirements of the DSA. The platform is accused of employing “dark patterns” to deceive users, failing to maintain an adequate ad repository, and obstructing researchers’ access to data. Dark patterns are subtle design choices intended to nudge consumers into giving up personal data or making other decisions that benefit the company. A common example is highlighting the button that accepts tracking in bright colors while downplaying the opt-out option with smaller text or less prominent placement.

Additionally, the European Commission raised concerns about X’s approach to “verified” accounts. According to the findings, the process for obtaining verified status does not correspond to industry practice and deceives users. Whereas blue checks have traditionally signaled trustworthy sources of information, on X anyone can obtain verified status simply by subscribing. The Commission says there is evidence that malicious actors have abused the blue check to deceive users, undermining the platform’s integrity.

If the Commission’s preliminary findings are confirmed, X could face a fine of up to 6% of its global annual turnover. This potential penalty underscores the seriousness of the alleged violations and the EU’s commitment to enforcing the DSA’s provisions.

The Digital Services Act, whose obligations for the largest online platforms took effect in August 2023, aims to create a safer and more transparent online environment. Among its many rules, it explicitly bans dark patterns, so that companies cannot subtly manipulate users into actions that compromise their personal data or lead to other unfavorable decisions.

The findings against X are the result of an ongoing investigation launched by EU regulators in December 2023. The probe extends beyond transparency and user deception: regulators are also scrutinizing X’s content moderation practices, seeking to determine whether the platform has breached the DSA by allowing the dissemination of illegal content and failing to combat misinformation effectively.

The formal investigation followed EU officials’ initial questioning of X in late 2023, driven by growing concerns about Hamas-affiliated accounts on the platform after the terror group’s attacks against Israel on October 7, 2023. Those accounts raised alarms about X’s effectiveness in moderating content and ensuring the platform’s safety and reliability.

X has not responded to requests for comments regarding these allegations. The platform’s silence on the matter has only heightened scrutiny and speculation about its compliance with the DSA and its overall commitment to transparency and user protection.

The Digital Services Act represents a significant step forward in regulating online platforms and protecting users from deceptive practices. By enforcing stringent transparency and content moderation standards, the EU aims to foster a safer digital environment where users can trust the information they encounter and feel secure in their online interactions.

As the investigation continues, the spotlight remains on X and its practices. The outcome of this probe could have far-reaching implications not only for X but for other digital platforms operating within the EU. The findings serve as a reminder that compliance with the DSA is not optional, and platforms must prioritize user protection and transparency to avoid severe penalties and maintain their credibility.

The European Commission’s preliminary findings against X highlight significant concerns about the platform’s transparency and content moderation practices. As the investigation progresses, the tech world watches closely, understanding that the enforcement of the Digital Services Act could reshape the landscape of online platform regulation and user protection.