In a rapid litmus test of the European Union's newly upgraded content moderation rules, the bloc has issued a stern warning to Elon Musk's X (formerly Twitter) for its alleged failure to combat illegal content in the aftermath of Saturday's deadly attacks on Israel by Hamas terrorists operating from the Gaza Strip.
The European Commission has also expressed concerns regarding the proliferation of disinformation on X, particularly related to the terrorist attacks and their repercussions.
Unlike terrorism-related content, disinformation itself is not inherently illegal in the EU. However, the EU's Digital Services Act (DSA) mandates that X, classified as a "very large online platform," must address the risks posed by harmful falsehoods and diligently respond to reports of illegal content.
Graphic videos, purportedly depicting terrorist attacks on civilians, have been circulating on X since Saturday. Some of this content includes posts claiming to show footage of attacks within Israel or Israel's retaliatory strikes on Gaza Strip targets. These claims have been debunked by fact-checkers.
The attacks by Hamas on Israeli civilians and tourists, launched after militants breached border fences to mount surprise assaults, prompted Israel's prime minister to declare a "state of war." Israel's military has responded with numerous missile strikes on targets in the Gaza Strip.
Several videos posted on X since the attacks have been identified as entirely unrelated to the conflict, including footage from Egypt captured last month and even a clip from a video game falsely claimed to depict Hamas missile attacks on Israel.
A Wired report recently highlighted the chaotic environment on Musk's platform, aptly titled "The Israel-Hamas War Is Drowning X in Disinformation."
At one point, Musk even recommended following accounts that had previously posted antisemitic comments and false information, although he subsequently deleted the tweet containing the suggestion.
The challenge for Musk lies in the DSA's regulations, which dictate how social media platforms and other user-generated content services must respond to reports of illegal content, including terrorism, and which legally obligate larger platforms such as X to counter disinformation risks.
This unfolding and volatile situation in Israel and Gaza serves as a real-world test of the EU's revamped regulations: are they robust enough to confront one of tech's most notorious provocateurs?

Notably, Musk, who has owned the platform since last fall, has made several changes that have significantly degraded the quality of information accessible on X. These include ending legacy account verification and turning the Blue Check into a pay-to-play scheme. Musk has also overhauled content moderation policies, reduced in-house enforcement teams, and promoted a decentralized, user-driven alternative called "Community Notes" — an approach that essentially outsources the responsibility for addressing complex issues like disinformation to users, potentially fueling engagement and confusion alike by perpetuating extreme relativism. Musk has also withdrawn X from the EU's Code of Practice on Disinformation, a clear challenge to EU regulators.