Elon Musk’s recently rebranded social media platform, X, is making headlines again, but not for the reasons the tech mogul had hoped. The Australian e-Safety Commission has fined the platform A$610,500 (approximately US$386,000) for refusing to cooperate with an investigation into its child abuse prevention practices. The development poses a significant challenge for X, which has been grappling with declining revenue and criticism of its content moderation policies.
The e-Safety Commission’s decision to fine X, known as Twitter before Musk acquired it, stems from the platform’s failure to answer inquiries about how it handles reports of child abuse material and the methods it uses to identify such content. That lack of cooperation during the investigation has raised questions about X’s commitment to combating illegal content.
Although the financial penalty may seem modest compared with the $44 billion Musk paid for the platform in October 2022, it carries substantial implications for X’s reputation. Advertisers have already been reducing their spending on the platform, citing its marked reduction in content moderation and the reinstatement of previously banned accounts.
In addition to the Australian investigation, the European Union has launched its own inquiry into X for potential violations of tech regulations, particularly related to disinformation linked to the Hamas attack on Israel.
The Australian e-Safety Commission holds the power to compel internet companies to provide information about their online safety practices. Non-compliance can lead to fines, and if X declines to pay, the regulator may pursue the matter in court.
Elon Musk had publicly stated that “removing child exploitation is priority #1” after taking the company private. However, the regulator identified inconsistencies in X’s responses. When questioned about its efforts to prevent child grooming, X claimed that it was “not a service used by large numbers of young people.” The platform also contended that the available anti-grooming technology was not sufficiently capable or accurate for deployment on its platform.
The e-Safety Commission also issued a warning to Alphabet’s Google for its non-compliance with requests for information regarding the handling of child abuse content. Some of Google’s responses were characterized as “generic” by the regulator. Google expressed disappointment with the warning but reiterated its commitment to collaborating on online safety.
X’s non-compliance with the regulator’s requests, however, was seen as more serious. The platform failed to answer questions about its response times to child abuse reports, its efforts to detect abuse in live streams, and its staffing levels for content moderation, safety, and public policy. X admitted that it had cut its global workforce by 80% and no longer maintained public policy staff in Australia following Musk’s takeover. X also said it did not use tools to detect child abuse material in private messages because the technology was still at a developmental stage.
In light of these developments, X’s future remains uncertain, with both financial and reputational challenges ahead as it navigates the complex landscape of content moderation and online safety.