Elon Musk’s social media platform X, formerly known as Twitter, suffered a setback in its attempt to block a California law requiring social media companies to disclose their content moderation practices. The law mandates that companies with substantial annual revenue file semiannual reports detailing their content moderation actions. Rejecting the company’s argument that the law infringes on its free speech rights, U.S. District Judge William Shubb denied its request, finding the reporting requirement justifiable under the First Amendment. The ruling could shape how social media platforms operate and disclose their content moderation strategies.
X had sued California in September, contending that the content moderation law violated both the First Amendment of the U.S. Constitution and the state constitution. The law aims to increase transparency by compelling social media companies to disclose how they define and handle objectionable posts. Judge Shubb acknowledged that the reporting requirement places a burden on social media companies but found it justified by the state’s interest in transparency and accountability.
The court decision comes amid ongoing challenges for X: its monthly U.S. ad revenue has reportedly declined by at least 55% year-over-year since Elon Musk took control in October 2022, and many companies have paused advertising on the platform, raising concerns about its financial health. X also faces scrutiny from the European Union, which is investigating the company for suspected breaches of its content moderation obligations, particularly concerning posts that followed Hamas’s attacks on Israel. The investigation is the EU’s first formal probe under the Digital Services Act.
California’s content moderation law requires social media companies to issue regular reports describing how they handle objectionable content. While this imposes a substantial compliance burden, the court found the requirement justified under the First Amendment. The decision may influence how other states approach content moderation transparency, shaping the broader landscape of social media governance.
As X navigates these legal challenges and regulatory scrutiny, the company has emphasized its commitment to complying with the Digital Services Act in Europe, saying it aims to cooperate with regulators and address concerns about its content moderation policies. The case, X Corp. v. Bonta, is scheduled for further proceedings, with a conference between the parties and the judge set for February 26. The outcome will likely shape the debate over free speech, transparency, and accountability in social media content moderation.