X Corp, owned by Elon Musk, has launched a legal challenge against Minnesota's law targeting deepfakes in elections, arguing that it infringes on free speech rights. The lawsuit, filed in Minnesota federal court, contends that the law is unconstitutional, overly vague, and creates an environment ripe for censorship.
The Minnesota law, passed in 2023, aims to curb election misinformation by criminalizing the dissemination, within 90 days of an election, of AI-generated deepfakes that falsely depict political candidates. Deepfakes are defined as manipulated media, including video, images, and audio, realistic enough that a reasonable person would believe them to be genuine. Violators who knowingly share deepfakes with the intent to harm a candidate or influence an election could face fines and even jail time.
X Corp's lawsuit claims the Minnesota law violates the First and Fourteenth Amendments of the U.S. Constitution, as well as the Minnesota Constitution. The company argues the law's vague wording makes it difficult for social media platforms to determine what content is prohibited, leading to "blanket censorship" of legitimate political speech. X Corp also argues that the law conflicts with Section 230 of the Communications Decency Act, which generally protects social media companies from liability for user-generated content. The company is seeking a permanent injunction to prevent the enforcement of the statute.
Musk, a self-proclaimed "free speech absolutist," has been vocal about his commitment to allowing a wide range of expression on X. Since acquiring the platform, he has rolled back content moderation policies and reinstated previously banned accounts. X Corp's lawsuit against Minnesota aligns with Musk's broader stance on free speech, which prioritizes open discourse even when it includes potentially misleading or offensive content. X argues that its "Community Notes" feature, which lets users add context to and flag problematic content, is a sufficient tool for addressing deepfakes and misinformation. The company also suggests that its AI chatbot can help explain content to viewers.
The debate around deepfakes and their potential to manipulate voters has led at least 22 states to enact some form of prohibition on their use in elections. Proponents of these laws argue that they are necessary to protect the integrity of the democratic process. However, critics like Musk and X Corp contend that such laws can stifle free speech and give the government too much power to regulate online content.
This is not the first legal challenge to Minnesota's deepfake law. A Republican state lawmaker and a social media influencer previously filed a similar lawsuit, arguing the law could lead to criminal prosecution even for political satire or parody. A judge rejected their request for a preliminary injunction, but the case is still ongoing. X Corp's lawsuit is the first such challenge from a major social media platform, and its outcome could have significant implications for the regulation of online political speech.
The case highlights the ongoing tension between combating misinformation and protecting free speech rights in the digital age. As AI technology advances, the debate over how to regulate deepfakes and other forms of manipulated media is likely to intensify, with social media platforms, lawmakers, and the courts grappling with the complex legal and ethical issues involved. Minnesota is not X Corp's only legal fight: the company has also sued over California's AB 587, a law requiring social media companies to disclose their content moderation policies, and has challenged the Indian government's content-blocking practices.