'Speech-police regime': Big Tech trade group sues Minnesota over social media transparency law
Social media companies like Meta, Snapchat and X are scheduled to start sharing more information with consumers about usage rates and content practices under a new Minnesota transparency law that took effect Tuesday.
A powerful trade association and lobbying firm representing Big Tech wants to block the requirements, arguing in a federal lawsuit that the state’s regulations violate free speech rights protected by the First Amendment.
The law requires social media companies to publicly disclose user engagement statistics, content assessment, notification practices, interaction rates between user accounts and product experiments. It passed in 2024, as Minnesota and other states have sought to regulate some aspects of online conduct.
Minnesota Attorney General Keith Ellison has released two reports showing the potential harms of artificial intelligence and social media on the mental health of Minnesotans. The latest report found that children are frequent targets of harassment and online bullying, and that other users frequently receive graphic or disturbing content or become targets of unwanted contact and fraud.
The lawsuit challenging Minnesota’s law mirrors other legal challenges that have come up across the country in recent years as state lawmakers seek to regulate social media and Big Tech, absent broader federal oversight.
Similar to lawsuits filed by Big Tech against laws in New York and California, the trade group NetChoice argues in the Minnesota case that the First Amendment’s wide protections on free speech shield social media companies from making mandatory disclosures. Many of the arguments draw on court precedents on free speech protections involving news publications.
Minnesota’s law is unconstitutional because it requires sharing information about “editorial” decisions in order to publish content, the lawsuit contends. NetChoice also argues the state is attempting to dictate which messages are publicized and which are not, a core part of a Supreme Court case last year involving the Big Tech lobby. The lawsuit also contends that several provisions of Minnesota’s law are vague and likely unenforceable.
NetChoice accuses Ellison of running a “speech-police regime” by requiring the disclosures.
In a statement Tuesday, Ellison said the reforms in Minnesota came after residents of all ages expressed the need and he looks forward to defending the law in court.
“For too long, social media companies have tested powerful and addictive new algorithms and features on Minnesotans effectively in secret, and with no oversight by their parents or the public,” Ellison said, adding that state residents — and especially kids — “deserve better than to be manipulated and exploited for profit by Big Tech.”
One key difference in Minnesota’s law compared with others is a required disclosure of the specific algorithms that social media companies use, said Paul Taske, NetChoice’s associate director of litigation.
These algorithms dictate how companies disseminate content to users, Taske said, and if made publicly available could enable bad actors or hamper U.S. companies from remaining competitive. The tech trade group has said Minnesota’s law would force social media companies to compromise trade secrets.
Taske said Minnesota’s law puts government pressure on websites with the goal of changing how machine algorithms disseminate posts to social media users.
“What the Supreme Court has been very clear on in the past and in recent years is that the government can’t do that directly, and it can’t do it indirectly either,” Taske said.
Minnesota’s law contains a broad enforcement provision allowing the attorney general to take unspecified actions against companies that fail to comply.
Jane Kirtley, a media ethics and law professor at the University of Minnesota, said challenges arguing that a law is vague or overly broad have succeeded in the past, with courts finding that such laws violate the First Amendment. If a person cannot be sure what expressive activities are prohibited under a law, she said, the Supreme Court has concluded that people would be more likely to self-censor or avoid conduct that might run afoul of it — thereby chilling free speech.
Kirtley also said that as states have attempted to regulate social media, judges have ruled that First Amendment protections must remain intact even as new technology changes the way people communicate and find information online.
“Just because you’re dealing with these new digital platforms doesn’t mean that the First Amendment principles go away,” she said.
©2025 The Minnesota Star Tribune. Visit at startribune.com. Distributed by Tribune Content Agency, LLC.