The New York Times warns that free speech threatens public health and democracy

Are federal officials violating the First Amendment when they pressure social media companies to suppress “misinformation”? That’s the question raised by a federal lawsuit filed last May by the attorneys general of Missouri and Louisiana.

New York Times reporter Steven Lee Myers warns that the lawsuit “could derail the Biden administration’s already struggling efforts to combat misinformation.” He worries that “the First Amendment, for better or worse, has become a stumbling block to virtually any government effort to quell a problem that, in the case of a pandemic, threatens public health and the integrity of elections, even democracy itself.” As Myers frames the issue, free speech is a threat to “public health” and even to “democracy itself.”

There is no denying that when people are free to express their opinions, no matter how misguided, ill-informed or malicious, some of them will say things that are misleading, demonstrably false or divisive. Yet the First Amendment guarantees them the right to say those things, based on the premise that the dangers posed by unfettered speech are preferable to the dangers posed by government efforts to regulate speech in the public interest.

Myers may disagree with that calculation or lament its implications. But the First Amendment expressly prevents the government from banning speech it deems dangerous to public health or democracy. The plaintiffs in Missouri v. Biden, who include individual social media users represented by the New Civil Liberties Alliance (NCLA), argue that federal officials have violated the First Amendment by trying to accomplish that goal indirectly, blurring the distinction between private moderation and state censorship. As NCLA attorney Janine Younes told the Times, the government “can’t use a third party to do something it can’t do.”

Myers is not buying it. He maintains that the private communications the plaintiffs view as evidence of censorship by proxy actually show that social media companies made independent decisions about which speech and speakers they are willing to allow on their platforms.

The emails were produced during discovery in response to an order by U.S. District Judge Terry A. Doughty, whom Myers portrays as biased against the Biden administration. He notes that Doughty was “appointed by [Donald] Trump in 2017″ and “previously blocked the Biden administration’s national vaccination mandate for health care workers and overturned a ban on new federal leases for oil and gas drilling.” In this case, Myers says, Doughty “granted the plaintiffs’ request for extensive discovery even before considering their request for a preliminary injunction.”

Myers also suggests that the plaintiffs are motivated by dubious ideological grievances. “Their claims,” he says, “reflect a narrative that has taken root among conservatives that the nation’s social media companies have colluded with government officials to discriminate against them, despite evidence to the contrary.”

While Myers implies that the lawsuit and Doughty’s handling of it were driven by partisan animosity, he notes that “many of the examples cited in the lawsuit also involved government actions taken during the Trump administration, including efforts to combat misinformation before the 2020 presidential election.” That suggests the plaintiffs’ objection to government meddling in moderation decisions goes beyond a desire to score political points.

Emails released through the lawsuit, like the internal Twitter communications that Elon Musk shared with reporters, indicate that social media platforms were generally eager to address content concerns raised by public health and law enforcement officials. They responded promptly to takedown requests and asked for additional guidance. The tone of the communications is overwhelmingly cordial and cooperative.

The plaintiffs in Missouri v. Biden see that eagerness as troubling. But Myers emphasizes the exceptions. “The growing trove of internal communications,” he writes, “suggests a more complicated and tortured struggle between government officials frustrated by the spread of dangerous falsehoods and company officials who chafed at and often resisted government requests.”

Myers concedes that “government officials” tried to prevent “the spread of dangerous falsehoods” by encouraging Facebook et al. to delete specific posts and ban specific users. He also concedes that the people running these platforms “chafed at and often resisted” those efforts. But he does not think we should be troubled by evidence that officials used their positions to influence moderation decisions, resulting in less speech than otherwise would have been allowed.

Myers also misrepresents the context of these government requests, which matters in assessing the extent to which they succeeded in suppressing disfavored speech. He cites a July 16, 2021, text message in which Nick Clegg, Facebook’s vice president of global affairs, plaintively told Surgeon General Vivek Murthy that “it’s not great to be accused of killing people.”

According to Myers, the comment was prompted by Murthy’s conclusion that COVID-19 “misinformation” had resulted in “avoidable illness and death,” which led him to demand “greater transparency and accountability” from social media companies. Myers does not mention that Clegg sent that message right after President Joe Biden publicly accused Facebook and other platforms of “killing people” by failing to suppress misinformation about COVID-19 vaccines. Nor does he mention that Murthy had just issued an advisory in which he called for a “whole-of-society” effort to address the “urgent threat to public health” posed by “health misinformation,” possibly including “appropriate legal and regulatory measures.”

Myers also omits something else Clegg said in that text message: that he was “interested in finding a way to de-escalate and work together.” What Myers presents as evidence that Facebook “chafed at and often resisted” government requests, in other words, is actually evidence that the platform was desperate to appease an irate president.

Toward that end, Facebook did what Biden and Murthy demanded. “Thanks again for taking the time to meet earlier today,” Clegg wrote in an email to Murthy a week later. “I wanted to make sure you saw the steps we took just this past week to adjust policies on what we are removing with respect to misinformation, as well as steps taken to further address the ‘disinfo dozen.'” He boasted that the company had removed objectionable pages, groups, and Instagram accounts; taken steps to make several pages and profiles “more difficult to find on our platform”; and “expanded the group of false claims we remove to keep up with the latest trends.”

As White House spokesman Robin M. Patterson described it, the administration simply urged Facebook et al. to enforce “their own policies to combat misinformation and disinformation.” But federal officials also pushed social media platforms to broaden their definitions of those categories. And according to Clegg, Facebook responded to Biden’s accusation that it was killing people by “adjust[ing] policies on what we are removing with respect to misinformation.”

Myers thinks there is nothing to see here. The legal challenge for the plaintiffs, he says, is to show that the government “used its legal or regulatory powers to punish companies when they don’t comply.” But the companies generally did comply, and it is not much of a stretch to suggest that they did so because they anticipated how those “legal or regulatory powers” might be deployed against them.

“As evidence of pressure,” Myers writes, “the lawsuit cites instances when administration officials have publicly suggested that companies may face greater regulation.” In a statement to the Times, for example, Patterson “reiterated President Biden’s call for Congress to reform Section 230 of the Communications Decency Act, a law that broadly shields Internet companies from liability for what users post on their sites.” But Myers suggests that fears of losing those protections are unwarranted, since the Biden administration “couldn’t repeal the law on its own” and “Congress has shown little appetite to revisit the issue, despite calls from Mr. Biden and others for greater accountability for social media companies.”

Since weakening or repealing Section 230 is a bipartisan cause, it is hardly crazy to think that angering federal officials by refusing to “cooperate” would make such legislation more likely. Complaints about unchecked misinformation would bolster Biden’s argument that “greater accountability” requires increased exposure to liability and make Congress more inclined to agree.

Even without new legislation, the administration could make life difficult for social media companies through regulation, lawsuits and antitrust enforcement. As Myers sees it, none of this would be a problem unless officials explicitly threatened companies with retaliation and then delivered on those threats. That standard would leave the government free to regulate online speech as long as it never engages in blatant extortion.