
What U.S. Policymakers Can Learn from the European Union’s Probe of Meta


With the announcement of its latest investigation of a global social media company – this time, Meta – the European Union is providing an illuminating lesson on how to regulate tech behemoths without threatening free speech. One would like to think that U.S. politicians and policymakers are taking notes. Unfortunately, that’s probably a fanciful hope.

On April 30, the European Commission, the E.U.’s executive arm, said in a press release that it has “opened formal proceedings” to assess whether Meta’s Facebook and Instagram platforms have breached the Digital Services Act (DSA), a Europe-wide law that took full effect in February 2024 and is designed to deter online manipulation and force tech companies to take greater responsibility for their impact on elections and other aspects of civic life.

Specifically, the Commission said it is investigating “suspected infringements” related to “deceptive advertising and political content” on Meta’s platforms, as well as the company’s deprecation of CrowdTangle – a tool that formerly provided outsiders, including journalists and researchers, with insight into how content spreads on those services. The Commission added that, based on preliminary assessments, it “suspects” that Meta’s external and internal mechanisms for flagging illegal content “are not compliant with the requirements of the Digital Services Act and that there are shortcomings in Meta’s provision of access to publicly available data to [outside] researchers.”

European regulators are clearly trying to pressure Meta to invigorate its self-policing of disinformation, including content generated by artificial intelligence. The timing is no accident. In early June, the E.U.’s 27 member States will hold elections for representatives serving in the European Parliament. The Kremlin has been targeting many of those countries with political disinformation and is expected to step up its online propaganda efforts in an attempt to discourage support for Ukraine in its defensive war against Russian President Vladimir Putin’s forces.

Commission President Ursula von der Leyen’s written statement about the investigation is worth quoting at length:

This Commission has created means to protect European citizens from targeted disinformation and manipulation by third countries. If we suspect a violation of the rules, we act. This is true at all times, but especially in times of democratic elections. Big digital platforms must live up to their obligations to put enough resources into this and today’s decision shows that we are serious about compliance.

A Law with Teeth

The DSA has teeth. The Commission can fine companies up to 6 percent of their global revenue and has the authority to interview company officials and even raid corporate offices. E.U. regulators are already investigating the content policies and practices of TikTok and X, formerly known as Twitter. 

In its response to the Commission’s announcement, Meta said in a statement: “We have a well established process for identifying and mitigating risks on our platforms.” It added: “We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”

In contrast to their European counterparts, U.S. lawmakers, with one striking exception, have failed for more than a half-dozen years to pass any of the myriad bills proposed to rein in major tech companies in this country. The exception is the measure that U.S. President Joe Biden signed into law on April 25, which requires ByteDance, the Chinese parent of TikTok, to sell the short-video platform within nine months or face a sweeping ban of the service in the United States.

The highly unusual TikTok sale-or-ban law reflects heightened geopolitical tension between Beijing and Washington, as well as the Chinese government’s practice of exerting influence over tech companies operating in China. The U.S. State Department issued a report last year finding that China “employs a variety of deceptive and coercive methods,” including “propaganda, disinformation and censorship,” to “influence the international information environment.” 

TikTok has vowed to challenge the new U.S. law as an unconstitutional government restraint on free speech under the First Amendment. That argument is at least plausible, if not necessarily one that the U.S. judiciary will embrace when it weighs the government’s claim that China could use the platform to try to interfere in U.S. elections. Past attempts by the Trump administration and the state of Montana to ban TikTok were blocked by federal courts.

But setting aside the singular dispute over TikTok, the striking thing about U.S. regulation of social media at the national level is its absence. This regulatory vacuum is typically ascribed to two conditions: the extreme political polarization that renders the U.S. Congress dysfunctional on so many fronts and the First Amendment’s instruction that “Congress shall make no law…abridging the freedom of speech.”

Passing First Amendment Muster

European nations do not operate under as rigid a prohibition on government regulation of speech as the First Amendment imposes, an important factor explaining how the E.U. managed to enact the DSA. But the newly unveiled investigation of Meta illustrates that, possibly with modest modification, European-style regulation could pass muster under the First Amendment.

The foundation of the DSA is a range of provisions requiring social media platforms to disclose how they address problems like deceptive political advertising and other kinds of misleading or hateful content. The European Commission noted in its announcement that the opening of the Meta probe was based on a “risk assessment report” that Meta, like all other large social media companies, was required to file in 2023, as well as on the company’s responses to the Commission’s follow-up requests for additional information.

First Amendment absolutists might be skeptical of this sort of mandatory disclosure, seeing it as a precursor to intrusive regulatory action. But there’s a strong argument under existing free speech doctrine that requiring businesses to reveal factual information about how they operate does not constitute censorship or anything close to it. Companies in numerous regulated industries – from airlines to chemicals – are routinely subjected to disclosure requirements, so using this approach would not be novel.

In fact, from what we know so far, nothing about the E.U. investigation of Meta would violate First Amendment strictures. The regional body’s regulators are not dictating that Meta or other social media companies adopt particular policies, let alone specific content practices or decisions. Instead, the E.U. appears to be interested in whether these companies, in general, are providing the kind of resources, personnel, and digital tools that are needed to mount a vigorous defense against manipulation by the likes of Russia or China. 

One or another E.U. demand might turn out to stray over the First Amendment line if it were examined in a U.S. court. But in the main, the European authorities seem concerned about whether powerful social media companies are providing procedurally adequate protections against disinformation and other harmful content that the companies themselves profess not to want on their platforms.

Enhancing Consumer Protection

In this sense, early efforts to enforce the DSA shed light on what is at least theoretically possible in the United States. The NYU Stern Center for Business and Human Rights, where I work, has urged Congress to enhance the consumer protection authority and resources of the U.S. Federal Trade Commission so that the FTC could demand “procedurally adequate” safeguards from social media companies, based on a disclosure regime roughly similar to the one the DSA imposes. So long as the FTC were barred from dictating substantive policies or content decisions, this approach ought to survive First Amendment scrutiny. Full disclosure: less ambitious versions of this idea have appeared in some proposed U.S. legislation but haven’t made much progress toward passage.

Under our approach, the U.S. government would not tell platforms what content they could host. Instead, it would require them to institute procedures that follow through on promises they have made in their terms of service and “community standards” to protect users and society at large.

It is too soon to tell whether the DSA will prove to be a successful experiment in regulation. Meta, TikTok, and X doubtless will push back and appeal any adverse findings. It’s not clear whether in this process the European Commission will demonstrate the courage of its convictions. Keeping 27 member States on board won’t be easy. But the Commission seems to be trying to make the DSA meaningful, and that alone is something policymakers in Washington could learn from.

IMAGE: This photo taken on April 27, 2023, in Toulouse, France, shows a screen displaying the Meta logo and the European flag. (Photo by LIONEL BONAVENTURE/AFP via Getty Images)
