Cyber Democracy

Regulating the 21st Century Internet

Facebook recently declined to take down a doctored video of House Speaker Nancy Pelosi, and Twitter has adopted new policies urging users to “get the facts” or read before retweeting. As tech giants like Facebook, Twitter, and Amazon increasingly govern themselves, fundamental questions arise regarding internet regulation and the role of government in our digital world. As the debate regarding Section 230 of the Communications Decency Act highlights, tech platforms, left to their own devices, are in many ways helping to codify what is increasingly an internet bifurcated by political and philosophical beliefs: you’re either in the “Dorsey” or “Zuckerberg” camp.

The United States’ Section 230

Congress passed Section 230 in 1996. At the time, it was a bipartisan effort aimed at protecting free speech and preventing censorship online. However, the growth of companies like Facebook and Twitter has raised new questions about why the government can punish someone for falsely yelling “fire” in a crowded theater but refuses to regulate groups pushing misinformation harmful to the very essence of American democracy.

More than two decades after the implementation of Section 230 — and with many of the same people still in office — fresh debates have emerged about platform liability. A bipartisan consensus for policy reform exists, but the two parties’ reasons for wanting change differ starkly.

Republicans are leading the charge to amend Section 230. They claim that tech platform algorithms are biased against conservatives, which leads to more progressive content in users’ feeds. On June 17, Senator Josh Hawley (R-MO) introduced the Limiting Section 230 Immunity to Good Samaritans Act, which would require that companies with more than 30 million US or 300 million global users, or more than $1.5 billion in worldwide revenue, moderate their content “in good faith” to maintain Section 230 immunity. Senators Marco Rubio (R-FL) and Tom Cotton (R-AR) cosponsored the bill.

Critics of Section 230 note that 120 million Americans saw Russian-backed Facebook posts during the 2016 election.[i] They claim that Facebook CEO Mark Zuckerberg is largely responsible for the company’s refusal to correct the corporate policies that led to the Cambridge Analytica scandal and may have helped Donald Trump win the presidency.

Democrats’ efforts to reform Section 230 have largely concentrated on anti-sex trafficking measures that culminated in FOSTA-SESTA legislation in 2018. This law created an exception to Section 230 by seeking to stem sex trafficking via online platforms such as the now-defunct Backpage.com. While the act received strong support from Senator Kamala Harris (D-CA) and others, the progressive arm of the party, led by Senator Elizabeth Warren (D-MA), has since introduced legislation to review FOSTA-SESTA.

In a 2018 tweet, Senator Ron Wyden (D-OR), one of the two original sponsors of Section 230, wrote, “Without #Section 230, the internet as we know it would shrivel. ONLY those platforms run by people with deep pockets and a deeper bench of lawyers would make it.” Intra-party disputes and fears of potential First Amendment infringements, however, have hampered Democrats’ ability to move forward on the initiative.

Both parties agree that the government should establish baseline operating rules for companies like Facebook, which has nearly 2.5 billion active users — roughly double the membership of the entire Catholic Church.[ii] However, the parties’ divergent reasons for modifying Section 230 could hamper near-term progress.

The EU’s Digital Services Act

In the European Union, content liability is covered by the e-Commerce Directive, a comprehensive set of regulations that dates to 2000. Like Section 230 in the US, the directive does not hold companies such as Facebook or Twitter liable for content posted on their platforms, though they may choose to take down malicious material voluntarily.

In July 2019, European Commission President Ursula von der Leyen announced plans to develop a new Digital Services Act (DSA), an update to the e-Commerce Directive. The Internal Market and Consumer Protection (IMCO), Civil Liberties, Justice and Home Affairs (LIBE), and Legal Affairs (JURI) Committees have published their draft reports on the DSA. While staffers in Brussels continue hashing out details of this revised regulation, the Commission is expected to present its comprehensive DSA legislative package in late 2020 or early 2021. The DSA is widely expected to include a comprehensive overhaul of liability regimes. However, a June 18 decision by France’s Constitutional Council striking down most of the country’s new hate speech law could throw a wrench into the Commission’s plans.

Brussels in the Lead?

Since Section 230 came into force more than two decades ago, internet users and policymakers have realized that digital platforms need some level of accountability. As Europe prepares to announce substantial changes to liability regimes, the US needs to make its own fundamental changes to digital governance. If policymakers fail to rethink internet policy, they risk creating a vacuum that private entities will fill, further concentrating power in their hands and ultimately diminishing the role of government in our digital world.

After Zuckerberg’s 2018 congressional testimony exposed lawmakers’ lack of understanding of tech platforms, Silicon Valley toasted him with champagne.[iii] Next time, Capitol Hill and Zuckerberg should be better prepared.

[i] Sarah Frier, Instagram (New York: Simon & Schuster), p. 195.

[ii] Ibid., p. 195.

[iii] Ibid., p. 259.

Emily Benson

Manager, Transatlantic Legislative Relations
Bertelsmann Foundation

emily.benson@bfna.org