Facebook founder Mark Zuckerberg has created a private “Supreme Court,” or so he says. Since 2021, his company’s Oversight Board has issued verdicts on a smattering of Facebook’s decisions about online speech. Cynics frame the Board as a Potemkin village, but defenders invoke analogies to separation of powers to claim that this new body empowers the public and restrains the company. Some are even calling for a single “platform supreme court” to rule over the entire industry.

Juridical discourse for platforms is powerful, but it can also be deceptive. This Response explores how juridical discourse has legitimized and empowered Facebook’s Board, building on Professor Evelyn Douek’s critique of how a “stylized” picture likens content moderation to judicial review. While Douek focuses on how scholars and lawmakers preach this misleading picture, I expose how platforms drive juridical discourse for their own gain. This deeper understanding of platform complicity is key. Without it, we’ll struggle to comprehend or contest the illusory picture of content moderation favored by platforms. With it, we might more effectively resist platforms’ attempts to thwart regulation that would better serve the public’s interests.

Introduction

Love it or hate it, Facebook’s fledgling Oversight Board is poised to usher in a new era of content moderation for online platforms.1 The Board, launched two years ago to a mix of fanfare and disdain, will review a smidgen of the company’s decisions about what may be shared on Facebook and Instagram. Discourse about the Board from both Facebook insiders and outsiders frequently invokes traditional governance concepts of separation of powers and judicial independence.2 Indeed, as Professor Evelyn Douek writes in her recent article, the Board “exemplifies” how standard accounts of content moderation can lead to “judicial review–style solutions — and platforms’ encouragement of this framing.”3 In this analogy, platforms have acted as legislatures by making rules, as executives by enforcing them, and as judiciaries by resolving ensuing disputes. But external oversight via the new Board purportedly devolves part of that adjudicatory power to a new “quasi-judicial” external entity intended to provide process, transparency, and impartiality.4 Facebook’s embrace of judicial analogies is no accident. The company’s Chief Executive Officer, Mark Zuckerberg, has compared Facebook’s overseers to a “Supreme Court,”5 and the Board’s “Charter” outlines how its “cases” will have “precedential value.”6 People inside and outside Facebook use juridical discourse to claim that the platform is ceding power through external oversight.7

In the Board’s shadow, momentum is also growing to create a cross-platform body to oversee other companies, not just Facebook.8 There’s a surprising discursive harmony between platform insiders and outsiders in advancing such proposals. Scholars and activists propose that a “platform supreme court”9 or “Social Media Council”10 could supervise the entire industry, rule on inter-platform controversies, and establish common procedures and standards. Legislators, meanwhile, are toying with laws to encourage or mandate centralized oversight and governance through external bodies and uniform standards. Facebook’s management has enthusiastically endorsed this trend, with Zuckerberg and his team openly hoping the Board will expand to “include more companies across the industry”11 and provide uniformity as “an industry-wide body.”12

In Content Moderation as Systems Thinking, Douek challenges a “stylized” or “standard” picture of content moderation that props up these recent trends in platform governance.13 Though she addresses the broader landscape of content moderation,14 her article’s insights are essential to understanding the genesis and trajectory of Facebook’s Board.

Douek’s central claim is that a “misleading and incomplete” picture has dominated regulatory and academic discussion of platform governance.15 This stylized picture depicts content moderation as a “rough online analog of offline judicial adjudication of speech rights”16 whereby each platform applies “legislative-style rules drafted by platform policymakers to individual cases and hears appeals from those decisions.”17 As a result of this narrative, much discussion about platform governance obsesses over “paradigm cases involving ‘a platform’s review of user-generated content posted on its site and the corresponding decision to keep it up or take it down.’”18 Moreover, the “range of remedies is limited: the original decision is affirmed or reversed.”19

As Douek shows, this stylized picture misleadingly evokes a “day-in-court ideal” of content moderation by suggesting that “individual utterances get carefully measured against lofty speech rules and principles.”20 The reality is quite different — and more complex — in part because the “scale and speed” of online speech means that content moderation goes far beyond the aggregation of many “individual adjudications.”21 The stylized picture ignores these dynamics. Instead, it “invokes analogies to the practice of offline constitutional law,” such that key questions of content moderation “resemble those raised in First Amendment cases”22 and can be answered by developing “a body of precedent”23 and constructing “governance systems similar to the offline justice system.”24

The stylized picture isn’t just abstract theory. Lawmakers, Douek explains, have channeled this picture, seeking to hold platforms accountable and correct errors by mandating “individual ex post review,”25 an “appeal” to an “independent” arbiter,26 “reasons” for adverse decisions,27 and “ever more due process rights.”28 Some platforms have touted their own voluntary efforts to give users these “rule-of-law” goodies, reaching for the stylized narrative to pat themselves on the back.29 These trends carry prospective risks, as lawmakers seem poised to “overlook[] many of the most important forms of platform decisionmaking” and “lock[] in a form of oversight that is limited in its ambition.”30 In short, Douek concludes, the stylized picture is likely to produce “accountability theater rather than actual accountability.”31

But who is responsible for this juridical discourse around content moderation? And why paint such a misleading and incomplete picture?

This Response sheds light on these questions and builds on Douek’s observations. But while she frames her critique as a story of scholars leading lawmakers astray, I center the role of platforms in cultivating the stylized narrative that dominates popular, academic, and legislative debates.32 To do so, I use the example of Facebook’s Oversight Board — in some ways, the embodiment of platforms’ juridical discourse because the Board’s creators justified its role through a theory of separation of powers and the image of a supreme court.33 Excavating the history behind the Board illuminates how scholars, lawmakers, and platforms shape discourse in this space.34

Like Douek’s tale, mine is a cautionary one. Based partly on fieldwork I conducted as the Board took shape, I reveal how and why key figures at Facebook and the Board exploited legal analogies when portraying this novel institution and justifying its potential expansion to oversee other platforms.35 Though my main focus is on Facebook and its Board, my research offers insights applicable across platforms, especially dominant incumbents that exert the greatest power. Without acknowledging how companies are complicit in painting the stylized picture of content moderation, we’ll struggle to confront the “stickiness” of deceptive narratives that shape what Douek calls the “first wave” of platform-governance discourse.36

Debates about platform governance are evolving in legislative chambers, public discussions, and company boardrooms. At this critical juncture, we should scrutinize how platforms entrench their power and the discourses that influence decisionmakers.37 While external oversight could play some role, we should be skeptical of claims that a body like Facebook’s Board will meaningfully enhance users’ participation in governance or restrain a platform’s discretion.38 The Board won’t accomplish either goal, and expansion across the industry can’t fix its defects.39

Following this account, I briefly sketch an idea of platform federalism to assist a “second wave” of discourse and regulatory efforts. Federal systems can promote liberty, innovation, competition, pluralism, and expertise — values important in any scheme of platform governance. Lessons from traditional federalism could guide us in regulating platform power and fostering healthier digital environments, whether through law, policy, or technology.40

This Response proceeds in two Parts. Part I explores the Board’s past and its possible futures. Part II casts a critical eye over the juridical discourse that has legitimized and empowered the Board and surveys Facebook’s motives for adopting this discourse. The conclusion suggests that values and tools associated with federalism, rather than separation of powers, offer better guidance for platform governance.

* Assistant Professor, University of Georgia School of Law; Affiliate Faculty, University of Georgia Institute for Women’s Studies and Grady College of Journalism and Mass Communications; Affiliate Researcher, Clinic to End Tech Abuse at Cornell University. I’m especially grateful for generative conversations with Zohra Ahmed, Enrique Armijo, Jack Balkin, Hannah Bloch-Wehba, Kiel Brennan-Marquez, Rebecca Crootof, Evelyn Douek, Robert Gorwa, Daniel Halberstam, Erin Miller, Joe Miller, Ngozi Okidegbe, Laura Phillips Sawyer, Natália Pires de Vasconcelos, Robert Post, Logan Sawyer, Moritz Schramm, Gil Seinfeld, Alicia Solow-Niederman, Christian Turner, and Carly Zubrzycki, as well as feedback I received on earlier versions of this piece at the Freedom of Expression Scholars Conference, Junior Tech Law Scholars Workshop, Media Law and Policy Scholars Conference, Stanford Law School, University of Arkansas School of Law, University of Oxford, Wake Forest Law School, and Yale Law School. Ailen Data provided terrific research assistance. My thanks also to the superb Harvard Law Review editors, whose insights and care substantially improved this piece. My work here draws on several years of fieldwork, including reviews of internal documents, analyses of public statements, and interviews with activists, academics, members and staff of Facebook’s Oversight Board, and current and former platform employees. I was fortunate to conduct this research around the world — in Berlin, Buenos Aires, London, Menlo Park, New Haven, Oxford, Rio de Janeiro, Salt Lake City, São Paulo, and Washington, D.C. — thanks to generous support from the Andrew W. Mellon Foundation and the Yale MacMillan Center for International and Area Studies.

Footnotes
  1. ^ See Evelyn Douek, Content Moderation as Systems Thinking, 136 Harv. L. Rev. 526, 526 n.2 (2022) (defining content moderation as “platforms’ systems and rules that determine how they treat user-generated content on their services”); Sarah T. Roberts, Behind the Screen: Content Moderation in the Shadows of Social Media 33–51 (2019) (providing a seminal account of “commercial content moderation”); James Grimmelmann, The Virtues of Moderation, 17 Yale J.L. & Tech. 42, 47 (2015) (defining moderation as “the governance mechanisms that structure participation in a community to facilitate cooperation and prevent abuse”).

  2. ^ See generally Josh Cowls, Philipp Darius, Dominiquo Santistevan & Moritz Schramm, Constitutional Metaphors: Facebook’s “Supreme Court” and the Legitimation of Platform Governance, New Media & Soc’y (2022), https://doi.org/10.1177/14614448221085559 [https://perma.cc/676A-W3YA].

  3. ^ Douek, supra note 1, at 567.

  4. ^ See Kate Klonick, Feature, The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression, 129 Yale L.J. 2418, 2457–87, 2499 (2020) (describing the Board’s structure and claiming that Facebook has “voluntarily divest[ed] itself of part of its power in order to create an independent oversight body,” id. at 2499).

  5. ^ Ezra Klein, Mark Zuckerberg on Facebook’s Hardest Year, And What Comes Next, Vox (Apr. 2, 2018, 6:00 AM), https://www.vox.com/2018/4/2/17185052/mark-zuckerberg-facebook-interview-fake-news-bots-cambridge [https://perma.cc/57FE-TAQH].

  6. ^ Oversight Bd., Oversight Board Charter (2019), https://oversightboard.com/governance [https://perma.cc/J4D6-AE5Z].

  7. ^ In framing these developments in terms of discourse, I draw inspiration from sociolegal scholarship by Professor Ari Ezra Waldman, who in turn builds on Michel Foucault’s work. See Ari Ezra Waldman, Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power 4, 47, 272 n.6 (2021). Waldman recounts how platforms use the “power of discourse” to “influence how we think about privacy not just to erode our interest in and capacity to enact robust privacy laws, but to entrench corporate-friendly ideas as common sense and mainstream among their workers.” Id. at 6. This Response builds on Waldman’s insights, mainly focusing on platform discourses about expression rather than about privacy. See id. at 46 (observing that there are “numerous discourses at play in informational capitalism”).

  8. ^ See Evelyn Douek, The Limits of International Law in Content Moderation, 6 U.C. Irvine J. Int’l Transnat’l & Comp. L. 37, 73 (2021) (canvassing various proposals for “independent or quasi-independent institutions” to oversee platforms).

  9. ^ See Rory Van Loo, Federal Rules of Platform Procedure, 88 U. Chi. L. Rev. 829, 873–75 (2021) (suggesting, with wise caveats, that “a central appeals court — a platform supreme court — is worth considering at the very least to provide effective remedies,” id. at 874).

  10. ^ See, e.g., ARTICLE 19, The Social Media Councils: Consultation Paper (2019), https://www.article19.org/wp-content/uploads/2019/06/A19-SMC-Consultation-paper-2019-v05.pdf [https://perma.cc/38QS-PLTA]; David Kaye, Speech Police: The Global Struggle to Govern the Internet 122 (2019).

  11. ^ Mark Zuckerberg, Facebook’s Commitment to the Oversight Board, Facebook (Sept. 17, 2019), https://about.fb.com/wp-content/uploads/2019/09/letter-from-mark-zuckerberg-on-oversight-board-charter.pdf [https://perma.cc/2P5R-RW3S].

  12. ^ New America, The Future of Free Expression Online in America, YouTube, at 38:39 (July 18, 2019), https://www.youtube.com/watch?v=fTIOQsZJ-M0 [https://perma.cc/VDG5-9ZA6].

  13. ^ Douek, supra note 1, at 528; see generally Robert Gorwa, What Is Platform Governance?, 22 Info., Commc’n & Soc’y 854, 855 (2019) (defining platform governance as “the layers of governance relationships structuring interactions between key parties in today’s platform society”).

  14. ^ See, e.g., Douek, supra note 1, at 535–64.

  15. ^ Id. at 528; see also Ari Waldman, Shifting the Content Moderation Paradigm, JOTWELL (Mar. 1, 2022), https://cyber.jotwell.com/shifting-the-content-moderation-paradigm [https://perma.cc/T3TW-XQGA] (critiquing the “standard picture” that likens content moderation to “an old Roman emperor whose thumbs up or thumbs down decides the fate of a gladiator: some all-powerful person or all-powerful thing is deciding whether a post stays up or comes down”).

  16. ^ Douek, supra note 1, at 528.

  17. ^ Id. at 535.

  18. ^ Id. at 535–36 (quoting Klonick, supra note 4, at 2427).

  19. ^ Id. at 537.

  20. ^ Id. at 538 (citation omitted); see, e.g., Kate Klonick, Inside the Making of Facebook’s Supreme Court, New Yorker (Feb. 12, 2021), https://www.newyorker.com/tech/annals-of-technology/inside-the-making-of-facebooks-supreme-court [https://perma.cc/S3YP-3HKR] (asserting that Facebook “has developed a set of rules and practices in the ad-hoc manner of common law”).

  21. ^ Douek, supra note 1, at 528.

  22. ^ Id. at 538.

  23. ^ Id. at 556.

  24. ^ Id. at 529.

  25. ^ Id.

  26. ^ Id. at 565.

  27. ^ Id. at 566.

  28. ^ Id. at 528; see also Waldman, supra note 15 (observing that the stylized picture of content moderation as mainly involving “ex post judicialish review” of one-off platform decisions creates the misimpression that the best reforms should rely on “procedural due processish protections”).

  29. ^ See Douek, supra note 1, at 553. In a related move, a Facebook-commissioned group of Yale academics has presented recommendations on how to improve “procedural justice” in Facebook’s “appeals” process, including through using “citizens juries” to “evaluate appeals.” Ben Bradford, Florian Grisel, Tracey L. Meares, Emily Owens, Baron L. Pineda, Jacob N. Shapiro, Tom R. Tyler & Danieli Evans Peterman, Just. Collaboratory, Yale L. Sch., Report of the Facebook Data Transparency Advisory Group 44 (2019), https://law.yale.edu/sites/default/files/area/center/justice/document/dtag_report_5.22.2019.pdf [https://perma.cc/8T43-HYWX]; see also id. at 14–15, 36–39 (expanding upon these recommendations).

  30. ^ Douek, supra note 1, at 548.

  31. ^ Id. at 528.

  32. ^ For example, Douek asserts that a “wealth of early and current academic, civil society and public discourse,” id. at 535, invokes the stylized picture, which gives lawmakers “an inaccurate understanding” of moderation, id. at 533. Douek isn’t naïve to platforms’ complicity, but her account often casts platforms as grateful bystanders or beneficiaries of these narratives without comprehensively interrogating the companies’ roles in fostering them. See, e.g., id. at 535–64.

  33. ^ See The Joe Rogan Experience, #1863 — Mark Zuckerberg, Spotify, at 1:46:07 (Aug. 25, 2022), https://open.spotify.com/episode/51gxrAActH18RGhKNza598 [https://perma.cc/J5GT-DCCP] (featuring Zuckerberg praising the Board as a kind of “separation of powers” because that form of governance is “one of the things that our country and our government gets right”).

  34. ^ See Chinmayi Arun, Facebook’s Faces, 135 Harv. L. Rev. F. 236, 236 (2022) (reminding us that Facebook “has many faces — different teams working towards different goals, and engaging with different ministries, institutions, scholars, and civil society organizations”).

  35. ^ Interview quotes in this Response are from conversations that weren’t subject to nondisclosure agreements, nor was anything in this Response restrained by any embargo. Before I conducted interviews at Facebook’s headquarters, a company representative told me that, although I wasn’t required to sign a nondisclosure agreement, I couldn’t disclose any quotes without providing a draft and receiving the company’s permission. This kind of preclearance agreement is a nondisclosure agreement. Although Facebook’s representative told me other academics had accepted these terms, I declined to use quotes from those interviews to ensure academic integrity. See Waldman, supra note 7, at 6, 90–93 (condemning Facebook’s relationships with “friendly academics” who let the company review and preclear their work); Thomas E. Kadri, Digital Gatekeepers, 99 Tex. L. Rev. 951, 982 n.187 (2021) (criticizing Facebook’s “ask Facebook first” policy under which some academics give the company “review, revision, and veto powers over their work”). Since December 2020, I’ve participated in a group convened by Meta (Facebook’s parent company) to provide expertise on addressing online abuse. Participants receive annual honoraria of $6000 to attend roundtable discussions that are subject to nondisclosure agreements covering nonpublic and confidential information. I donated the first honorarium to the Equal Justice Initiative and accepted the second after concluding that it was appropriate, even advisable, for a for-profit company to compensate us for our labor, time, and expertise. My decision was influenced by the fact that honoraria aren’t associated with any academic research, but rather with discussions involving other scholars. My initial experiences also reassured me that I could use this forum to help abuse victims by scrutinizing Meta’s policies without jeopardizing my integrity. Finally, Meta’s decision not to publicize the discussions suggested that the company wasn’t exploiting us to gain positive publicity or legitimacy.

  36. ^ See Douek, supra note 1, at 534.

  37. ^ See generally Julie E. Cohen, Between Truth and Power: The Legal Constructions of Informational Capitalism (2019) (exploring how legal and technical discourses combine to advance platforms’ power); Thomas E. Kadri, Platforms as Blackacres, 68 UCLA L. Rev. 1184 (2022) (critiquing how cyber-trespass law gives platforms broad decision-making power to limit access to their services).

  38. ^ See, e.g., Klonick, supra note 4, at 2418 (proclaiming that the Board “has great potential to set new precedent for user participation in private platforms’ governance and a user right to procedure in content moderation”).

  39. ^ See infra sections II.B–D, pp. 187–97.

  40. ^ See generally Thomas E. Kadri, Networks of Empathy, 2020 Utah L. Rev. 1075 (exploring how digital abuse might be addressed through both legal and extralegal regulation).
