Recent Proposed Legislation, 137 Harv. L. Rev. 2104

Platform Accountability and Transparency Act, S. 1876, 118th Cong. (2023)

U.S. Senate Introduces Mandatory Reporting and Disclosure Law for Social Media Platforms.


“Sunlight is said to be the best of disinfectants”1 — or so the adage goes. At the time, Justice Brandeis’s words described a framework for limiting the monopoly power of investment banks and wealth trusts through compelled disclosures.2 Transparency, he reasoned, “will aid the investor in judging of the safety of the investment” by reducing information asymmetries in the marketplace.3 A century later, the maxim seems to have found its way into social media–regulation circles, with think tanks and regulators calling for transparency in how these companies design their algorithmically curated environments.4 Those advocating for regulation in the space argue that these companies’ ability to control “troves” of sensitive private data,5 and their inability to regulate dangerous speech,6 demand government oversight.7 In 2023, Senator Chris Coons sought to answer such calls with the (re)introduction of the Platform Accountability and Transparency Act8 (PATA) — a law that would grant academics and researchers broad access to the internal datasets of the social media platforms covered by the bill.9 However, while calls for regulation may be warranted, Congress should be mindful of how it answers. As it stands, PATA likely suffers from constitutional infirmities that raise the specter of government censorship. Instead, lawmakers should create public-private partnerships with platform companies that focus on promoting self-regulation and industry-wide standards for user safety, transparency, and accountability.

Transparency is the buzzword of the day in social media–regulation circles.10 And rightfully so. Like the wealth trusts of Justice Brandeis’s day, platform companies play the role of gatekeepers11 in the digital public square,12 yet we know little about their black-box operations.13 Platforms like TikTok and Instagram “offer immeasurable opportunities to connect public leaders with constituents, businesses with consumers, and communities across the globe.”14 Yet they have been at the center of very public catastrophes, including a genocide in Myanmar,15 a terrorist attack in Christchurch,16 and a riot at the U.S. Capitol on January 6.17 They evade liability,18 thanks, in part, to immunity statutes like Section 230 of the Communications Decency Act of 199619 and the difficulty in drawing a connection between the platforms’ design choices20 and the real-world harm they allegedly create.21 Algorithmic transparency laws — regulations that require the “disclosure of information about algorithms to enable monitoring, checking, criticism, or intervention by interested parties”22 — have been proposed as a way to begin drawing those connections, allowing regulators and researchers to more fully understand how the platforms rank and amplify certain content.23 PATA was born out of this backdrop.

PATA was first introduced during the 117th Congress in December 2022 by U.S. Senator Chris Coons24 and was billed as a “multipronged” approach to “create[] new mechanisms to increase transparency around social media companies’ internal data.”25 It required that the Federal Trade Commission (FTC) and the National Science Foundation (NSF) promulgate standards to ensure that qualified researchers26 could develop qualified research projects27 and gain access to on-platform information in coordination with social media companies.28 The bill languished at the end of the 117th Congress29 but was reintroduced in the 118th.30 Now, the bill is awaiting review by the Senate Committee on Commerce, Science, and Transportation.31

PATA has three important provisions. First, the bill mandates a list of dataset disclosures that the platforms must make available to the public on an ongoing basis.32 Specifically, this includes data on “[h]ighly disseminated content,”33 the platform’s ranking and design choices,34 and its content moderation practices.35 The goal is for these disclosures to give regulators and researchers a way of knowing, for example, the potential causes, “prevalence[,] and size of the problem of hate speech, disinformation, incitement, child endangerment, and the like”36 — information whose absence hampers attempts to identify how or why specific types of content appear in someone’s newsfeed.37
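To make these disclosure categories concrete for nonspecialist readers, the following is a minimal, purely hypothetical sketch of what a single record in such a public dataset might contain. Nothing here is drawn from the bill’s text; every field name is an illustrative assumption, not statutory language:

```python
# Purely hypothetical sketch: PATA does not prescribe a data format, and each
# field below is invented to mirror the categories the bill describes (highly
# disseminated content, ranking and design choices, moderation practices).
from dataclasses import dataclass, field


@dataclass
class DisseminationRecord:
    content_id: str                         # opaque identifier for a post or video
    times_viewed: int                       # how widely the content was disseminated
    was_algorithmically_recommended: bool   # whether a recommender system amplified it
    supplier_account: str                   # who supplied the content
    moderation_actions: list[str] = field(default_factory=list)  # e.g., labels, removals


# An ongoing public disclosure might then be a periodic release of such records:
sample = DisseminationRecord(
    content_id="abc123",
    times_viewed=4_200_000,
    was_algorithmically_recommended=True,
    supplier_account="user-789",
    moderation_actions=["fact-check label"],
)
print(sample)
```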

Next, PATA provides researchers with an opportunity to access internal datasets for research projects approved by the NSF.38 Prior to the social media age, social scientists were able to freely use public data related to “government statistics, survey data, or other kinds of data” to observe and report on social phenomena as they were happening, but “[n]ow, most of the data, which is relevant to contemporary social problems, is locked up” in the platforms.39 PATA requires social media platforms to make that data available upon request to public interest–focused qualified researchers.40 And if they refuse, PATA provides judicial review of the platforms’ noncompliance.41

Finally, PATA provides safe harbor protections to both researchers and the platforms when data transfers occur.42 For researchers, “[n]o civil claim will lie, nor will any criminal liability accrue . . . for collecting covered information as part of a news-gathering or research project on a platform, so long as,” among other requirements, that research is in the public interest, it follows the privacy and security standards promulgated by the FTC, and it does not “materially burden the [platform’s] technical operation.”43 For platform companies, no “cause of action . . . arising solely from the release of qualified data . . . in furtherance of a qualified research project may be brought against any platform that complies with [PATA].”44

PATA has received mixed reactions. Scholars like Professor Nathaniel Persily welcome PATA,45 explaining that “[i]f you force the platforms to open themselves to outside review, it will change their behavior[;] . . . [t]hey will know they’re being watched.”46 Professor Daphne Keller has been more cautious, explaining that while she is a cheerleader for platform transparency, “in practice [it] is complicated and messy” and could lead to a reduction in “people’s legal protections from state surveillance.”47 Jim Harper, a Senior Fellow at the American Enterprise Institute, argues that “[a]n unconstrained disclosure mandate may be unconstitutional,” and could make content “moderation more difficult” or “degrade the experience for platforms’ users.”48

Though many praise PATA as a welcome legislative intervention from a historically ambivalent Congress, its constitutional implications raise some concerns. PATA’s arrival is part of a pattern of laws seeking to regulate platform companies by mandating political advertisement disclosures,49 policies about specific viewpoints like hate speech,50 and individualized notice and appeals processes accompanying their content moderation decisions.51 While each of these laws has arguably advanced compelling governmental interests, some courts have ruled that they likely either compel or impermissibly burden speech.52 So, instead of legislation granting blanket transparency into platforms’ editorial practices, Congress should facilitate opportunities for public-private partnerships that enable the companies to develop self-regulated, industry-wide standards that promote user safety, transparency, and accountability.

States entered the great transparency debate well before PATA’s introduction. In 2018, Maryland passed the Online Electioneering Transparency and Accountability Act53 (OETA) to identify the source of political advertisements in response to Russia’s social media disinformation campaigns during the 2016 election.54 Washington passed a similar measure.55 Soon after, New York and California passed legislation that required platforms to document and disclose their content moderation policies and enforcement actions to combat the spread of hate speech or misinformation.56 Florida and Texas also joined the conversation, passing laws requiring platforms to publish detailed explanations about their content moderation rules.57 And Texas’s law further requires platforms to provide rights of appeal for those content moderation decisions and statistics on their content moderation practices (for example, the content area, the type of review performed, and appeal rates).58

However, many of these laws have faced constitutional scrutiny. For example, several of the laws have been challenged on the theory that they impermissibly compel speech. In Washington Post v. McManus,59 the Fourth Circuit held that Maryland’s OETA was likely unconstitutional because its disclosure and inspection requirements both compelled speech and singled out political speech.60 Platforms had to create searchable advertisement libraries on their websites with specific data about the advertisement purchaser, and had to make that data available upon request to the government, “when they otherwise would have refrained.”61 Similarly, in Volokh v. James,62 a federal district court found that plaintiffs were likely to succeed in showing that New York’s Hateful Conduct Law,63 while well-intentioned, was an unconstitutional speech compulsion because it required social media platforms to devise a hate speech policy consistent with New York’s statute, publish that policy on their websites, and create a mechanism to report such content.64 Thus, “at a minimum,” the law “compel[led] Plaintiffs to speak about ‘hateful conduct’”65 and “‘depriv[ed them] of their right to communicate freely on matters of public concern’ without state coercion.”66

Even more troubling is that these same laws raise the specter of Big Brother and could create a coercive effect on platforms’ regulation of internet users’ speech.67 The McManus court reasoned that OETA’s inspection requirement, in particular, places the government in “an unhealthy entanglement with”68 platform companies because “it lacks any readily discernible limits on the ability of government to supervise” platform companies’ editorial judgments.69 Under such a regime, OETA could allow the government to “chill speech” in a manner “the Supreme Court would not countenance.”70 The same was true in Volokh, where Judge Carter recognized that New York’s Hateful Conduct Law “fundamentally implicates the speech of the [social media] networks’ users” and could easily “make social media users wary about the types of speech they feel free to engage in”71 as well as make the platform “less appealing to users who intentionally seek out spaces where they feel like they can express themselves freely.”72

Elements of PATA have the potential to raise similar First Amendment concerns. First, the bill puts forward many requirements similar to Maryland’s OETA. For example, PATA mandates an easily navigable database that hosts disclosures about the content of all advertisements on the platform, who paid for the advertisement, the intended audience, and the advertisement’s reach.73 But it also goes further. It mandates that the FTC promulgate regulations requiring the disclosure of “all consumer-facing product features that made use of recommender or ranking algorithms”;74 “signals used as inputs to the described recommender or ranking algorithms, including an explanation of which rely on user data”;75 data on highly disseminated content;76 “information about the extent to which . . . content was recommended”;77 who supplied the content;78 and much more.79 And while PATA is unlike OETA in that its disclosure requirements are seemingly content neutral80 and thus potentially deserving of a lower tier of scrutiny,81 in practice, “no law [should] subject[] the editorial process to private or official examination merely to satisfy curiosity or to serve some general end such as the public interest.”82 PATA grants regulators authority to govern what Daphne Keller calls “speech about speech”83 — that is, even though the law may seek “purely factual and uncontroversial information”84 about the platforms’ operations, those “operations” are inherently editorial practices.85 The First Amendment counsels against forcing the platforms to express words that they may not have shared of their own volition.86
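For readers unfamiliar with the terminology, a toy sketch may help show what “signals used as inputs” to a recommender or ranking algorithm means in practice. The signals and weights below are invented for illustration; they describe no actual platform and do not appear in the bill:

```python
# Toy ranking function, invented for illustration; real recommender systems
# are far more complex. The point is only that "signals" are measurable
# inputs, some derived from user data, that such a disclosure would name.

def rank_score(predicted_watch_time: float,
               like_rate: float,
               follows_creator: bool) -> float:
    """Combine hypothetical signals into a single ranking score."""
    score = 0.6 * predicted_watch_time + 0.3 * like_rate
    if follows_creator:  # a signal that relies on user data
        score += 0.1
    return score


# A disclosure of the kind described above might state that the feed uses
# such an algorithm and identify which of its signals rely on user data.
print(rank_score(predicted_watch_time=0.8, like_rate=0.5, follows_creator=True))
```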

Second, while some may argue that PATA will change platform behavior for the better,87 the government’s potential for impermissible oversight is cause for concern. Instead of granting broad access privileges to the government, PATA places the government in a seemingly neutral role and effectively deputizes academic researchers as inspectors with access to the platforms’ editorial processes.88 This, however, is problematic, because under PATA the government retains effective control over the parameters of access to the platforms, evoking what the McManus court identified as an “unhealthy entanglement”89 with the platform’s operations. The government defines who a researcher is.90 The government defines what a research project is.91 The government bars judicial review “regarding whether a research application will be deemed a qualified research project.”92 And though the government restrains itself from seeking access to “qualified data and information” that has been provided to “a qualified researcher,”93 and qualified research projects must meet a high standard,94 nothing prevents researchers from voluntarily providing that data to the government.

Under PATA, the government can effectively fund and sanction a politically friendly media operation’s qualified research project into a company’s operational practices, thereby bypassing the First Amendment and sidestepping judicial review.95 This could give rise to a host of issues where, based on information derived from these qualified research projects, state officials use their police power to unconstitutionally coerce96 platform companies to remove certain speech, or certain users, to satisfy political ends.97 If PATA’s mandatory disclosures are a statutory front door into a social media platform’s editorial processes, its broad access requirement, under the aegis of the public interest, is an even more concerning backdoor.98

Given some of the uncertainty around PATA and the significant questions its passage would raise, Congress should seek less constitutionally intrusive avenues. One route might be to encourage public-private partnerships for the development of industry-wide standards that promote user safety — a concept that scholars like Newton Minow and Professor Martha Minow have expressed some support for.99 In a recent paper, they explain that “[t]hrough voluntary self-regulation . . . private industry-level organizations create rules and standards with which individual industry actors voluntarily comply.”100

Successful examples of public-private regulatory efforts abound. The financial industry uses a third-party organization to “promote transparency and compliance with ethical standards devised through its own rulemaking process” in coordination with the Securities and Exchange Commission.101 The FTC has also facilitated public-private self-regulation efforts for marketing in the alcohol industry102 and coordinated with the movie, gaming, and music industries to align their ratings systems on definitions “for movies of G, PG, PG-13, and R; [as well as] the label of ‘Mature’ rating for games; and the label of ‘Explicit’ for music.”103 Platform companies have taken on self-regulatory efforts in other parts of the world, with companies like Meta and TikTok voluntarily signing on to the European Commission’s Code of Practice on Disinformation.104 The Minows’ pragmatic approach urges lawmakers and platforms to pursue collaborative self-regulation, because even though it “is likely to advance the interests of the companies and benefit incumbents over new entrants, . . . it also can draw on the knowledge, resources, and flexibility of the private companies”105 in a way similar to the benefits gained from collaborations with the alcohol and entertainment industries.106 Together, these frameworks, along with the rise of independent, third-party organizations with expertise in the space and a commitment to tech accountability,107 can chart a more collaborative path forward to solving the challenges raised by social media regulation’s status quo.

If sunlight is the best disinfectant, PATA’s “electric light [may be] the most efficient policeman.”108 The law’s promise to provide academic researchers with transparency into the algorithmically curated environments that social media platforms have built illuminates a path toward tech accountability.109 And given the impact these companies have on our day-to-day lives, as well as the fact that they are implicated in public controversies like mass shootings, eating disorders, suicides, and countless other social ailments, it is clear that these companies cannot and should not regulate themselves without oversight. But the cure cannot be worse than the disease. And while sunlight might be the best disinfectant, when the government shines that light on constitutional rights, that effort should be met with deep skepticism. Broad access to the platforms’ data could easily lead to chilling effects under the government’s (supervised) watch. Congress should focus its legislative power on encouraging self-regulated, industry-wide standards that promote user safety, transparency, and accountability.

Footnotes
  1. ^ Louis D. Brandeis, What Publicity Can Do, Harper’s Wkly., Dec. 20, 1913, at 10, 10, https://www.sechistorical.org/collection/papers/1910/1913_12_20_What_Publicity_Ca.pdf [https://perma.cc/2DJD-A368].

  2. ^ See id. at 12.

  3. ^ Id.

  4. ^ See, e.g., Mark MacCarthy, Transparency Is Essential for Effective Social Media Regulation, Brookings Inst. (Nov. 1, 2022), https://www.brookings.edu/articles/transparency-is-essential-for-effective-social-media-regulation [https://perma.cc/2YAB-B8PS].

  5. ^ Daphne Keller, Privacy, Middleware, and Interoperability: Can Technical Solutions, Including Blockchain, Help Us Avoid Hard Tradeoffs?, Ctr. for Internet & Soc’y (Aug. 23, 2021, 7:01 AM), https://cyberlaw.stanford.edu/blog/2021/08/privacy-middleware-and-interoperability-can-technical-solutions-including-blockchain-0 [https://perma.cc/Q85W-Z69S].

  6. ^ Paige Collings & Jillian C. York, Social Media Platforms Must Do Better When Handling Misinformation, Especially During Moments of Conflict, Elec. Frontier Found. (Oct. 17, 2023), https://www.eff.org/deeplinks/2023/10/social-media-platforms-must-do-better-when-handling-misinformation-especially [https://perma.cc/L439-XPDU].

  7. ^ See, e.g., Social Media Transparency: California Bill AB 587, Anti-Defamation League (May 23, 2023), https://www.adl.org/resources/tools-and-strategies/social-media-transparency-ca-ab-587 [https://perma.cc/ZJ3D-A7L5].

  8. ^ S. 1876, 118th Cong. (2023).

  9. ^ Press Release, Sen. Chris Coons, Senator Coons, Colleagues Introduce Legislation to Increase Transparency Around Social Media Platforms (June 8, 2023), https://www.coons.senate.gov/news/press-releases/senator-coons-colleagues-introduce-legislation-to-increase-transparency-around-social-media-platforms [https://perma.cc/U45H-72ND].

  10. ^ See, e.g., MacCarthy, supra note 4.

  11. ^ See, e.g., Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598, 1635, 1638, 1647 (2018).

  12. ^ See Packingham v. North Carolina, 137 S. Ct. 1730, 1737 (2017).

  13. ^ Renée DiResta et al., It’s Time to Open the Black Box of Social Media, Sci. Am. (Apr. 28, 2022), https://www.scientificamerican.com/article/its-time-to-open-the-black-box-of-social-media [https://perma.cc/T68K-W32C].

  14. ^ Dylan Moses, Are Product Liability Lawsuits the Way to Hold Tech Companies Accountable?, Berkeley Tech. L.J. Blog (Jan. 8, 2022), https://btlj.org/2022/01/are-product-liability-lawsuits-the-way-to-hold-tech-companies-accountable [https://perma.cc/MBM5-XWH6].

  15. ^ Steve Stecklow, Why Facebook Is Losing the War on Hate Speech in Myanmar, Reuters (Aug. 15, 2018, 3:00 PM), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate [https://perma.cc/MW9V-LD3P].

  16. ^ Amy Gunia, Facebook Tightens Live-Stream Rules in Response to the Christchurch Massacre, TIME (May 15, 2019, 4:19 AM), https://time.com/5589478/facebook-livestream-rules-new-zealand-christchurch-attack [https://perma.cc/98UC-UB6V].

  17. ^ Craig Timberg et al., Inside Facebook, Jan. 6 Violence Fueled Anger, Regret over Missed Warning Signs, Wash. Post (Oct. 22, 2021, 7:36 PM), https://www.washingtonpost.com/technology/2021/10/22/jan-6-capitol-riot-facebook [https://perma.cc/8JT9-9LX8].

  18. ^ See The Supreme Court, 2022 Term — Leading Cases, 137 Harv. L. Rev. 290, 402–03 (2023).

  19. ^ See 47 U.S.C. § 230(c)(1).

  20. ^ See Integrity Inst., Ranking and Design Transparency: Data, Datasets, and Reports to Track Responsible Algorithmic and Platform Design 8–27 (2021), https://static1.squarespace.com/static/614cbb3258c5c87026497577/t/617834ea6ee73c074427e415/1635267819444/Ranking+and+Design+Transparency+%28EXTERNAL%29.pdf [https://perma.cc/C5J4-TAGB].

  21. ^ See, e.g., Twitter, Inc. v. Taamneh, 143 S. Ct. 1206, 1215 (2023).

  22. ^ Nicholas Diakopoulos & Michael Koliska, Algorithmic Transparency in the News Media, 5 Digit. Journalism 809, 811 (2017).

  23. ^ See Evelyn Douek, Content Moderation as Systems Thinking, 136 Harv. L. Rev. 526, 576–77 (2022).

  24. ^ Press Release, Sen. Chris Coons, Senator Coons, Colleagues Introduce Legislation to Provide Public with Transparency of Social Media Platforms (Dec. 21, 2022), https://www.coons.senate.gov/news/press-releases/senator-coons-colleagues-introduce-legislation-to-provide-public-with-transparency-of-social-media-platforms [https://perma.cc/8B3M-ZCPN].

  25. ^ Id.

  26. ^ Platform Accountability and Transparency Act, S. 5339, 117th Cong. § 2(7) (2022).

  27. ^ Id. § 2(8).

  28. ^ Id. §§ 2(5), 3(a).

  29. ^ See S. 5339 — Platform Accountability and Transparency Act, Congress.gov, https://www.congress.gov/bill/117th-congress/senate-bill/5339/actions [https://perma.cc/Y7BB-F98X].

  30. ^ See John Perrino, Platform Accountability and Transparency Act Reintroduced in Senate, Tech Pol’y Press (June 8, 2023), https://techpolicy.press/platform-accountability-and-transparency-act-reintroduced-in-senate [https://perma.cc/8TYW-FNHA].

  31. ^ See S. 1876 — Platform Accountability and Transparency Act, Congress.gov, https://www.congress.gov/bill/118th-congress/senate-bill/1876/actions [https://perma.cc/FNE6-KASQ].

  32. ^ Platform Accountability and Transparency Act, S. 1876, 118th Cong. § 9(b) (2023).

  33. ^ Id. § 9(b)(4).

  34. ^ Id. § 9(d).

  35. ^ Id. § 9(e).

  36. ^ Justin Hendrix, Transcript: Senate Hearing on Platform Transparency, Tech Pol’y Press (May 5, 2022) (statement of Professor Nathaniel Persily), https://www.techpolicy.press/transcript-senate-hearing-on-platform-transparency [https://perma.cc/C9J3-3Q2S].

  37. ^ See id.

  38. ^ Perrino, supra note 30.

  39. ^ Hendrix, supra note 36 (statement of Professor Nathaniel Persily).

  40. ^ S. 1876, 118th Cong. § 4(a) (2023). Of course, researchers and their projects must still meet the rigorous requirements outlined in the bill. Id. §§ 3–4.

  41. ^ Id. § 4(e).

  42. ^ See id. §§ 4(d), 8(a).

  43. ^ Id. § 8(a).

  44. ^ Id. § 4(d).

  45. ^ Hearing on Platform Transparency: Understanding the Impact of Social Media Before the S. Comm. on the Judiciary, Subcomm. on Priv., Tech. & the L., 117th Cong. 6 (2022) [hereinafter Hearing on Platform Transparency] (statement of Professor Nathaniel Persily), https://www.judiciary.senate.gov/imo/media/doc/Persily%20Testimony.pdf [https://perma.cc/L87Y-4B4P].

  46. ^ Ben Brody, Transparency Can Help Fix Social Media — If Anyone Can Define It, Protocol (Oct. 21, 2021), https://web.archive.org/web/20231129121826/https://www.protocol.com/policy/transparency-buzzword [https://perma.cc/R2YP-U7MR].

  47. ^ Hearing on Platform Transparency, supra note 45 (statement of Daphne Keller, Stanford University Cyber Policy Center), https://www.judiciary.senate.gov/imo/media/doc/Keller%20Testimony1.pdf [https://perma.cc/VQ2R-FXKQ].

  48. ^ Hearing on Platform Transparency, supra note 45 (statement of Jim Harper, Nonresident Senior Fellow, American Enterprise Institute), https://www.judiciary.senate.gov/imo/media/doc/Harper%20Testimony.pdf [https://perma.cc/EVU3-BRDR].

  49. ^ See, e.g., Wash. Post v. McManus, 944 F.3d 506, 511 (4th Cir. 2019).

  50. ^ See, e.g., Volokh v. James, 656 F. Supp. 3d 431, 436 (S.D.N.Y. 2023).

  51. ^ See, e.g., NetChoice, L.L.C. v. Paxton, 49 F.4th 439, 446 (5th Cir. 2022), cert. granted in part sub nom. NetChoice, LLC v. Paxton, 144 S. Ct. 477 (2023).

  52. ^ See, e.g., McManus, 944 F.3d at 510; Volokh, 656 F. Supp. 3d at 436.

  53. ^ Md. Code Ann., Elec. Law § 13-405 (LexisNexis 2022).

  54. ^ McManus, 944 F.3d at 510–11.

  55. ^ See Press Release, Wash. State Off. of the Att’y Gen., AG Ferguson Seeks Maximum $24.6M Penalty Against Facebook Parent Meta (Oct. 13, 2022), https://www.atg.wa.gov/news/news-releases/ag-ferguson-seeks-maximum-246m-penalty-against-facebook-parent-meta [https://perma.cc/79Y3-5J5V].

  56. ^ See Volokh, 656 F. Supp. 3d at 437–38; Minds, Inc. v. Bonta, No. 23-cv-02705, 2023 WL 6194312, at *1 (C.D. Cal. Aug. 18, 2023).

  57. ^ Daphne Keller, Platform Transparency and the First Amendment, 4 J. Free Speech L. 1, 11–14 (2023).

  58. ^ Id.

  59. ^ 944 F.3d 506.

  60. ^ Id. at 513–14.

  61. ^ Id. at 514. But see Amended Appellant’s Opening Brief at 25, Washington v. Meta Platforms, Inc., No. 84661-2, 2023 WL 3234248 (Wash. Ct. App. Apr. 17, 2023) (discussing the Superior Court’s rejection of Meta’s First Amendment challenge to Washington’s online political advertisement transparency law).

  62. ^ 656 F. Supp. 3d 431 (S.D.N.Y. 2023).

  63. ^ N.Y. Gen. Bus. Law § 394-ccc (McKinney 2023).

  64. ^ See Volokh, 656 F. Supp. 3d at 441–42.

  65. ^ Id. at 441 (quoting Gen. Bus. Law § 394-ccc(3)).

  66. ^ Id. at 442 (quoting Evergreen Ass’n v. City of New York, 740 F.3d 233, 250 (2d Cir. 2014)).

  67. ^ This might raise collateral censorship issues. Collateral censorship is the notion that the “imposition of liability and compliance costs on private intermediary A incentivizes it to use its power to censor the speech of private speaker B.” See Recent Case, Washington Post v. McManus, 944 F.3d 506 (4th Cir. 2019), 134 Harv. L. Rev. 1575, 1578–79 (2021) (citing J.M. Balkin, Essay, Free Speech and Hostile Environments, 99 Colum. L. Rev. 2295, 2298 & n.14 (1999)).

  68. ^ Wash. Post v. McManus, 944 F.3d 506, 518 (4th Cir. 2019).

  69. ^ Id. at 518–19.

  70. ^ Id. at 519.

  71. ^ Volokh, 656 F. Supp. 3d at 445.

  72. ^ Id. at 446.

  73. ^ Compare, e.g., Platform Accountability and Transparency Act, S. 1876, 118th Cong. § 9(c), with McManus, 944 F.3d at 511–12 (“[P]latforms must display somewhere on their site the identity of the purchaser, the individuals exercising control over the purchaser, and the total amount paid for the ad. [And t]hey must keep that information online for at least a year . . . .”).

  74. ^ S. 1876 § 9(d)(2)(A).

  75. ^ Id. § 9(d)(2)(B).

  76. ^ Id. § 9(b)(1)(A).

  77. ^ Id. § 9(b)(3)(D).

  78. ^ Id. § 9(b)(3)(E).

  79. ^ See generally id. § 9.

  80. ^ Unlike OETA, which focuses on political advertising, PATA requires the disclosure of any advertisement the platforms host. See Ward v. Rock Against Racism, 491 U.S. 781, 791 (1989) (“Government regulation of expressive activity is content neutral so long as it is ‘justified without reference to the content of the regulated speech.’” (quoting Clark v. Cmty. for Creative Non-Violence, 468 U.S. 288, 293–95 (1984))). Compare, e.g., S. 1876, 118th Cong. § 9(c)(1)–(2), with Wash. Post v. McManus, 944 F.3d 506, 513 (4th Cir. 2019).

  81. ^ Maryland argued that OETA should be reviewed under “exacting scrutiny” rather than strict scrutiny, but both the district court and Fourth Circuit rejected that argument. See Wash. Post v. McManus, 355 F. Supp. 3d 272, 302–05 (D. Md.) (explaining why OETA fails even under “exacting scrutiny,” id. at 302), aff’d, 944 F.3d 506 (4th Cir. 2019); McManus, 944 F.3d at 520–23 (similar).

  82. ^ Herbert v. Lando, 441 U.S. 153, 174 (1979); see also McManus, 944 F.3d at 520–23.

  83. ^ See Keller, supra note 57, at 40–41 (“Laws requiring platforms to speak about their editorial policies and explain their decisions regarding particular user posts are fundamentally different from laws requiring labels on meat or sugary beverages. . . . [They] will likely cause platforms to change their editorial policies and decisions about speech . . . .”).

  84. ^ NetChoice, LLC v. Att’y Gen., 34 F.4th 1196, 1227 (11th Cir. 2022) (quoting Zauderer v. Off. of Disciplinary Counsel, 471 U.S. 626, 651 (1985)), cert. granted in part sub nom. Moody v. NetChoice, LLC, 144 S. Ct. 478 (2023), and cert. denied sub nom. NetChoice, LLC v. Moody, 144 S. Ct. 69 (2023).

  85. ^ NetChoice, LLC, 34 F.4th at 1216 (“A social-media platform that ‘exercises editorial discretion in the selection and presentation of’ the content that it disseminates to its users ‘engages in speech activity.’” (quoting Ark. Educ. Television Comm’n v. Forbes, 523 U.S. 666, 674 (1998))). But see X Corp. v. Bonta, No. 23-CV-01939, 2023 WL 8948286, at *1–3 (E.D. Cal. Dec. 28, 2023) (finding that X was not likely to succeed on the merits in showing that California’s AB 587 violated the First Amendment and holding instead that AB 587 is concerned only with commercial speech, like the platform’s terms of service).

  86. ^ See McManus, 944 F.3d at 514–15.

  87. ^ See Hearing on Platform Transparency, supra note 45, at 5–6 (statement of Professor Nathaniel Persily).

  88. ^ See Platform Accountability and Transparency Act, S. 1876, 118th Cong. § 3 (2023).

  89. ^ McManus, 944 F.3d at 518.

  90. ^ S. 1876 § 2(7).

  91. ^ Id. § 2(8).

  92. ^ Id. § 3(e).

  93. ^ Id. § 3(f).

  94. ^ See id. § 3.

  95. ^ Cf. Eric Goldman, The Constitutionality of Mandating Editorial Transparency, 73 Hastings L.J. 1203, 1231 (2022) (explaining that “malefactors” could gain the status of “researcher” in contravention of the statute’s purpose).

  96. ^ See Daphne Keller, Six Things About Jawboning, Knight First Amend. Inst. Blog (Oct. 10, 2023), https://knightcolumbia.org/blog/six-things-about-jawboning [https://perma.cc/UH7E-8ZKT].

  97. ^ See Daphne Keller, State Abuse of Transparency Laws and How to Stop It, Medium: StanfordCPC (June 30, 2023), https://medium.com/@StanfordCPC/state-abuse-of-transparency-laws-and-how-to-stop-it-6cca8add6e10 [https://perma.cc/4JTL-YZ9S]; see also Keller, supra note 57, at 24–30.

  98. ^ The Supreme Court will likely rule on the constitutionality of transparency mandates for social media platforms this Term. See Moody v. NetChoice, LLC, 144 S. Ct. 478 (2023) (mem.) (granting certiorari); NetChoice, LLC v. Paxton, 144 S. Ct. 477 (2023) (mem.) (granting certiorari). However, its most recent ruling in 303 Creative LLC v. Elenis, 143 S. Ct. 2298 (2023), is likely to be a harbinger for the Justices’ decision in the NetChoice cases and should give Congress pause about transparency mandates. See Scott Shackford, What Will 303 Creative Mean for Social Media Regulation?, Reason (July 3, 2023, 2:55 PM), https://reason.com/2023/07/03/what-will-303-creative-mean-for-social-media-regulation [https://perma.cc/ZU5H-RSDP]. Such an expansive reading of the First Amendment may prove to be a significant hurdle for PATA.

  99. ^ Newton Minow & Martha Minow, Response, Social Media Companies Should Pursue Serious Self-Supervision — Soon: Response to Professors Douek and Kadri, 136 Harv. L. Rev. F. 428, 429 (2023).

  100. ^ Id. at 433 (emphasis added).

  101. ^ Id.

  102. ^ Id. at 435.

  103. ^ Id. at 436.

  104. ^ See Signatories of the 2022 Strengthened Code of Practice on Disinformation, Eur. Comm’n (June 16, 2022), https://digital-strategy.ec.europa.eu/en/library/signatories-2022-strengthened-code-practice-disinformation [https://perma.cc/9VPL-5CZL].

  105. ^ Minow & Minow, supra note 99, at 430.

  106. ^ Id. at 442.

  107. ^ See, e.g., Gilad Edelman, How Facebook Could Break Free from the Engagement Trap, WIRED (Nov. 19, 2021, 7:00 AM), https://www.wired.com/story/jeff-allen-interview-facebook-engagement-trap [https://perma.cc/2Q67-N523].

  108. ^ See Brandeis, supra note 1, at 10.

  109. ^ See Dylan Moses, The Next Era of Tech Accountability? Maybe., Berkeley L.: The Network (Nov. 16, 2021, 8:00 AM), https://sites.law.berkeley.edu/thenetwork/2021/11/16/the-next-era-of-tech-accountability-maybe [https://perma.cc/VPT3-SY2E].
