First Amendment: Speech Response 127 Harv. L. Rev. F. 325

The Brave New World of Social Media Censorship

How "terms of service" abridge free speech

Response To: Marvin Ammori, The “New” New York Times: Free Speech Lawyering in the Age of Google and Twitter, 127 Harv. L. Rev. 2259 (2014)


Thanks to Marvin Ammori for a perceptive overview of the seismic shift in free speech policymaking over the past two decades. Today, as Ammori points out, private companies that run social media sites and search engines are the main arbiters of what gets communicated in the brave new world of cyberspace. And despite their good intentions and their claims to a free-speech-friendly philosophy, these companies employ “terms of service” that censor a broad range of constitutionally protected speech.

I will begin by offering a few examples; then I will address the two laws that are most critical to online expression: section 230 of the Communications Decency Act1 (“CDA”) and section 202 of the Digital Millennium Copyright Act2 (“DMCA”) (generally known as section 512, after its location in title 17 of the U.S. Code). I will conclude with a note on the continuing importance of New York Times Co. v. Sullivan3 and a thought on how the new arbiters of speech might improve on that much-celebrated decision.


“Facebook,” as Jeffrey Rosen has said, wields “more power [today] in determining who can speak . . . than any Supreme Court justice, any king or any president.”4 Facebook’s “Statement of Rights and Responsibilities” provides: “You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence.”5 Within Facebook’s broad ban, it’s true, reside a few categories of speech that the First Amendment does not protect: threats, incitement, and some subcategory of pornography that might be constitutionally unprotected obscenity under a local community standard.6 But there is no judicial determination of illegality – just the best guess of Facebook’s censors. Facebook’s internal appeals process is mysterious at best.7

And most of what Facebook proscribes is protected by the First Amendment. Its power to suppress nude images, to take one example, has a huge potential impact on communications about visual art — including cinema and photography — as well as sex education and discussions of sexual politics.8 Similarly, judgments about what content is gratuitously violent or hateful toward a religious or ethnic group can vary widely, and the result will be subjective and unpredictable censorship of literature, art, and political discussion.

Professor Ammori tells us that Facebook lawyers have created “a set of rules that hundreds of employees can apply consistently without having to make judgment calls.”9 The details of these rules, however, we do not know. Unlike censorship decisions by government agencies, the process in the private world of social media is secret.

It is probably true that Facebook has a First Amendment right to censor whatever it wants in order to maintain the kind of social space it wants. Facebook is not, and arguably should not be considered, a common carrier, and thus it should not be forced into a legal straitjacket that would prohibit content-based terms of service. As several students in my 2013 censorship class at NYU pointed out when we debated this issue, it is a competitive social-media world out there, and if people are unhappy with Facebook’s censorship policies — or YouTube’s, or Tumblr’s, or Twitter’s, for that matter — they will gravitate to another site.

I suppose the same might be said about search engines, a business long dominated by Google. Arguably, people will move elsewhere if they find Google’s results too censorious. But first they have to know what is being censored. In addition, there is a big distinction between a social-media site that has a certain claim to its own character and tone, and a search engine that is basically a sophisticated mechanical tool. Yet Google’s “Custom Search — Terms of Service” page for the United Kingdom, which applies to U.K. websites that use Google’s search engine to provide a website-specific search tool, prohibits any “pornographic, hate-related, violent or [ — a frequent catch-all category — ] offensive content.”10 Deciding what is “offensive” is up to Google.

Google censors general searches as well. Its default is the “moderate” setting of an Internet filter called SafeSearch, described as “designed to screen sites that contain sexually explicit content and remove them from your search results.” Google assures us that “[w]hile no filter is 100% accurate, SafeSearch helps you avoid content you may prefer not to see or would rather your children did not stumble across.”11 As anyone familiar with Internet filtering knows, “not 100% accurate” is a large understatement,12 and in any event, what business is it of a search engine provider to decide that its huge diversity of users “may prefer not to see” what its bots and spiders determine, based on flawed technology, is “sexually explicit”?

In December 2012, Google made it impossible to entirely disable SafeSearch in the United States, although it claimed that users could still access sexually explicit content if they made their search requests more specific, for example, by including the word “porn.”13 But what if you are searching for explicit sex education information, not porn? You will not know what SafeSearch is blocking. There is good reason, I think, for Congress or the FCC to prohibit search engines from imposing filters; instead, filters should be available on request. The FCC could categorize search engines as common carriers, just as it should categorize Internet Service Providers as common carriers, which would oblige them to accept all content and refrain from censorship.14


As Professor Ammori explains, section 230 of the CDA immunizes all Internet users who disseminate content not of their own creation from liability for defamation, invasion of privacy, and virtually everything else except violations of intellectual property.15 Thus, social media sites like Facebook and search engines like Google do not have to censor anything. In fact, one major aim of section 230 is to discourage private-industry censorship, so that free speech can prevail on the Internet, and those actually responsible for criminal or tortious speech, rather than the pipelines through which they communicate, can be prosecuted or sued. But section 230 does not prohibit private censorship; instead, it affirmatively allows it,16 and therein lies the rub: vague, broad terms of service, applied by powerful companies like Facebook with no transparency and no clear avenues for appeal.

As Ammori points out, there is an additional problem: Congress’s special treatment of intellectual property.17 Section 512 of the DMCA gives websites, chatrooms, and other online providers a safe harbor from liability for copyright infringements committed by their users, but only if they comply with take-down notices that do not require any advance judicial determination, and that are often mass-generated by spiders and bots employed by the entertainment industry or its hired hands.18 The assertedly infringing content must be taken down “expeditiously,”19 and only reinstated, after ten days, if the person who posted it files a counter-notice.20 Section 512 was a legislative gift to the media industry: it bestows the power to suppress online content for at least ten days and often permanently, simply on the basis of a demand letter, with no determination whether the company even owns the copyright, whether it applies to the content in question, or whether there might be a defense to copyright infringement, such as fair use.

Section 512 gives a strong incentive to social media sites, search engines, and the like to remove content even though it is questionable whether they would be liable for copyright infringements by their users in the first place. There is not much case law on this issue. The existing precedents are inconsistent, but several of them hold that service providers are not liable.21 The case against liability is even stronger for search engines that simply provide links.22 Intellectual property disputes have been a major free-speech issue at least since the advent of the Internet, and section 512, by encouraging censorship, tips the scales too heavily against free speech.


Finally, with free speech policymaking having shifted to private companies, to legislation like sections 230 and 512, and, as Professor Ammori argues, to international forums, what role is left for the U.S. courts, and for landmarks like New York Times Co. v. Sullivan? Quite a bit, I maintain.

First, as Ammori notes, today’s generation of social media lawyers is inspired by the spirit of Times v. Sullivan, a spirit that extends well beyond the holding of the case. Justice Brennan’s stirringly quotable words about the “uninhibited, robust, and wide-open”23 debate essential to democracy have become, like other great Supreme Court perorations, a part of our literature and culture.

But Times v. Sullivan has its weaknesses; as Justices Black and Douglas pointed out in their concurrence, the “actual malice” test invites intrusive, lengthy, and expensive discovery into the notes, the confidential sources, and the states of mind of reporters and editors.24 Black and Douglas made a persuasive argument for absolute immunity from liability for defamatory statements about public officials. So, we can honor Sullivan even while improving upon it. State legislatures can go beyond “actual malice” and simply eliminate liability for defamation of public officials. Our social media policymakers can make clear that they do not censor potentially defamatory content if it involves criticism of government. Indeed, they can extend this policy to criticism of public figures, just as Sullivan was eventually extended.25 And they can extend this policy to other torts, such as infliction of emotional distress, as the principles of Sullivan were extended when Hustler magazine published a crude cartoon mocking the pretensions of the evangelist Jerry Falwell.26

Today, of course, it is sometimes difficult to criticize government policy because we do not know what the policy is. Disclosures by whistleblowers Chelsea Manning and Edward Snowden have revealed secret torture practices, extrajudicial murders through secret drone strikes, and a massive system of secret government surveillance.27 Social media sites can play a positive role in the “unique and wholly new medium of worldwide human communication”28 that is the Internet not only by trimming censorship policies to a minimum but also by fostering robust and wide-open access to the information necessary for a functioning democracy.

* Marjorie Heins is the author, most recently, of Priests of Our Democracy: The Supreme Court, Academic Freedom, and the Anti-Communist Purge (2013).

1. Pub. L. No. 104-104, § 509, 110 Stat. 56, 137 (1996) (codified at 47 U.S.C. § 230 (2006 & Supp. V 2011)).

2. Pub. L. No. 105-304, § 202, 112 Stat. 2860, 2877 (1998) (codified as amended at 17 U.S.C. § 512 (2012)).

3. 376 U.S. 254 (1964).

4. Miguel Helft, Facebook’s Mean Streets, N.Y. TIMES, Dec. 13, 2010, at B1.

5. Statement of Rights and Responsibilities, FACEBOOK (last updated Nov. 15, 2013). Similarly, the “Community Standards” page states: “[W]e do not permit individuals . . . to attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition.” Facebook Community Standards, FACEBOOK (last visited May 22, 2014).

6. See Miller v. California, 413 U.S. 15, 36–37 (1973) (holding, in pertinent part, that obscenity is to be determined by “community standards,” not “national standards”).

7. See, e.g., Facebook, GETHUMAN (last visited May 22, 2014) (containing comments of confused and frustrated Facebook users unable to navigate the appeals process); How Can I Appeal a Decision to Reject a Promoted Post?, FACEBOOK (last visited May 22, 2014) (providing no answer to the user’s request related to Facebook’s appeals process).

8. See, e.g., Lee Rowland, Naked Statue Reveals One Thing: Facebook Censorship Needs a Better Appeals Process, ACLU (Sept. 25, 2013) (describing the ACLU’s experience in appealing Facebook’s removal of a photo of a nude statue that was the subject of a censorship dispute in Kansas); see also Svetlana Mintcheva, post to Free Expression Network (Feb. 7, 2010) (in author’s files) (describing a decision by Network Solutions to eject The File Room censorship archive because of a Nan Goldin photograph; the photograph in question shows two little girls playing, one of whom is naked and whose vulva can be seen. A link to the photograph on the File Room website has been deleted from this footnote, over the strenuous objection of the author, on the advice of counsel for the Harvard Law Review. Author’s note: that a link to an innocent photograph by one of the country’s major artists should be censored is evidence of both the danger and the absurdity of confusing images of children’s bodies with child pornography).

9. Marvin Ammori, The “New” New York Times: Free Speech Lawyering in the Age of Google and Twitter, 127 HARV. L. REV. 2259, 2278 (2014).

10. Google Custom Search: Terms of Service, GOOGLE UK (last visited May 22, 2014). The analogous page for U.S. sites provides: “the Site shall not contain any pornographic, hate-related or violent content or contain any other material, products or services that violate or encourage conduct that would violate any criminal laws, any other applicable laws, Service policies, or any third party rights.” Google Custom Search: Terms of Service, GOOGLE (last visited May 22, 2014).

11. Google’s Safety Tools, GOOGLE (last visited May 22, 2014).

12. See, e.g., MARJORIE HEINS, CHRISTINA CHO & ARIEL FELDMAN, INTERNET FILTERS: A PUBLIC POLICY REPORT (2006); see also id. at 65 (providing specific information on Google’s SafeSearch).

13. See Casey Newton, Google Tweaks Image Search to Make Porn Harder to Find, CNET (Dec. 12, 2012, 3:41 PM); Matthew Panzarino, Google Tweaks Image Search Algorithm and SafeSearch Option to Show Less Explicit Content, THENEXTWEB (Dec. 12, 2012, 8:36 PM); Josh Wolford, Google No Longer Allows You to Disable SafeSearch, and That Makes Google Search Worse, WEBPRONEWS (Dec. 16, 2012). Thanks to NYU student Rebecca Suss, whose term paper for my fall semester 2013 class alerted me to Google’s decision to make it impossible to disable SafeSearch.

14. Cf. Verizon v. FCC, 740 F.3d 623, 628 (D.C. Cir. 2014) (striking down parts of an FCC order to create “net neutrality” on the basis that the FCC had previously categorized certain Internet service providers as exempt from “common carrier” status).

15. See Ammori, supra note 9, at 2262, 2290; 47 U.S.C. § 230(c)(1) (2006 & Supp. V 2011).

16. 47 U.S.C. § 230(c)(2) (“No provider or user of an interactive computer service shall be held liable on account of — (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected . . . .” (emphasis added)).

17. See Ammori, supra note 9, at 2290.

18. LAURA QUILTER & MARJORIE HEINS, INTELLECTUAL PROPERTY AND FREE SPEECH IN THE ONLINE WORLD (2007).


19. 17 U.S.C. § 512(c)(1)(A)(iii); id. § 512(d)(1)(C). For explanation of the various subsections of section 512 and how they apply to different online providers, see QUILTER & HEINS, supra note 18.

20. 17 U.S.C. § 512(g).

21. See QUILTER & HEINS, supra note 18, at 10 n.22.

22. See id. at 9 n.19.

23. New York Times Co. v. Sullivan, 376 U.S. 254, 270 (1964).

24. Id. at 293–97 (Black, J., concurring).

25. Curtis Publishing Co. v. Butts, 388 U.S. 130, 155 (1967) (plurality opinion).

26. Hustler Magazine, Inc. v. Falwell, 485 U.S. 46, 57 (1988).

27. See, e.g., Greg Miller, Julie Tate & Barton Gellman, Documents Reveal NSA’s Extensive Involvement in Targeted Killing Program, WASH. POST (Oct. 16, 2013); Greg Mitchell, A Long List of What We Know Thanks to Private Manning, THE NATION (Aug. 23, 2013); Jeremy Scahill & Glenn Greenwald, The NSA’s Secret Role in the U.S. Assassination Program, THE INTERCEPT (Feb. 10, 2014, 12:03 AM); Peter Walker, Bradley Manning Trial: What We Know from the Leaked Wikileaks Documents, THE GUARDIAN (July 30, 2013); Stephanie C. Webster, Edward Snowden’s 10 Biggest Revelations About the NSA, THE PROGRESSIVE (Jan. 17, 2014).

28. Reno v. ACLU, 521 U.S. 844, 850 (1997) (quoting ACLU v. Reno, 929 F. Supp. 824, 844 (E.D. Pa. 1996)) (internal quotation mark omitted).