We understand why eighty-two percent of Americans regularly use social media.1 Social media platforms offer speedy communication and entertainment at no immediate cost to users. But they also produce problems widely acknowledged by groups that otherwise diverge or even disagree. Democrats and Republicans, adults and teens, national security experts, and parents alike2 have criticized social media because it is addictive, is associated with depression and anxiety,3 propels misinformation and disinformation,4 and poses national security concerns.5 The bullying, unrealistic presentations of body image, and high-risk activity challenges spread through social media sites increase mental health harms for many users.6 Even social media companies acknowledge they face problems that they cannot handle by themselves.7 Amid clashing ideas about government regulation — including some promising ideas, many of which are unlikely ever to become laws — greater and more thoughtful self-regulation by the social media companies would be a wise and productive step. History suggests some missed opportunities. The United States government failed to encourage, and even forbade as anticompetitive, a proposal from the National Association of Broadcasters (NAB) to self-regulate by capping advertisements per hour in the early 1960s.8 That missed opportunity and other self-regulatory efforts offer lessons for this moment, when national governmental action is especially unlikely given partisan divides.9 Hence, the time is ripe for advocacy groups and governmental actors to explore industry self-regulation avenues for social media’s targeted ads and content moderation.
Professor Evelyn Douek recently raised important criticisms of the judicial-style thinking infusing the “Facebook Oversight Board” created by the company now known as Meta.10 Professor Thomas Kadri expanded on this caution by demonstrating how the juridical focus in Meta’s self-regulation entrenches its own power.11 Kadri urges skepticism about any potential contribution of industry self-regulation to the welfare of users, but nonetheless sketches a mix of internal and external regulatory measures to “foster healthier digital environments.”12 Douek identifies how constructive developments, such as transparency reporting, derive from self-regulatory efforts as platform companies seek more legitimacy and responsiveness to users.13 Her analysis also could prompt self-regulatory efforts to pursue systemic, proactive, and innovative approaches, including monitored self-regulation.14
Yes, self-regulation is likely to advance the interests of the companies and benefit incumbents over new entrants, but it also can draw on the knowledge, resources, and flexibility of the private companies.15 Cautions about self-regulation, thus, should not weaken the pressure for action, even if the action takes the form of self-regulation. Self-regulation is far more likely to proceed and to identify effective strategies than are responses from the federal government. Calls for government regulation of social media are both understandable and a surprising point of convergence for people across the political spectrum,16 but they are unlikely to produce actual reforms anytime soon. A majority of Americans responding to surveys favor more attention by elected officials to policy concerns involving technology and tech companies, but Americans across political and racial groups also distrust Congress even more than they distrust the tech companies.17 Government regulation poses severe risks of suppressing speech of both individuals and platform companies as well as harming competition and innovation — even among social media companies that jeopardize the well-being of individuals and their trust in news and communications. One study reports far more confidence in private, independent oversight than in regulation by government or by individual companies.18
A majority of Americans oppose changing existing law to allow individual suits against social media platforms for the content posted by third parties.19 The existing law, Section 230 of the Communications Decency Act of 1996,20 shields the platform companies from the kinds of defamation and fraud suits applicable to news media and publishers while also protecting the good faith moderation and removal of content they view as harmful.21 Intended to promote innovative internet companies, Section 230 has been interpreted to protect platforms from civil liability for leaving content up and also to protect them if they choose to take content down.22 Congress amended the law to permit liability related to content about sex trafficking — but rather than reducing sex trafficking, this change may have made it harder for law enforcement to track it and more difficult for sex workers to obtain information to enhance their safety.23 At the time of writing, the Supreme Court was considering two questions addressing the scope of the platforms’ continuing immunities: Does the immunity persist when a social media company employs recommendation algorithms to target users with certain content posted by third parties?24 And does a separate law — the Antiterrorism Act of 199025 — apply to a social media platform regularly detecting terrorist activity related to third-party content it hosts, or does the Section 230 immunity govern?26
Some want further congressional amendments to Section 230 to permit suits against the platform companies for hosting hateful and harassing speech as well as misinformation related to COVID-19 and elections.27 But even before the midterm elections, congressional action was stalled28 and advocates were pursuing changes to state laws.29 New laws in Florida and Texas would allow actions against social media companies for the censorship of conservative content, but federal courts have at least temporarily halted implementation of both laws.30 Other proposed state laws would require social media companies to adopt methods for reporting hate speech or misinformation, to ensure transparency of moderation rules, and to protect children against social media addiction31 — though even when enacted, such reforms face court challenges from individuals and from the companies.
The companies seek not only freedom from regulation but also protection against divergent rules in different states.32 Because social media’s reach is global, though, the companies already face divergent rules worldwide. The companies, most of them based in the United States, must confront not only contrasting regulations in Europe, the United Kingdom, and Asia but also challenges in overseeing content in different languages and varied political contexts — and competition from companies outside the United States that may not even try to comply with rules here and elsewhere.
Amid these complexities and mounting unease about their industry, social media companies should engage in more vigorous self-regulation not only within their own firms but also through industry-wide organizations. Private-sector collaborations can offer some protection against the biases and self-interest of individual companies (although large and established companies may band together to adopt self-regulatory standards that disadvantage smaller newcomers33 — and “regulatory capture” by dominant companies is a danger even with public regulation34). Self-regulation by voluntary standards, adopted through industry-level private groups, has offered apparently effective responses in other economic sectors.
Through voluntary self-regulation, in contrast to “audited” self-regulation, private industry-level organizations create rules and standards with which individual industry actors voluntarily comply.35 Federal agencies may retain some involvement but typically delegate standard setting and licensing to self-regulatory bodies.36
For example, the Financial Industry Regulatory Authority (FINRA) is a private, nongovernmental organization37 that licenses and audits securities dealers to promote transparency and compliance with ethical standards devised through its own rulemaking process.38 Its members — private organizations and professionals — use the self-regulatory approach to enhance integrity and build trust in the securities industry. The government, through the Securities and Exchange Commission (SEC), can review the disciplinary proceedings conducted by FINRA and also can propose changes in the rules.39 The self-regulatory process, thus, draws on the expertise of the industry, the industry’s own interest in earning and expanding trust of the public, and knowledge and concerns channeled through government. FINRA’s influence is substantial.40 Similar self-regulatory approaches oversee trading in derivatives through the National Futures Association, a national industry-wide effort to protect both investors and markets.41 Self-regulation may create barriers to entry for newcomers or otherwise advance companies’ self-interest, but it also harnesses insider knowledge and companies’ desires to be trusted by the public.
Third-party watchdog organizations help regulate industries by offering seals or certifications for meeting certain standards or by rating or accrediting actors in the industry.42 For example, the Good Housekeeping Institute evaluates products and issues a seal on those that perform as intended.43 The Institute provides refunds to consumers for any product with the seal that a consumer finds defective within two years of purchase.44 Similar approaches have also worked in a range of other industries. Independent private bodies provide accreditation that is central to hospitals as they participate in government funding programs.45 Audited self-regulation through local agricultural boards ensures stable markets for certain agricultural products, and private industry audits verify that fruits and vegetables are produced, packed, handled, and stored to minimize risks of microbial food safety hazards.46 For addressing environmental threats, the International Organization for Standardization plays a large role by developing international standards, which serve as benchmarks for external bodies that certify industries posing environmental threats.47 Certified production improves regulatory compliance and reduces pollution faster than noncertified production does.48
The Federal Trade Commission (FTC) actively encourages and assists private-sector self-regulation efforts.49 Trade institutes governing beer, spirits, and wine have adopted advertising and marketing codes to reduce marketing that reaches underage audiences.50 Alcohol producers in the United States have also signed on to marketing principles established by the International Center for Alcohol Policies.51 And some alcohol producers have created their own internal advertising codes.52 The FTC consulted with the Individual Reference Services Group, a trade association that established and monitored compliance with standards to protect data aggregated across internet and private databases.53 These kinds of self-regulation yield impressive compliance rates54 and address issues not handled by the federal regulatory body.55
The motion picture industry, electronic game industry, and music recording industry each have self-regulatory systems for rating or labeling their products: the familiar G, PG, PG-13, and R notations for movies; the “Mature” rating for games; and the “Explicit” label for music.56 The FTC’s regular reports on the self-regulatory practices of these entertainment industries find substantial compliance with the voluntary standards by the motion picture and game industries and less, but still meaningful, compliance by the music industry.57
What lessons do the experiences of self-regulation in broadcasting offer now? Self-regulation efforts should be carefully crafted to comport with antitrust law — or devised in concert with exemptions to antitrust rules.
The history of the NAB is instructive. Starting in 1952, the NAB devised and revised code of conduct rules for programming and advertising, including content standards and advertising time limits.58 The NAB created and enforced these standards in large part to ensure member compliance with the Federal Communications Commission’s requirement that licensees operate in “the public interest.”59 Yet, in 1979, the Department of Justice filed an antitrust suit against the NAB, alleging that portions of the code of conduct “restricted the available amount of commercial time” and therefore restrained competition in violation of the Sherman Act.60 In the agreement settling the suit, the NAB agreed to rescind its code.61 This experience chilled self-regulatory efforts by broadcasters and led to some broadcasting of ads that the earlier codes would have rejected.62
In other settings, self-regulatory approaches have successfully overcome inadequate information, offered flexible adaptations to changing circumstances, limited costs borne both by industries and governments, engendered a sense of responsibility in private actors,63 and navigated requirements of antitrust laws.64
Voluntary sector-wide self-regulation is especially effective, according to researchers, when there is activist engagement with the industry, an independent monitoring structure, continuing possibilities of enforcement actions outside of the self-regulatory body, pressure from customers and civil society groups,65 threat of private lawsuits, some vertical integration within companies,66 and a strong trade association.67 Social media as a sector holds promise for efficacious sector-wide self-regulation. Again, public approval of private, independent regulation contrasts sharply with public distrust of regulation by government and by individual companies, even when each is compared with the prospect of no government regulation.68 Strikingly, the European Union has constructed a voluntary code regarding disinformation, and a variety of stakeholders — including Microsoft, Google, Twitter, and other companies founded in the United States — have signed on to the code.69
The Netherlands employs a hybrid of private and public regulation: sectoral industry organizations can submit codes of conduct to a government supervisory body for approval.70 The country assigned power to these sectoral organizations to enact legally binding data protection regulations responsive to questions identified by the government. If the government found self-regulation insufficient in a given sector, it could decree sectoral privacy regulations.71 For at least the first nine years of this plan, the government found the codes of conduct sufficient and did not decree other sectoral privacy regulations.72 (But Dutch courts remain able to interpret the Data Protection Act73 for themselves and can find a firm in violation of the statute even if it is acting in compliance with the accepted code of conduct.74)
How could self-regulation proceed with social media? Current and former social media executives warn that their industry harms civil discourse essential to democracy and amplifies misinformation, social divisions, and risks of violence.75 The status quo seems unsustainable.76 Of course, social media platforms and the larger group of digital platform companies are diverse in emphasis, business models, scale, and values.77 Some provide social networking and amplify peer-to-peer communication with algorithmic priority ranking of items and individualized targeting of ads.78 Others are primarily associated with searching information (Google) or sharing videos (TikTok) and photographs (Instagram). Still others — like those in the gaming industry — engage in social media and data collection practices.79 Internet commerce platforms, such as Amazon, provide a meeting ground for sellers and buyers of goods and services using a recommendation algorithm — all while selling their own data storage and tech services.80 Even with this variety, these companies receive and deploy large quantities of information from individuals and groups, use algorithms to create recommendations, sell and display ads to individuals based on analyses of the data they have shared, and afford varied channels for communication between individuals and groups.81 And the lines between the different fields blur as companies acquire businesses (Google owns YouTube82) and experiment with new ventures.
Unlike concerns raised with other industries, the issues raised by digital platform companies can challenge the viability of democracies, spread risks of physical violence, and jeopardize individuals’ mental health even as the platforms offer opportunities to learn, earn a living, and communicate.83 Digital platform companies offer avenues for freedoms of speech,84 access to knowledge, and chances — even for people with very limited resources — to sell their services and goods.85 The scale of these companies can be staggering because they reach people around the globe and enable millions of newly posted content items to circulate each hour.86 Devising and enforcing any common regulations is a potentially impossible task for governments. The particular tasks of moderating and regulating content shareable on social media raise not only fraught dangers of suppressing speech but also potentially unpredictable effects.87 Hence, formulating and enforcing norms are difficult challenges that call for iterative and evolving undertakings by many different actors.88
Focal points for any self-regulatory effort will vary because the businesses vary. But data privacy issues cut across all these companies, and devising some coordinated voluntary practices could prove more efficient and coherent than separately addressing competing rules from the European Union, China, individual states within the United States, and other jurisdictions. Furthermore, the relevant governments might welcome some forms of third-party auditing and other self-regulatory modes given the scale and cost of overseeing and enforcing rules such as the European Union’s General Data Protection Regulation89 and Digital Services Act.90 Collaborations between private industry and government could address workable definitions of unacceptable harassment and strategies to monitor fake social media profiles.
Similarly, protections for children — including their data privacy, guards against targeted advertising, and moderation of content inappropriate for young people — could be a focal point for sector-level self-regulation. Such efforts could bring more fine-grained tools and nimble approaches to areas that are already subject to some governmental regulation and enforcement.91 Privacy professionals identify data protection for children as a global priority that should be taken up by both governments and private-sector actors; they seek to bolster laws in other countries and the self-regulatory approach in the United States.92 The European Union has already directed social media firms to track down and remove content connected with child abuse.93
In the specific context of social media content moderation, social media companies might even collaborate on categories in which to allow outside auditing. Industry groups could join in defining best practices in areas of likely consensus, such as protecting children from violence and preventing extreme harassment.94 They could work to refine the transparency needed for effective fact-checking by third parties, fund and craft crowdsourced verification of accuracy, or work on defining misinformation.95 Self-regulatory efforts would draw on industry knowledge and hold off governmental pressures or provide a roadmap for sensible governmental involvement.96 Financial and competitive pressures — and potential exemption from antitrust enforcement threats — could induce industry collaboration.
Meta’s creation of the Facebook Oversight Board seemed to some a publicity stunt; the company created this board as an external body composed of twenty journalists, academics, and politicians, to which people can appeal content enforcement decisions.97 It has rightly prompted serious criticisms.98 Yet now, after two years of its work, both the energy behind it and the critiques it has elicited could provide helpful predicates for more effective self-regulation. The initial remit of the Board was limited essentially to individual objections to the removal of particular content.99 But by mobilizing — and compensating — distinguished people outside the company to oversee disputes over moderation decisions, the Oversight Board harnessed talented, independent individuals who care about their own reputations. The members of the Oversight Board are pressing for transparency of the social media entity’s rules and practices, exposing failures by the company to follow its own rules, and overcoming company resistance to welcoming and responding to concerns.100
Some critics of the Oversight Board have established an independent alternative effort for monitoring and critique.101 Other social media companies can invent alternative approaches. Multiple private efforts offer the public and governmental actors opportunities to learn. Governments intervene when private-sector enterprises fail to address the harms they produce. Even with federal action in the United States stalled, state governments have started to act — much like governments outside the United States.102 Ongoing calls for new responses offer some grounds for the concern that self-regulation will “lock in” oversight of social media.103 Even governments moving to regulate risks from social media companies find it essential to work with the private enterprises in order to identify and mitigate harms.104 As other social media platforms examine self-regulatory options, collaborations across the sector could bring the kinds of ongoing improvements that have occurred with hospital accreditations, liquor sales, environmental threats, and other sources of serious social risk.
* Newton Minow served as the Chair of the Federal Communications Commission under President John F. Kennedy and was a law firm partner, professor at Northwestern University, Chair of the Public Broadcasting Service, board member of a broadcasting network, and recipient of fourteen honorary degrees and other honors, including the American Bar Association Silver Gavel Award, Chicago Bar Association John Paul Stevens Award, Federal Communications Bar Association Lifetime Achievement Award, American Lawyer Lifetime Achievement Award, and the Presidential Medal of Freedom. He passed away on May 6, 2023, but participated in all but the final editing stages of this piece.
** Martha Minow is the 300th Anniversary University Professor at Harvard University; she serves on the board of the public media company, GBH, and previously served on the board of a private media company. The authors thank Evelyn Douek, Lauren Greenawalt, Andrew Celli, Cosimo Fabrizio, and the editors of the Harvard Law Review for assistance with this Response.