The Yale Law Journal

Volume 117, 2007-2008
Forum

Don’t Censor Search

09 Sep 2007

When search engines lead thousands of searchers to anonymous online harassment, it may seem only natural to look for legal ways to make the harassment disappear from search results. This initially attractive idea is in fact deeply dangerous. It pressures the wrong intermediary, invites abuse by spammers and censors, and misunderstands the relationship between search engines and search users. Search-engine amplification is part of the problem of online harassment, but laws targeting search engines are the wrong solution.

People have always been jerks; the Internet lets them be jerks on an unprecedented scale by combining anonymity and public visibility. Many proposals for cleaning up some of the messes jerks create online focus on Web site operators, since they often have the technical power to unmask or muzzle harassers. Some ideas would require site operators to make their services less anonymous. Other ideas would encourage site operators to be more responsible by limiting section 230 of the Communications Decency Act of 1996, which currently shields Web site operators from liability for the speech torts of their users.

But Web site operators are not the only intermediaries in this picture. The Web is not a megaphone; Web pages are not just blasted to millions of horrified recipients. The millions of horrified recipients choose which Web pages they read. Increasingly often, they choose by using a search engine. Perhaps Alice runs a search on Bob’s name, then clicks on a search result that brings her to a Web page calling Bob a rapist, a robber, and a regicide. Sometimes Alice really is interested in seeing Bob called a king-killer, but more often, Bob simply doesn’t have much of an online presence, so that the libelous accusation bubbles to the top of her search results. Search engines amplify the hate.

This observation suggests a new way to suppress unwanted anonymous speech. Like Web site operators, search engines currently have broad immunity under section 230. Taking away some of the immunity or giving them new duties could hide harassment even without unmasking speakers or shutting down their Web hosts. I have seen three variations of this idea mooted.

First, and least intrusively, Frank Pasquale has proposed a kind of right of reply at the level of search results. If a high-ranking result in a search on Bob’s name is the regicide calumny, Pasquale would give Bob the right to make the search engine place an asterisk on its results page. The asterisk would hyperlink to a Web page where Bob sets the story straight.

Second, there have been calls to create a notice-and-takedown procedure, similar to the one in place for copyright infringement, to cover other torts. Web site operators who receive notice that something posted on their site is harmful could face liability for it unless they took it down promptly. This idea could also be applied to search engines: they would retain their immunity from defamation liability for the contents of search results only so long as they removed those results upon receiving proper notice from the defamed victim.

Third, Orin Kerr has suggested pressuring Web sites to make harassment less searchable. His proposal would revoke a Web site’s section 230 immunity with respect to anything it allows search engines to index. (There are several common protocols used by Web sites to request that search engines not index particular pages. The legal effect of these protocols is unclear, but the major search engines all respect these opt-out requests.) A site operator would have to choose between taking full responsibility for content on her site or keeping it out of search engines.
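To make the mechanics concrete: the best known of these opt-out protocols is the Robots Exclusion Protocol. A site operator can publish a robots.txt file at the root of her domain, or place a robots meta tag on an individual page, and the major search engines will honor the request. A minimal sketch (the example.com domain and the /private/ directory are, of course, only illustrative):

    # robots.txt, served at http://example.com/robots.txt
    User-agent: *        # applies to all search-engine crawlers
    Disallow: /private/  # ask crawlers to stay out of /private/

    <!-- or, in the head of an individual HTML page: -->
    <meta name="robots" content="noindex">

Under Kerr's proposal, publishing a request like this would be the price of keeping section 230 immunity for the covered pages.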

These ideas are attractive because search engines are so powerful. But they all depend on a flawed conception of what search does. Search engines aren’t megaphones, any more than Web sites are. Good search tools help users find the information they want, not the information that others want them to find. Search can help individuals move from being passive consumers of information to active seekers of it. This shift is important for human autonomy, since the ability to locate the information we need is central to our ability to make decisions for ourselves. It is also economically important; search enables more efficient exchange of information goods, and thereby catalyzes a virtuous cycle of creativity. The obvious lesson is that search is too important to muck up, so we should be cautious about doing anything that might.

There is also, however, a subtler and more important point. Good search favors active users, and good information policy does, too. Interventions that make search less useful make it harder for search users to lead self-directed lives and to be real participants in democratic politics and democratic culture. Crippling search can give content creators and third parties unwarranted power over search users—that is, over most of us who use the Internet. We need to analyze any serious proposal for a change in Internet law for its effects on the search ecosystem. Will it improve the searchability of the Net, or inhibit it?

The proposals above do not fare well under such analysis. Pasquale’s proposal, by mandating what would appear on search results pages, would hinder users’ ability to choose among diverse search engines. A notice-and-takedown system would make some information unfindable. Kerr’s idea would cause many site operators to withdraw from the commons of the searchable Internet. All three are steps down a potentially slippery slope to holding search engines responsible for any harmful speech online; at some point along that slope, search engines will simply stop indexing any unknown content—thereby destroying the general-purpose Internet-wide search that has fueled the explosive growth of the Web in the last decade.

Fundamentally, search engines don’t want to mislead their users with half-truths and libel; they want to present information in a useful context. Ripping information out of searchability (whether through notice-and-takedown or Kerr’s opting-out proposal) is a step in the wrong direction. Making information unsearchable throws out any babies the bathwater may contain. Pasquale’s idea of annotation recognizes this point; it answers bad speech with more speech. Even so, mandating one kind of corrective annotation might inhibit the development of better, more helpful responses—such as personalized search based on the recommendations of one’s friends, or semantic analyses that can automatically put Web pages in a broader context.

Another problem with mandating which results appear in searches and how those results are presented is that such interventions create opportunities for gamesmanship at an incredibly influential choke point. Notice-and-takedown regimes are especially vulnerable to abuse, as we have seen with overreaching DMCA notices. Extending this system to search and beyond copyright could open the floodgates to the routine use of search censorship as a weapon in flamewars and other online disputes. The problem would be especially severe because search engines have no preexisting relationship with most site operators, making it much harder to set up a counternotice system to restore search results mistakenly deleted.

Even Pasquale’s annotation proposal creates an opportunity for abuse—in this case, to muscle one’s way onto search results pages. Would George W. Bush be able to asterisk every progressive blog? The search equivalent of the commercial spam industry is a place of mind-boggling tenacity and creativity. There are people who would consider changing their name to “Cheap Mortgage” or “Discount Viagra” if it would get them on the first page of search results for a popular enough phrase.

Suppressing search can also be a way of engaging in censorship without admitting it. Allowing content to stay up online seems to permit the excuse that everyone is still being allowed to speak. But the proper view of search tells us that the focus should be just as much on listeners’ ability to hear. If the problem is that people who don’t want to find harassing content are coming across it by accident, then search engines have every incentive to fix this problem on their own. If not, then it is hard to say with a straight face that something should be available online but that people who want to find it shouldn’t be able to. Whether search engines were to delete results or site owners were to opt out of being searchable, search users’ free speech and autonomy interests in finding information would be frustrated.

Taken together, these concerns warn us that tampering with search is a second-best response. If the content really is sufficiently harmful that its suppression is justified, it would be better to target the owner of the site where it appears. She will typically have better information about what the content is, whether it is false, and who is responsible for creating it. Putting burdens on the search engine rather than on her inhibits search, to the harm of the general public. Harassing and hateful online speech is a real problem, but to blame search engines is to shoot the messenger.

James Grimmelmann is Associate Professor of Law at New York Law School. He thanks Bradley Areheart and Nathaniel Gleicher for their comments.

Preferred Citation: James Grimmelmann, Don’t Censor Search, 117 Yale L.J. Pocket Part 48 (2007), http://yalelawjournal.org/forum/dont-censor-search.