Monday, May 21, 2012
Some Skepticism About Search Neutrality
Posted by Daniel A. Crane
Would it be a good idea for antitrust law to require dominant Internet search engines to be “neutral” in listing their organic search hits, where “neutrality” would prohibit the search engine from giving preference to its own or affiliates’ websites? In two forthcoming symposium essays, I argue that it would be a very bad idea. To be clear, I do not argue that dominant search engines like Google in the U.S. and Europe, Baidu in China, and Yahoo in other parts of Asia should have complete immunity from antitrust law for the way they implement their search engines. However, a general principle of search neutrality is unsupported by antitrust principles and would be a disaster for search engine innovation.
My argument is in two parts. First, the argument for a search neutrality principle applicable to Google (for now, I’ll stick with Google) rests on two falsifiable empirical claims: (1) that Google is dominant in Internet search; and (2) that Google is able to leverage its dominance in Internet search to distort competition in Internet commerce. The argument may founder on proposition (1). Studies have shown that most users are willing to multi-home—to use several search engines or switch from one to another. If, as Google claims, competition is always just “one click away” and users are willing to click, then it seems unlikely that Google should be considered dominant in Internet search.
But assuming for the sake of the argument that Google is dominant in search and that leveraging into an adjacent website is, in theory, a rational business move if Google can pull it off, one may ask whether this vision has any correlation with reality. Just because a search engine is dominant vis-à-vis other search engines, it does not necessarily have the power to promote or demote adjacent websites to its advantage in a way that seriously affects the overall competitiveness of the adjacent market. This would only be true if search engines were indispensable portals for accessing websites. They are not. Users reach websites from many origins other than search engines—for example, bookmarks, links on other websites, or links forwarded in e-mails. Even dominant search engines account for a relatively small percentage of traffic origins.
For example, when Google acquired the travel-search software company ITA in 2011, rivals complained that Google would use its dominance in search to steer consumers to a Google travel site instead of rival sites like Expedia, Travelocity, and Priceline. But even if Google did that, it is hard to imagine that this could be fatal to rival travel search sites. According to compete.com data, only a small volume of traffic into the three big travel search sites originated with a Google search—12% for Expedia and 10% for Travelocity and Priceline. The percentage of traffic to Yahoo! Travel and Bing Travel (Microsoft’s service) originating with Google is even smaller—7% and 4% respectively.
Google’s relatively low share in search referral is not limited to the travel sites. It includes news sites like The New York Times, the Huffington Post, Foxnews.com, and Politico, where the percentage of incoming traffic from Google is 20%, 6%, 11%, and 12% respectively. It also includes social media sites like Facebook and Twitter, where Google’s search referral share has recently been around 10-11%. In many cases, other search engines, websites, or services are significantly more important in referring traffic than Google. The Drudge Report refers twice as many readers to Politico as Google does (24% to 12%), Yahoo refers more readers to Facebook than Google does (11% to 10%), and Facebook refers more than twice as many users to Twitter as Google does (27% to 11%).
In my forthcoming essays, I discuss some reasons to be careful with these data. Still, I have not yet seen a convincing case that search dominance leads to referral dominance. A major flaw in the monopoly leverage story is that even if a particular search engine were dominant as a search vehicle, search engines are not necessarily dominant when it comes to reaching websites. In most cases, a critical mass of users know where they want to go without conducting a search. Manipulation of a search engine to favor particular sites might induce more traffic to visit those sites, but it seems unlikely that it could foreclose customers from reaching competitive sites.

My second argument is that a general search neutrality principle would freeze search engine innovation. Most of the arguments in favor of search neutrality seem to imagine Internet search as it was five or ten years ago, when a search engine’s job was to return a list of ten blue links. That vision is outdated. Increasingly, search engines are providing not merely intermediate information but ultimate information—the answers themselves. Or, if the search engine remains a step removed from the ultimate information, it is integrated with the ultimate information. Increasingly, it is not accurate to speak about search engines and websites as distinct spaces, or about the relationship between search and content as vertical. The lines are quickly blurring.
This progression is driven by users’ own preferences. As the head of Yahoo Labs, who directed Yahoo’s search strategy, explained in 2009, “[p]eople don’t really want to search. Their objective is to quickly uncover the information they are looking for, not to scroll through a list of links to Web pages.” Consequently, search engines are no longer focusing just on document retrieval. Instead, they are working toward direct question answering. By figuring out the intent of the person conducting the search and then displaying all the related content that he might want to see, search engines are shifting away from the paradigmatic ten blue links toward a world of richer results.
“Neutrality” is an incomprehensible and undesirable principle in this context. If a search engine’s algorithm determines that a user is probably asking for directions to a country club, the best answer is not to display a set of links about country clubs but to display a map with navigation functionality. That map will often be a proprietary function of the search engine, an opportunity for the search engine to display more advertising and hence further monetize its Internet presence. Displaying a proprietary map in response to a search query isn’t “neutral,” but prohibiting it would lock the search engine into a dated and uninformative way of responding to a user’s query. A general principle of search neutrality would be a disaster for innovation.