Antitrust & Competition Policy Blog

Editor: D. Daniel Sokol
University of Florida
Levin College of Law

A Member of the Law Professor Blogs Network

Monday, May 21, 2012

Is there a basis in antitrust law for requiring ‘neutral’ search results - Comments of James Grimmelmann

Posted by James Grimmelmann

The heart of the gathering antitrust case against Google appears to be that it sometimes "manipulates" the order in which it presents search results, in order to promote its own services or to demote competitors. The argument has intuitive appeal in light of Google's many representations that its rankings are calculated "automatically" and "objectively," rather than reflecting "the beliefs and preferences of those who work at Google." But as a footing for legal intervention, manipulation is shaky ground. The problem is that one cannot define "manipulation" without some principled conception of the baseline from which it is a deviation. To punish Google for being non-neutral, one must first define "neutral," and this is a surprisingly difficult task.

In the first place, search engines exist to make distinctions among websites, so equality of outcome is the wrong goal. Nor is it possible to say, except in extremely rare cases (such as, perhaps, "4263 feet in meters") what the objectively correct best search results are. The entire basis of search is that different users have different goals, and the entire basis of competition in search is that different search engines have different ways of identifying relevant content. Courts and regulators who attempt to substitute their own judgments of quality for a search engine's are likely to do worse by its users.

Neutrality, then, must be a process value: even-handed treatment of all websites, whether they be the search engine's friends or foes. Call this idea "impartiality." (Tarleton Gillespie suggested the term to me in conversation.) The challenge for impartiality is that search algorithms are in constant flux: search engines continually retune how they make distinctions among websites (Google alone makes hundreds of changes a year).

A strong version of impartiality would be akin to Rawls's veil of ignorance: algorithmic changes must be made without knowledge of which websites they will help and hurt. This is probably a bad idea. Consider the DecorMyEyes scam: an unethical glasses merchant deliberately sought out scathing reviews from furious former customers, because the attention _qua_ attention boosted his search rank. Google responded with an algorithmic tweak specifically targeted at websites like his. Strong impartiality would break the feedback loops that let search engines find and fix their mistakes.
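The mechanics of such a targeted fix can be sketched with a toy scoring function. This is purely illustrative; the feature names, weights, and threshold are hypothetical and are not Google's actual ranking signals:

```python
# Toy illustration of a ranking tweak aimed at DecorMyEyes-style sites.
# All feature names and weights are hypothetical.

def base_score(page):
    # Attention-driven signal: inbound links raise the score regardless
    # of whether the linking reviews are positive or negative.
    return page["relevance"] + 0.5 * page["inbound_links"]

def patched_score(page):
    # Targeted fix: demote pages whose link attention comes overwhelmingly
    # from negative merchant reviews. A change like this cannot be written
    # behind a "veil of ignorance" -- it exists precisely because the
    # engine knows which kind of site it will hurt.
    score = base_score(page)
    if page["negative_review_share"] > 0.8:
        score -= 0.4 * page["inbound_links"]
    return score

scammer = {"relevance": 0.3, "inbound_links": 10.0, "negative_review_share": 0.95}
honest = {"relevance": 0.6, "inbound_links": 2.0, "negative_review_share": 0.1}

# Under the base ranking, attention alone puts the scam site on top;
# the targeted tweak reverses that ordering.
assert base_score(scammer) > base_score(honest)
assert patched_score(scammer) < patched_score(honest)
```

The point of the sketch is that the fix is deliberately non-impartial: it was written with full knowledge of which class of website it would demote, which is exactly what strong impartiality would forbid.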

Instead, then, the anti-manipulation case hinges on a weaker form of impartiality, one that prohibits only those algorithmic changes that favor Google at the expense of its competitors. Here, however, it confronts one of the most difficult problems of high-technology antitrust: weighing pro-competitive justifications and anti-competitive harms in the design of complicated and rapidly changing products. Many self-serving innovations in search also have obvious user benefits.

One example is Google's treatment of product-search sites like Foundem and Ciao. Google has admitted that it applies algorithmic penalties to price-comparison sites. This may sound like naked retaliation against competitors, but the sad truth is that most of these "competitors" are threats only to Google's users, not to Google itself. There are some high-quality product-search sites, but also hundreds of me-too sites with interchangeable functionality and questionable graphic design. When users search for a product by its name, these me-too sites are trying to reintermediate a transaction that has very little need of them. Ranking penalties directed at this category share some of the pro-consumer justification of Google's recent moves against webspam.

A slightly different practice is Google's increasing use of what it calls Universal Search, in which it offers news, image, video, local, and other specialized search results on the main results page, intermingled with the classic "ten blue links." Since Google has competition in all of these specialized areas, Universal Search favors Google's own services over competitors'. Universal Search is an obvious departure from neutrality, whatever your baseline. But is it bad for consumers? The inclusion of maps and local results is an overwhelming positive: it saves users a click and helps them get the address they're looking for more directly. Other integrations, such as Google's attempts to promote its Google+ social network by integrating social results, are more ambiguous. Some integration rather than none is almost certainly the best overall design, and any attempt to draw a line defining which integration is permissible will raise sharp questions about regulatory competence.

Some observers have suggested not that Google be prohibited from offering Universal Search, but that it be required to modularize the components, so that users could choose which source of news results, map results, and so on would be included. This idea is structurally elegant, but in-house integration also has important pragmatic benefits. Google and Bing don't just decide _which_ map results to show, they also decide _when_ to show map results, and what the likely quality of any given map result is compared with other possible results. These comparative quality assessments don't work with third-party plugin services.
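The when-versus-which distinction can be made concrete with a toy blending function. The numbers, threshold, and function names here are hypothetical, not any engine's real logic:

```python
# Toy illustration of the integration problem: deciding *whether* to show
# a maps block requires comparing the vertical's quality score against the
# organic web results' scores on a shared, calibrated scale.

def blend_results(web_scores, map_scores, map_threshold=0.7):
    """Show the maps block only when its best result is competitive
    with the best organic web result (hypothetical rule)."""
    best_map = max(map_scores, default=0.0)
    best_web = max(web_scores, default=0.0)
    if best_map >= map_threshold * best_web:
        return ["maps_block"] + ["web"] * len(web_scores)
    return ["web"] * len(web_scores)

# For a query like "pizza near me", the in-house map score is competitive,
# so the maps block is shown first.
print(blend_results([0.6, 0.5], [0.9]))
# For "history of pizza", map results score poorly; show only web results.
print(blend_results([0.9, 0.8], [0.2]))
```

A third-party map plugin could supply *which* results to show, but its internal scores would not be calibrated against the engine's own scale, so the engine could not decide *when* showing them beats the alternatives.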

It makes sense for general-purpose search engines to turn their expertise as well to specialized search. Once they do, it makes sense for them to return their own specialized results alongside their general-purpose results. And once they do that, it also makes sense for them to invite users to click through to their specialized subsites to explore the specialized results in more depth. All of these moves are so immediately beneficial to users that regulators concerned about Universal Search should tread with great caution.

For more on these issues, see my papers Some Skepticism About Search Neutrality, The Google Dilemma, and The Structure of Search Engine Law.

http://lawprofessors.typepad.com/antitrustprof_blog/2012/05/is-there-a-basis-in-antitrust-law-for-requiring-neutral-search-results-comments-of-james-grimmelman.html



Comments

In my opinion, "WHAT" Google is doing is not necessarily wrong. I also believe in high-quality sites with a high level of service. For me, the primary issue with Google is "HOW" they go about executing their changes or penalties.

Let us take one example:

Price comparison engines were hit hard during the last update. Many of them had no idea they would get hit until the lightning bolt struck.

Google loves these surprise factors, but unfortunately they create mass confusion, fear, and extremely strong resentment. Many business leaders compare Google to an elephant that has one set of teeth to show and another to chew!

The second issue with Google is that they have cornered themselves in a box with respect to Google Shopping.

Google Shopping is nothing more than a price comparison engine. If Google truly believed that price comparison engines do not add value, then they should not have their own price comparison engine, right?

So instead of getting rid of their own price comparison engine, Google tried hard to get rid of competitors' engines (associating them with low quality) while keeping theirs free. Now, with competitors almost knocked out, Google throws yet another surprise: they now say they will convert their price comparison engine to a paid service, as it adds more value if it's a paid service. Go figure! Because of this, many retailers are now comparing Google to a "drug dealer" who gets you hooked on a drug first and then starts charging for it. This kind of surprise again creates a lot of resentment and distrust among peers, competitors, and channel partners.

So the entire backdrop of the antitrust lawsuit against Google was created by Google itself, by appearing deceptive.

I do believe it will be very difficult to prove the case here, and ultimately the US Government will settle, but my hope is that after this episode, Google learns to be transparent.

In my opinion, Google needs to be very clear about the "Quality Proposition" it expects from other websites. Google should also proactively and transparently notify webmasters about what kinds of penalties are being enforced and why.

This would allow the internet community to move forward with Google as a leader in quality. Under current practice, Google simply penalizes a website without its owners even knowing they have been punished (as with many price comparison engines), and by the time they find out, it's usually too late. This kind of Wild West practice must end for Google to again become a trusted leader with a renewed commitment to doing "No Evil!"

Posted by: Raj Desai | Jul 25, 2012 8:35:05 AM
