Antitrust & Competition Policy Blog

Editor: D. Daniel Sokol
University of Florida
Levin College of Law

A Member of the Law Professor Blogs Network

Wednesday, April 28, 2010

UCL Centre for Law and Economics conference, 21 & 22 May 2010, Nicosia, Cyprus

Posted by D. Daniel Sokol

The UCL Centre for Law and Economics cordially invites you to a conference on competition law on 21 and 22 May 2010 in Nicosia, Cyprus.

Conference proceedings will cover consumer protection on 21 May 2010 and information exchange agreements on 22 May 2010. For more information on the conference and a programme, please visit the Eventbrite page at http://ucl-cyprus-conference.eventbrite.com/.

Tickets will be available for £95 and the event will be registered for CPD.


Date:
Friday, May 21, 2010 at 5:00 PM
- to -
Saturday, May 22, 2010 at 6:00 PM (GMT+0200)


Location:
Administration Centre, National Bank of Greece
Auditorium 15
Arch. Makarios III Avenue
Nicosia
Cyprus
 

April 28, 2010 | Permalink | Comments (0) | TrackBack (0)

Global Cartels, Leniency Programs and International Antitrust Cooperation

Posted by D. Daniel Sokol

Jay Pil Choi, Michigan State University - Department of Economics, and Heiko A. Gerlach, University of Auckland - Department of Economics, explore Global Cartels, Leniency Programs and International Antitrust Cooperation.

ABSTRACT: In this paper we analyze cartel formation and self-reporting incentives when firms operate in several geographical markets and face antitrust enforcement in different jurisdictions. We are concerned with the effectiveness of leniency programs and the benefits of international antitrust cooperation between agencies. When international antitrust prosecution is uncoordinated, multi-market contact allows firms to reduce the amount of self-reporting in equilibrium and sustain cartels more effectively. We then discuss the effects of information sharing among antitrust authorities as a function of how much and which type of information is exchanged. We show that extensive information sharing might have an adverse effect on self-reporting by cartel members.
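The multi-market contact mechanism the abstract invokes is a standard repeated-game result: when a deviation in one market triggers punishment in all markets, firms can pool incentive constraints across markets. A minimal sketch with grim-trigger punishment and made-up payoff numbers, assumed purely for illustration (this is not the authors' model):

```python
# Toy illustration of multi-market contact sustaining collusion.
# Grim-trigger collusion is sustainable in a market (or a pooled set of
# markets) iff the collusive payoff stream beats a one-shot deviation
# followed by reversion to the static Nash payoff forever.

def sustainable(pi_collude, pi_deviate, pi_nash, delta):
    """Grim-trigger incentive constraint for one market or a pooled set."""
    return pi_collude / (1 - delta) >= pi_deviate + delta * pi_nash / (1 - delta)

delta = 0.55
# Market A: deviation is very tempting; Market B: plenty of slack.
a = dict(pi_collude=10, pi_deviate=25, pi_nash=2)
b = dict(pi_collude=10, pi_deviate=12, pi_nash=2)

print(sustainable(**a, delta=delta))   # collusion fails in A alone
print(sustainable(**b, delta=delta))   # collusion holds in B alone

# Pooling: a deviation anywhere triggers punishment in both markets,
# so payoffs and deviation gains are summed across markets.
pooled = dict(pi_collude=a["pi_collude"] + b["pi_collude"],
              pi_deviate=a["pi_deviate"] + b["pi_deviate"],
              pi_nash=a["pi_nash"] + b["pi_nash"])
print(sustainable(**pooled, delta=delta))   # pooled constraint holds
```

Collusion is unsustainable in market A alone, but pooling the constraint with the slack market B sustains it in both markets - the sense in which multi-market contact lets cartels, as the abstract puts it, sustain themselves more effectively.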



April 28, 2010 | Permalink | Comments (0) | TrackBack (0)

Tuesday, April 27, 2010

The Web Economy, Two-Sided Markets and Competition Policy

Posted by D. Daniel Sokol

David Evans (U Chicago - Law, UCL - Law, LECG) explains The Web Economy, Two-Sided Markets and Competition Policy.

ABSTRACT: The web economy has grown rapidly in the last decade. Online businesses have several key features that are important for understanding the pro-competitive and anti-competitive strategies they may engage in. The two-sided markets literature helps elucidate many of these strategies. It also provides guidance for the antitrust analysis of market definition and exclusionary practices for web-based businesses.

April 27, 2010 | Permalink | Comments (0) | TrackBack (0)

Stolt-Nielsen Supreme Court Case is Out

Posted by D. Daniel Sokol

See here regarding the case involving antitrust arbitration.

April 27, 2010 | Permalink | Comments (0) | TrackBack (0)

Competition with Local Network Externalities

Posted by D. Daniel Sokol

Øystein Fjeldstad, Espen R. Moen, and Christian Riis, all of the Norwegian School of Management, have a new paper on Competition with Local Network Externalities.

ABSTRACT: Local network externalities are present when the utility of buying from a firm depends not only on the number of other customers (global network externalities), but also on their identity and/or characteristics. We explore the consequences of local network externalities within a framework where two firms compete offering differentiated products. We first show that local network externalities, in contrast to global network externalities, do not necessarily sharpen competition. Then we show that the equilibrium allocation is inefficient, in the sense that the allocation of consumers across firms does not maximize social surplus. Finally, we show that local network externalities create a difference between the marginal and the average consumer, which gives rise to inefficiently high usage prices and too high a level of compatibility between the networks.



April 27, 2010 | Permalink | Comments (0) | TrackBack (0)

Media Mergers and Media Bias with Rational Consumers

Posted by D. Daniel Sokol

Simon P. Anderson, University of Virginia (UVA) - Department of Economics and John McLaren, University of Virginia have some interesting findings on Media Mergers and Media Bias with Rational Consumers.

ABSTRACT: We present an economic model of media bias and media mergers. Media owners have political motives as well as profit motives, and can influence public opinion by withholding information that is pejorative to their political agenda - provided that their agenda is not too far from the political mainstream. This is true even with rational consumers who understand the media owners' biases, because the public do not know how much information the news organizations have and so do not know when news is being withheld. In line with conventional wisdom, this problem can be undone by competition; but competition can be defeated in equilibrium by media mergers that enhance profits at the expense of the public interest. We thus derive a motive for media merger policy that is completely distinct from the motives behind conventional antitrust. While media bias may reduce the profit incentives to merge, media markets nonetheless tend to be insufficiently competitive, and the consequences of merger are more severe than in other markets.

April 27, 2010 | Permalink | Comments (0) | TrackBack (0)

Innovation and the Limits of Antitrust

Posted by D. Daniel Sokol

Geoffrey A. Manne, International Center for Law & Economics (ICLE) and Lecturer in Law, Lewis & Clark Law School, and Joshua D. Wright, George Mason University School of Law, explore the important topic of Innovation and the Limits of Antitrust.

ABSTRACT: This paper offers an opportunity to reflect on Frank Easterbrook’s seminal work on the Limits of Antitrust and to discuss its particular relevance to the problem of antitrust enforcement in the face of innovation. The error-cost framework in antitrust originates with Easterbrook’s analysis, itself built on twin premises: first, that false positives are more costly than false negatives because self-correction mechanisms mitigate the latter but not the former, and second, that errors of both types are inevitable because distinguishing pro-competitive conduct from anti-competitive conduct is an inherently difficult task in a single-firm context.

While economists have applied this framework fruitfully to several business practices that have attracted antitrust scrutiny, our goal in this paper is to harness the power of this framework to take an Easterbrookian, error-cost minimizing approach to antitrust intervention in markets where innovation is a critical part of the competitive landscape. While much has been said about the relationship between innovation and antitrust, often in the way of broad pronouncements that innovation either renders antitrust essential to economic growth or entirely unnecessary, the error-cost framework allows for greater precision in policy prescriptions and a more nuanced approach. Some of the implications are well understood in the current body of literature and others have been frequently ignored or remain entirely unrecognized.

Both product and business innovations involve novel practices, and such practices generally result in monopoly explanations from the economics profession followed by hostility from the courts (though sometimes in reverse order) and then a subsequent, more nuanced economic understanding of the business practice usually recognizing its pro-competitive virtues. This sequence and outcome are exactly what one might expect in a world where economists’ career incentives skew in favor of generating models that demonstrate inefficiencies and debunk the Chicago School status quo, while defendants engaged in business practices that have evolved over time through trial and error have a difficult time articulating a justification that fits a court’s checklist of acceptable answers. From an error-cost perspective, the critical point is that antitrust scrutiny of innovation and innovative business practices is likely to be biased in the direction of assigning higher likelihood that a given practice is anticompetitive than the subsequent literature and evidence will ultimately suggest is reasonable or accurate.

Given recent developments in the antitrust enforcement landscape - the identification of innovating firms in high-tech markets as likely antitrust targets, combined with recent discussions of error costs by leading enforcers at the Section 2 Hearings and elsewhere - we hope to begin a more rigorous discussion of the relationships among innovation, antitrust error, and optimal liability rules, one that goes beyond merely selecting economic models that fit regulators’ prior beliefs.

We begin by discussing some principles for application of the error cost framework in the innovation context in Part II before discussing the historical relationship between antitrust error and innovation in Part III. Part IV concludes by challenging the conventional wisdom that the error cost approach implies that the rule of reason should apply to most forms of business conduct rather than per se rules. While we agree that per se rules should not apply to cases involving product or business innovation, broadly defined, we argue that the error cost approach should not require generalist judges to evaluate state-of-the-art economic theory and evidence on a case-by-case basis. Instead, we favor an approach that is consistent with the spirit of Easterbrook’s original analysis, identifying simple filters aiming to harness the best existing economic knowledge to design simple rules that minimize error costs. We conclude with five such proposals for simple rules based on existing economic theory, empirical evidence, and acknowledgment of the institutional biases toward false positives discussed above.

April 27, 2010 | Permalink | Comments (0) | TrackBack (0)

Monday, April 26, 2010

Competition Policy and Financial Distress

Posted by D. Daniel Sokol

Ezra Friedman, Yale University - Department of Economics and Northwestern University - School of Law, and Marco Ottaviani, Northwestern University - Kellogg School of Management, provide thoughts on Competition Policy and Financial Distress.

ABSTRACT: Traditional analyses of competition policy assume that firms operate in perfect credit markets. We argue that imperfections in credit markets should be taken into account, and show one channel by which accounting for financial conditions could alter the welfare effects of a merger. In line with empirical evidence, we posit that the presence of financial distress might diminish price competition by reducing firms' willingness to undertake long-term investments in their customer base. Mergers that reduce the probability of financial distress can induce the merging firms to compete more fiercely for customers, thus partly offsetting the traditional effects of an increase in market power. We use this framework to derive implications for competition policy.

April 26, 2010 | Permalink | Comments (0) | TrackBack (0)

Department of Justice and USDA Announce Poultry Workshop on May 21 in Alabama

Posted by D. Daniel Sokol

The Department of Justice and the U.S. Department of Agriculture (USDA) today announced additional details regarding the public workshop to be held on May 21 at Alabama A&M University in Normal, Ala., to explore competition and regulatory issues in the agriculture industry.


April 26, 2010 | Permalink | Comments (0) | TrackBack (0)

Price Discrimination, Two-Sided Markets, and Net Neutrality Regulation

Posted by D. Daniel Sokol

Dennis Weisman, Kansas State University - Department of Economics and Robert B. Kulick, Navigant Economics have posted Price Discrimination, Two-Sided Markets, and Net Neutrality Regulation.

ABSTRACT: In an October 22, 2009 Notice of Proposed Rulemaking, the Federal Communications Commission posed a number of questions regarding the merits of price discrimination given the two-sided structure of broadband markets. The law and economics literature finds that price discrimination is presumptively welfare-enhancing, that it is frequently a response to competitive market forces rather than the absence of such forces, and that the merits of price discrimination are likely enhanced in a two-sided markets framework. This is the case because the platform provider must use prices to solve the “chicken and egg” problem - both sides of the market must be brought on board under conditions in which the relative valuations placed on the transaction can vary markedly across the two sides of the market. Hence, price discrimination is necessary to unleash the full potential of broadband markets. Another form of conduct of concern to the Commission is access tiering, in which broadband providers market different levels of service quality to content providers. Access tiering is an example of differential pricing rather than discriminatory pricing. Prohibitions on such practices would likely serve to reduce consumer welfare, suppress competition, and discourage investment in network infrastructure.
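The “chicken and egg” logic can be made concrete with a toy linear model of a two-sided platform (our illustration, not the paper's model): participation on each side rises with participation on the other, the platform picks a price for each side, and differential pricing across the two sides strictly beats a single uniform price.

```python
# Toy two-sided platform, illustrative only. Participation equations:
#   n_c = 1 - p_c + A * n_p      (consumers, cross-side weight A)
#   n_p = 1 - p_p + B * n_c      (content providers, cross-side weight B)
# Consumers value content more than content providers value consumers.
A, B = 0.8, 0.2

def participation(p_c, p_p):
    """Solve the two linear participation equations for (n_c, n_p)."""
    d = 1 - A * B
    n_c = (1 - p_c + A * (1 - p_p)) / d
    n_p = (1 - p_p + B * (1 - p_c)) / d
    return n_c, n_p

def profit(p_c, p_p):
    n_c, n_p = participation(p_c, p_p)
    return p_c * n_c + p_p * n_p   # zero marginal cost for simplicity

grid = [i / 100 for i in range(-50, 151)]          # allow subsidies (negative prices)
best_two = max((profit(pc, pp), pc, pp) for pc in grid for pp in grid)
best_one = max((profit(p, p), p) for p in grid)    # forced uniform price
print(best_two)   # optimal prices differ sharply across the two sides
print(best_one)   # a single uniform price earns strictly less
```

In this parameterization the platform optimally charges the two sides very different prices, and forcing a uniform price lowers profit - a toy version of the abstract's point that pricing tailored to each side of the market is how the platform brings both sides on board.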



April 26, 2010 | Permalink | Comments (0) | TrackBack (0)

Efficiency and Market Power Gains in Megabank Mergers

Posted by D. Daniel Sokol

Erik Devos, University of Texas at El Paso - College of Business Administration - Department of Economics and Finance, Srinivasan Krishnamurthy, NC State University, SUNY at Binghamton - School of Management, and Rajesh P. Narayanan, Louisiana State University have some thoughts on Efficiency and Market Power Gains in Megabank Mergers.

ABSTRACT: This paper uses Value Line forecasts to estimate and trace merger-related gains to their ultimate sources in efficiency improvements and enhanced market power. Sidestepping methodological issues that have hampered past attempts at assessing bank mergers, this approach yields estimates that indicate that, on average, megabank mergers generate gains that derive from improvements in cost efficiencies. With the removal of restrictions on bank expansion, these gains decline and previously observed efficiency gains give way to gains that arise at the expense of customers as the extent of geographic overlap between the merging banks increases. In mergers where the resulting size of the combination allows merging banks potential access to previously unavailable regulatory subsidies, gains are substantial and arise from projected revenue increases that are linked to increased risk taking.

April 26, 2010 | Permalink | Comments (0) | TrackBack (0)

The Plausibility of Twombly: Proving Horizontal Agreements after Twombly

Posted by D. Daniel Sokol

Alvin K. Klevorick, Yale University - Law School and Issa Kohler-Hausmann, New York University explain The Plausibility of Twombly: Proving Horizontal Agreements after Twombly.

ABSTRACT: We address a longstanding issue in antitrust doctrine: what must a plaintiff adduce at various procedural stages to show an agreement under Sherman Act § 1? Our major goal is to offer the most compelling interpretation and formalization of the Supreme Court’s statements on evidentiary and procedural standards for showing a § 1 violation, while recognizing that, as some have claimed, the Court has addressed the required proof of pleading without a well-articulated definition of agreement or, more narrowly, conspiracy. Our contribution attempts to formalize what we believe is the most reasonable interpretation of the Court’s reigning rules, which now include the one in Twombly. For now we do not offer a normative prescription for those rules based on foundational policy principles, such as promotion of efficiency or minimization of social costs, which include the costs of errors in the decision process. This is not to say that those policy concerns do not motivate the doctrine we are addressing, but rather that here we set ourselves the more limited task of presenting the most plausible and reasonable interpretation of the doctrine and of formalizing it in a way that aids understanding.

The primary target of our interpretive task is the Supreme Court’s 2007 decision in Twombly, which is the most recent installment in a line of cases that has addressed the requirements for establishing the existence of an agreement as a matter of substantive antitrust law and civil procedure. We begin by locating Twombly in the context of Sherman Act § 1 jurisprudence that courts have adopted to try to minimize overall expected error costs in deciding conspiracy cases. Then we describe a leading - perhaps the leading - approach that courts have followed in delineating the types of facts a plaintiff relying on circumstantial evidence must adduce to prove her § 1 conspiracy claim, the doctrine commonly known as “plus factors.” This discussion is complemented by an account of how Monsanto and Matsushita together set the standard for how much of that type of evidence such a plaintiff needs to offer to withstand a motion for summary judgment. In these discussions we introduce a conceptual distinction between the type and quantum of proof and argue that it is key to understanding and formalizing the rules announced in the line of cases addressing evidentiary sufficiency of § 1 claims. Consequently, this duality helps us distinguish the issues in Monsanto and Matsushita from the motion to dismiss question addressed in Twombly.

We bring these two elements of types of evidence and quantum of evidence together in a description of the way that the circuit courts integrated the consideration of plus factors and the holding in Matsushita into a test the plaintiff must pass at the summary judgment stage. We introduce the notation of Bayesian probability to formalize the tests that have emerged across the circuit courts for § 1 plaintiffs to withstand summary judgment.

The case of principal concern, Twombly, then takes center stage as we describe the difference between the District Court’s and the Second Circuit’s analyses of the case, and then take up the argument between the parties at the Supreme Court. Finally, we offer our reading of the Supreme Court’s Twombly decision and propose a formalization of our interpretation, again using the notation of Bayesian probability. The latter helps to distinguish our proposed reading of the case’s holding from the settled doctrine around summary judgment. We briefly conclude with remarks on the future of Twombly’s “plausibility” standard and on the task of proving horizontal agreements going forward.
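For readers unfamiliar with the Bayesian machinery the abstract invokes, the basic update it builds on can be sketched as follows. The numbers, the specific plus factors, and the conditional-independence assumption below are ours, purely for illustration; they are not the authors' formalization:

```python
# Illustrative Bayesian update: treat each "plus factor" as evidence E_i
# with a likelihood ratio
#   LR_i = P(E_i | conspiracy) / P(E_i | independent action),
# and update the odds that a horizontal agreement exists.
# Conditional independence of the factors is assumed only for simplicity.

def posterior(prior, likelihood_ratios):
    """P(conspiracy | evidence) from a prior and per-factor likelihood ratios."""
    odds = prior / (1 - prior)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

prior = 0.10                  # assessment before any plus factors
factors = [3.0, 2.0, 4.0]     # e.g. motive, conduct against self-interest, opportunity
print(round(posterior(prior, factors), 3))   # → 0.727
```

The quantum-of-proof question the authors formalize can then be read as: how high must this posterior be, on the pleadings alone, to count as "plausible" under Twombly, versus at summary judgment under Matsushita?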



April 26, 2010 | Permalink | Comments (0) | TrackBack (0)