Legal Theory Blog

All the theory that fits!


This is Lawrence Solum's legal theory weblog. Legal Theory Blog comments and reports on recent scholarship in jurisprudence, law and philosophy, law and economic theory, and theoretical work in substantive areas, such as constitutional law, cyberlaw, procedure, criminal law, intellectual property, torts, contracts, etc.

Thursday, June 30, 2005
Call for Papers: Respect
    Res Publica: A Journal of Legal and Social Philosophy invites submissions for a special issue (Volume 12/1): Respect
      'Respect' is a ubiquitous, multi-faceted and frequently under-theorised concept in ethical, social, political and legal philosophy. While it is generally regarded as a 'good thing', exactly what kind of thing it is remains in many ways contested or opaque. What does it mean to respect another person? How does this relate to 'toleration', or 'recognition', or to dignity, merit or social status? Can, or should, respect for individuals translate into respect for groups, or ways of life? Where, or with whom, does the call to respect another become inappropriate? Is respect of any relevance in addressing social inequalities, or as a component part of social justice? How does it relate to questions of power? Does pursuing it risk the aggressive imposition of partial norms across diverse social contexts? How does respect for others relate to disrespect? Or to self-respect? Does the term merit its typically central place in professional codes of ethics? Can we respect the dead, or those yet to be born? Can the term coherently be applied to non-human entities (animals, the natural environment, property)?
    For this special issue, to appear in early 2006, we aim to attract a range of treatments of these, and related, questions. We especially encourage the submission of papers relating issues of 'respect' to:
      - Human rights
      - Equality
      - Anti-discrimination
      - Status recognition
      - Social justice
      - Professional ethics
    ... but we welcome submissions addressing any of the questions raised above, whether individually or in combination. All articles will be submitted to our standard process of double-blind review.
    Deadline for submissions: 31 July, 2005
    Maximum paper length: 8000 words
    For further information, please contact the co-editors of this issue:
      Dr David Middleton, Open University: or Dr Gideon Calder, University of Wales, Newport:

Pettit on the Many as One Philip N. Pettit (Princeton University - Department of Politics) has posted On the Many as One (Philosophy and Public Affairs, September 2005) on SSRN. Here is the abstract:
    In a recent paper on "The Many as One," Lewis A. Kornhauser and Lawrence G. Sager look at an important issue in political theory: How far should groups in public life try to speak with one voice, and act with one mind? How far should public groups - in particular, politically authoritative groups like the judiciary and the legislature - try to display what Ronald Dworkin calls integrity? While we agree with many of the points they make about this issue, we do not think that they do justice to the challenge they identify. Our comments fall into three sections. We address, first, the nature of the integrity challenge; second, the range of cases in which the challenge arises; and third, the question of whether public groups should try to satisfy it. While starting from our differences with Kornhauser and Sager, the main aim of the paper is to advance the discussion of these topics, generating a more general perspective than they provide.

Wednesday, June 29, 2005
Read Ernest Miller on Grokster If you are interested in Grokster, you really want to read this post by Miller. Like much of Miller's work, his analysis of Grokster is very fine indeed. Very highly recommended!

Grokster and the Future of P2P What are the implications of the Grokster decision for the future of P2P filesharing? Superficially, the fact that MGM prevailed in the Supreme Court might seem like a negative for P2P, but I believe that quite the opposite is true. Why?
  • The first question that needs to be answered when evaluating the impact of Grokster is: What is the baseline? The thought that Grokster was a victory for content providers is a result of the assumption that the Ninth Circuit's decision provided the legal baseline, and that baseline essentially immunized P2P providers from legal liability. But that assumption was never correct. The Ninth Circuit decision was just one decision in one case, and the decision itself was largely a product of the poor litigation strategy of the plaintiffs--who were trying to get the Ninth Circuit to eviscerate the Sony (substantial noninfringing use) decision. The Ninth Circuit did buy the plaintiffs' argument that the choices were eviscerate Sony or let P2P run amok, but decided to let P2P run amok. In my opinion, that result was unstable and never could have been the long-term status quo. The real baseline was a state of uncertainty--with the Ninth Circuit Grokster decision, the Seventh Circuit's Aimster decision, and many other possibilities unresolved.
  • The Supreme Court's decision in Grokster makes it clear that P2P companies are subject to liability if there is direct proof of intentional inducement of copyright violations. Good legal engineering of P2P communications and business models can easily ensure that such proof will be unavailable. Therefore, the intentional inducement cause of action is likely to drop out of the picture from an ex ante perspective. Of course, ex post some of the existing P2P services may have residual legal problems, but that will only shift users from some P2P services to others.
  • Among the P2P services that are likely to flourish in a post-Grokster legal environment is BitTorrent. Of course, shifting users to BitTorrent is no victory for content providers--it is actually a defeat. It is not in the interest of content providers to encourage users to shift to more capable P2P engines. (For more on this, see Ernest Miller's post with many links.)
  • Of course, there is a certain air of unreality about all of this discussion, because, as we all know, the cat is out of the bag, the barn door is open, and the genie is out of the bottle!!! No change in the legal environment can possibly contain P2P, and even if it could, the fundamental architecture of the Internet is such that massively distributed file sharing cannot be subject to effective legal control. If not "P2P," then something else--piggybacking on email, IM, or something completely different.
The Grokster decision may have been a minor tactical victory for content providers, but it is a stupendous strategic loss.

Garnett on Transportation and the Urban Poor Nicole Stelle Garnett (Notre Dame Law School) has posted The Road from Welfare to Work: Informal Transportation and the Urban Poor (Harvard Journal on Legislation, Vol. 38, No. 73, 2001) on SSRN. Here is the abstract:
    Individuals struggling to move from welfare to work face numerous obstacles. This Article addresses one of those obstacles: lack of transportation. Without reliable transportation, many welfare recipients are unable to find and maintain jobs located out of the reach of traditional forms of public transportation. Professor Garnett argues that lawmakers should remove restrictions on informal van or jitney services, allowing entrepreneurs to provide low-cost transportation to their communities. This reform would not only help people get to work, but it could also provide jobs for low-income people.
I always enjoy Garnett's work.

Ellman on Unanimity in Brown v. Board Stephen Ellmann (New York Law School) has posted The Rule of Law and the Achievement of Unanimity in Brown (New York Law School Law Review Vol. 49, pp. 741-784, 2004-2005) on SSRN. Here is the abstract:
    How did Justice Stanley Reed come to join the Supreme Court's unanimous decision in Brown v. Board of Education? It is clear from the historical record that Reed's first inclination was to uphold the constitutionality of racially segregated education, and clear as well that in the end he put this inclination aside and joined, without any public qualification, in the Court's decision banning segregation. Perhaps Reed changed his mind about the meaning of the constitution; perhaps he changed his mind about the legitimacy of judges' making social policy in the name of the constitution; perhaps he decided to uphold the Supreme Court's strength as an institution by helping make this momentous decision unanimous; perhaps (though this I particularly doubt) he explicitly or implicitly traded his vote in Brown I for anticipated concessions on the remedy issue that the Court would address in Brown II. The fascinating historical record is ultimately elusive, and exactly what happened will likely never be completely certain, but each of these possibilities raises important questions about the meaning of the "rule of law." I argue here, inter alia, that if Reed's thinking was swayed by the gentle personal touch of Earl Warren and other justices, that emotional impetus was no breach of the rule of law; that if he voted against his own view of the law for the sake of unanimity, this too was within the historical, and legitimate, bounds of Supreme Court justices' decisionmaking discretion; and that if, in voting as he did, he found himself having to disregard some deeply-held beliefs, such as his opposition to judicial policymaking, for the sake of others, this need to act in light of, or in the face of, crosscutting moral demands is ultimately a central part of the rule of law. 
It is possible to imagine judges obliged to breach the rule of law - in Nazi Germany, for example, or in ante-bellum fugitive slave cases in the United States – but I do not see Justice Reed as having faced such a situation. Instead, the rule of law, rightly understood as the complex and supple social structure that it is, provided room for the choice that he made.
Fascinating topic.

Stark on Globalization, Women, and the Law Barbara Stark (Hofstra) has posted Women, Globalization, and Law: A Change of World on SSRN. Here is the abstract:
    Adrienne Rich describes a radical global change in the deliberately inconsequential - and gendered - terms of fashion. In the poem, however, fashion transforms mountains and oceans more venerable than patriarchy itself. Historically inconsequential women, similarly, are shaping globalization even as globalization transforms their lives. This change of world is profound and deeply contested. This Article first provides an overview and then analyzes this change of world in three specific contexts. It is not intended to be comprehensive; rather, I simply hope to suggest a few of the ways in which globalization affects the world's women and how they in turn affect globalization. I am particularly interested in the ways in which human rights law legitimates and furthers women's multiple, often conflicting, agendas and how feminist theories can be used to interrogate them and expose their complexity.

Smith on Autonomy, Equality, and Voting Rights Terry Smith (Fordham University School of Law) has posted Autonomy Versus Equality: Voting Rights Rediscovered on SSRN. Here is the abstract:
    Using empirical and other evidence, this paper examines the success of Shaw v. Reno and the Supreme Court's wrongful districting cases in reducing the role of race in politics. Observing that the Court has shown an inability to distinguish between race as such and politics, the author argues that rather than reduce the role of race, more than a decade of the wrongful districting cases has simply reduced minority political autonomy under the stalking horse of color-blindness. The author argues that this autonomy ought not be so easily disposable because it is rooted in the Constitution and in the exercise of the franchise.

Tuesday, June 28, 2005
Pearce on Inequality in the Market for Justice Russell G. Pearce (Fordham University School of Law) has posted Redressing Inequality in the Market for Justice: Why Access to Lawyers Will Never Solve the Problem and Why Rethinking the Role of Judges Will Help (Fordham Law Review, Vol. 73, p. 969, 2004) on SSRN. Here is the abstract:
    Commentators have argued that the solution to addressing unequal justice under law lies in increased government funding for legal services for the poor and increased pro bono hours from private lawyers. While these proposals could result in providing more lawyers for more low income people, they fail to account for the pervasive inequality resulting from the distribution of legal services primarily through the market. Given the influence of market distribution of legal services on legal outcomes, government-funded legal services and pro bono assistance can provide a valuable form of charity, but not an effective means of equalizing justice. The Essay instead suggests that a more effective way to enhance equal justice under law would be to rethink the proper role of the judge and make the judge an "active umpire" responsible for the quality of justice.

Brodin on "Fact Verdicts" Mark S. Brodin (Boston College - Law School) has posted Accuracy, Efficiency, and Accountability in the Litigation Process - The Case for the Fact Verdict (University of Cincinnati Law Review, Vol. 59, pp. 15-111, 1990) on SSRN. Here is the abstract:
    Although the jury trial is regarded as a lynchpin of the American concept of justice, ambivalence about the institution persists, particularly in the context of civil litigation. Some question whether the civil jury is an inefficient anachronism. This article argues that many of the concerns raised about civil juries in general are really concerns about the routine use of the general verdict, an institution that merges the jury's fact finding function and its role as an applier of law. The article argues that in many instances, replacing a general verdict with a special verdict would allow the jury to play to its strength as reporter of fact. At the same time, it would free the jury from the burden of interpreting and applying elaborate instructions of complex legal doctrine. Despite criticism that the special verdict weakens the constitutional powers of the jury, the article proposes the use of the special verdict in a manner that presents the jury with questions of actual fact while leaving the task of applying law to the judge. The special verdict, if used correctly, enhances the reliability and efficiency of the litigation process.

Leib on Choices About Choice Ethan Leib (Hastings) has posted Responsibility and Social/Political Choices about Choice; Or, One Way To Be a True Non-Voluntarist on SSRN. Here is the abstract:
    Linking choice with responsibility is a seduction our voluntarist society often cannot resist. We generally wish to hold people responsible in our tort and criminal law for their free choices—and conceive of responsibility as intimately bound up with personal choice. Samuel Scheffler may have diagnosed why many redistributive forms of liberalism often fail to command support in the public sphere: because they regularly deny what seems to be a basic moral intuition of our society—that people should be held responsible for their free choices. To be sure, the contours of what counts as a free choice and what counts as a product of duress, genetics, or upbringing sufficient to vitiate or mitigate responsibility is always a matter of vigorous ongoing contestation. Still, there remains a strong intuition in our society’s collective moral psychology that responsibility is somehow deeply connected to free choices. Indeed, we might not be able to make sense of ourselves as selves without feeling justified in claiming responsibility first and foremost for what we perceive to be our own free choices. The potential that the “Causal Thesis” may be true—that some weak form of determinism obtains —does not deter us: to reinforce our aspiration for free will, we tend to design our punitive policies and moral practices of praise and blame consistent with it, in spite of our failure to have a clear faith that our institutions contribute to members’ true freedom. We do this, some would argue, to retain the basic connection of responsibility to choice; the business of apportioning responsibility somehow seems manageable, justifiable, and legitimate if it is tied to choice. Accordingly, even the determinists among us are compatibilists. Here, I make an effort to think hard about the purported connection between responsibility and choice—and try to avoid the seduction of voluntarism. 
I build from the work of Meir Dan-Cohen, who has done the most to develop a theory of responsibility unmoored from choice. In the process, I touch upon love and creativity, two areas of social life that provide a window into a different conception of responsibility that can be used to guide our practices of praise and blame in morality, the criminal law, and torts.

Pettit on Contractual Morality Philip N. Pettit (Princeton University - Department of Politics) has posted Can Contract Theory Ground Morality? (MORAL THEORIES, J. Dreier, ed., Blackwell, 2005) on SSRN. Here is the abstract:
    The paper is in three sections. In the first I offer a characterization of contractualism, explaining along the way that under this representation it is proof against two more or less obvious consequentialist objections. In the second section I argue that even when characterized in this manner, however, there remains an attractive and plausible way of taking contractualism that would make it consistent with consequentialism; this would cast it as a theory of the relatively right - the right relative to a practice - rather than the absolutely right. And then in the third section I show that even if this relativised way of taking it is rejected, as Scanlon himself would certainly reject it, there is a second way in which contractualism can in principle be rendered consistent with consequentialism; it may be cast as a partial rather than a complete theory of the absolutely right. Under neither of these ways of taking the doctrine would contractualism ground morality - not at least in every relevant sense - but under each it would retain a significant place in moral theory.

Monday, June 27, 2005
The Grokster Remand & Mandate The final paragraphs of Grokster reward a second & careful look. Here's what Souter writes:
    MGM's evidence in this case most obviously addresses a different basis of liability for distributing a product open to alternative uses. Here, evidence of the distributors' words and deeds going beyond distribution as such shows a purpose to cause and profit from third-party acts of copyright infringement. If liability for inducing infringement is ultimately found, it will not be on the basis of presuming or imputing fault, but from inferring a patently illegal objective from statements and actions showing what that objective was.
    There is substantial evidence in MGM's favor on all elements of inducement, and summary judgment in favor of Grokster and StreamCast was error. On remand, reconsideration of MGM's motion for summary judgment will be in order.
    The judgment of the Court of Appeals is vacated, and the case is remanded for further proceedings consistent with this opinion.
I see two things of interest here:
  • The first bolded passage--which nixes "imputed" fault and impliedly requires "statements and actions showing" intent to induce--reinforces the very narrow door the Supreme Court has opened for the music industry.
  • The second bolded passage--which suggests that MGM's motion for summary judgment might prevail--emphasizes the Court's belief that this record did have very strong evidence of intentional inducement.
Thanks to Glenn Edward for emphasizing the latter point in email!

Geidner on Volokh on Divisiveness and the Ten Commandments Cases Check out Chris Geidner's post on Law Dork. Here's a taste:
    Volokh wrote:
      What has caused more religious divisiveness in the last 35 years -- (1) government displays or presentations of the Ten Commandments, creches, graduation prayers, and the like, or (2) the Supreme Court's decisions striking down such actions? My sense is that it's the latter, and by a lot . . . .
    Volokh is missing a step that occurs in between the displays and the rulings, and I think it's a relevant step. What is missing? It's the lawsuit. Volokh's "divisiveness" claim appears to be a more anti-"judicial activism" incarnation (pre-vacancy announcement?) of the good ol' "if it weren't for that damn ACLU (or Lambda or NAACP) stirring up problems where there aren't any, we'd all be happy" argument.

More on Footnote 12 For more on Footnote 12 of the Grokster decision, surf on over to Edward Lee's Lee Blog. Here is a taste:
    1. For the first time, the Supreme Court has expressly recognized that the Sony ruling is not just a doctrine, it is a "safe harbor"--it is meant to allow technology developers a way to avoid liability by following certain guidelines.
    2. The Court states that, if there is no other evidence of intent to induce infringement, the design of a product itself is not evidence of inducement. The Sony safe harbor protects product designs that are capable of substantial noninfringing uses in such cases. For example, the design of the iPod won't support an active inducement claim.
    3. The Court recognizes that technology developers don't have to adopt filtering or other techniques to stop copyright infringement sought by copyright holders. Such "affirmative steps" are not required (i) in the absence of other evidence of intentional inducement and (ii) if the device is otherwise capable of substantial noninfringing uses. Again, even though Apple could have designed the iPod to be incompatible with infringing files (or non-iTunes files), Apple's under no legal duty to do so.
Read the whole post!
Update: And for more on Footnote 12, check out Randy Picker on Picker's MobBlog with a really terrific post.

Footnote 12 in Grokster Because Grokster was decided on an inducement theory, the crucial question--from a practical point of view--is what constitutes sufficient evidence of inducement. In particular, is evidence of "intent" required? If so, then "legal engineering" (see post below) can circumvent liability. If not, then things would get much more interesting. So consider this passage from Justice Souter's opinion:
    While the Ninth Circuit treated the defendants' failure to develop such tools as irrelevant because they lacked an independent duty to monitor their users' activity, we think this evidence underscores Grokster's and StreamCast's intentional facilitation of their users' infringement. 12
And what is in Footnote 12?
    12 Of course, in the absence of other evidence of intent, a court would be unable to find contributory infringement liability merely based on a failure to take affirmative steps to prevent infringement, if the device otherwise was capable of substantial noninfringing uses. Such a holding would tread too close to the Sony safe harbor.
Once again, the "beef" is in the footnote!

The Grokster Concurrences Six justices joined concurring opinions in Grokster. Ginsburg was joined by Rehnquist and Kennedy. Breyer was joined by Stevens and O'Connor. Ginsburg and Breyer disagree about the meaning of the Sony "substantial noninfringing use" test, and that disagreement is potentially important to the future of P2P litigation, and hence to the future of copyright. What is the significance of these opinions? Here is a key passage from near the conclusion of Justice Ginsburg's concurrence:
    In sum, when the record in this case was developed, there was evidence that Grokster's and StreamCast's products were, and had been for some time, overwhelmingly used to infringe, ante, at 4-6; App. 434-439, 476-481, and that this infringement was the overwhelming source of revenue from the products, ante, at 8-9; 259 F. Supp. 2d, at 1043-1044. Fairly appraised, the evidence was insufficient to demonstrate, beyond genuine debate, a reasonable prospect that substantial or commercially significant noninfringing uses were likely to develop over time. On this record, the District Court should not have ruled dispositively on the contributory infringement charge by granting summary judgment to Grokster and StreamCast.
And here is a passage from Justice Breyer's opinion:
    When measured against Sony's underlying evidence and analysis, the evidence now before us shows that Grokster passes Sony's test--that is, whether the company's product is capable of substantial or commercially significant noninfringing uses.
    As in Sony, witnesses here explained the nature of the noninfringing files on Grokster's network without detailed quantification. Those files include:
      --Authorized copies of music by artists such as Wilco, Janis Ian, Pearl Jam, Dave Matthews, John Mayer, and others. . . .
      --Free electronic books and other works from various online publishers, including Project Gutenberg. . . .
      --Public domain and authorized software, such as WinZip 8.1. . . .
      --Licensed music videos and television and movie segments distributed via digital video packaging with the permission of the copyright holder. . . .
And Justice Breyer adds:
    To be sure, in quantitative terms these uses account for only a small percentage of the total number of uses of Grokster's product. But the same was true in Sony, which characterized the relatively limited authorized copying market as "substantial."
In other words, Ginsburg plus two disagrees with Breyer plus two about the meaning of the Sony "substantial noninfringing use" standard. Ginsburg seems to endorse the idea that the standard is not met when the overwhelming majority of uses are infringing; Breyer (joined by Stevens, the author of Sony) disagrees with that proposition. The concurrences may indicate where the lines are drawn for the next wave of P2P litigation!

Grokster: A "Legal Engineering" Failure Over at MobBlog, Doug Lichtman has a post bemoaning the legal standard for inducement adopted by the Court:
    MGM won on paper today, but my first reading of the opinion makes me wonder whether the victory will have any bite outside of this specific litigation. Intent-based standards, after all, are among the easiest to avoid. Just keep your message clear -- tell everyone that your technology is designed to facilitate only authorized exchange -- and you have no risk of accountability.
I disagree with Doug about the normative question--but I agree with his reading of the Opinion of the Court, which brings me to "legal engineering." Each of the important P2P filesharing cases has involved a failure of "legal engineering"--the legal design of the P2P business. In the Napster case, the failures were the most egregious--with "smoking gun" memos indicating that the purpose of Napster was to facilitate copyright infringement. In Grokster, the failures were almost as bad. Here is an excerpt from the Opinion of the Court:
    It is undisputed that StreamCast beamed onto the computer screens of users of Napster-compatible programs ads urging the adoption of its OpenNap program, which was designed, as its name implied, to invite the custom of patrons of Napster, then under attack in the courts for facilitating massive infringement. Those who accepted StreamCast's OpenNap program were offered software to perform the same services, which a factfinder could conclude would readily have been understood in the Napster market as the ability to download copyrighted music files. Grokster distributed an electronic newsletter containing links to articles promoting its software's ability to access popular copyrighted music. And anyone whose Napster or free file-sharing searches turned up a link to Grokster would have understood Grokster to be offering the same filesharing ability as Napster, and to the same people who probably used Napster for infringing downloads; that would also have been the understanding of anyone offered Grokster's suggestively named Swaptor software, its version of OpenNap. And both companies communicated a clear message by responding affirmatively to requests for help in locating and playing copyrighted materials.
If there had been good "legal engineering," then Napster or StreamCast or Grokster might well have prevailed in court. Of course, product promotion is a business decision, and it is possible that a deliberate choice was made--pay the price of increased likelihood of legal liability in order to market more effectively by emphasizing copyright infringement. But one wonders whether effective marketing strategies that were more subtle might have been available--if anyone had bothered to try.

The Treatment of Sony in Grokster One of the most important issues in the Grokster case is the fate of Sony, the prior Supreme Court case in which the Court held that the Betamax (VCR) would not serve as the basis for a contributory infringement action against Sony, because it was capable of "substantial noninfringing uses." Here is what today's unanimous opinion says about Sony:
    In sum, where an article is "good for nothing else" but infringement, Canda v. Michigan Malleable Iron Co., supra, at 489, there is no legitimate public interest in its unlicensed availability, and there is no injustice in presuming or imputing an intent to infringe, see Henry v. A. B. Dick Co., 224 U. S. 1, 48 (1912), overruled on other grounds, Motion Picture Patents Co. v. Universal Film Mfg. Co., 243 U. S. 502 (1917). Conversely, the doctrine absolves the equivocal conduct of selling an item with substantial lawful as well as unlawful uses, and limits liability to instances of more acute fault than the mere understanding that some of one's products will be misused. It leaves breathing room for innovation and a vigorous commerce. See Sony Corp. v. Universal City Studios, supra, at 442; Dawson Chemical Co. v. Rohm & Haas Co., 448 U. S. 176, 221 (1980); Henry v. A. B. Dick Co., supra, at 48.
This is a restatement of Sony that is actually quite favorable to the Sony rule--because the Court seems to say that the limit of Sony is where the good in question is "good for nothing else" but infringing uses. The Court continues:
    The parties and many of the amici in this case think the key to resolving it is the Sony rule and, in particular, what it means for a product to be "capable of commercially significant noninfringing uses." Sony Corp. v. Universal City Studios, supra, at 442. MGM advances the argument that granting summary judgment to Grokster and StreamCast as to their current activities gave too much weight to the value of innovative technology, and too little to the copyrights infringed by users of their software, given that 90% of works available on one of the networks was shown to be copyrighted. Assuming the remaining 10% to be its noninfringing use, MGM says this should not qualify as "substantial," and the Court should quantify Sony to the extent of holding that a product used "principally" for infringement does not qualify. See Brief for Motion Picture Studio and Recording Company Petitioners 31. As mentioned before, Grokster and StreamCast reply by citing evidence that their software can be used to reproduce public domain works, and they point to copyright holders who actually encourage copying. Even if infringement is the principal practice with their software today, they argue, the noninfringing uses are significant and will grow.
But the Court sees the issue quite differently from the way it was framed by the parties:
    [The Ninth Circuit's] view of Sony, however, was error, converting the case from one about liability resting on imputed intent to one about liability on any theory. Because Sony did not displace other theories of secondary liability, and because we find below that it was error to grant summary judgment to the companies on MGM's inducement claim, we do not revisit Sony further, as MGM requests, to add a more quantified description of the point of balance between protection and commerce when liability rests solely on distribution with knowledge that unlawful use will occur. It is enough to note that the Ninth Circuit's judgment rested on an erroneous understanding of Sony and to leave further consideration of the Sony rule for a day when that may be required.
These may be the most important passages in Grokster--the dog that did not bark--because they leave Sony as it was.

Holding in Grokster Here is the statement from Justice Souter's Opinion for a unanimous Court:
    The question is under what circumstances the distributor of a product capable of both lawful and unlawful use is liable for acts of copyright infringement by third parties using the product. We hold that one who distributes a device with the object of promoting its use to infringe copyright, as shown by clear expression or other affirmative steps taken to foster infringement, is liable for the resulting acts of infringement by third parties.

AP Story on Grokster The AP Story is out:
    Internet file-sharing services will be held responsible if they intend for their customers to use software primarily to swap songs and movies illegally, the Supreme Court ruled Monday, rejecting warnings that the lawsuits will stunt growth of cool tech gadgets such as the next iPod. The unanimous decision sends the case back to lower court, which had ruled in favor of file-sharing services Grokster Ltd. and StreamCast Networks Inc. on the grounds that the companies couldn't be sued. The justices said there was enough evidence of unlawful intent for the case to go to trial.
The full story can be found here on MSNBC.

No Announcement on Supreme Court Retirements With the rumors flying about a possible Rehnquist (and/or O'Connor) retirement, the Court has adjourned without an announcement.

Grokster Announced The result in Grokster has been announced. The vague radio report suggested a loss for Grokster, and Scotus Blog just reports a loss as well:
    The Supreme Court ruled unanimously that developers of software violate federal copyright law when they provide computer users with the means to share music and movie files downloaded from the internet.
From the courtroom, we hear reports that it is a 9-0 decision, with "inducement" as the core of the Court's rationale. The AP wire report suggests the key is an "intent" to induce unlawful copying.

Grokster Today The Grokster decision should be announced momentarily. I will be participating in a group effort on Picker MobBlog, others include Doug Lichtman, Jessica Litman, Jim Speta, Julie Cohen, Lior Strahilevitz, Phil Weiser, Randy Picker, Ray Ku, Stuart Benjamin, Tim Wu, Tom Hazlett, and Wendy Gordon. Scotus Blog will have continuing coverage. On the Docket has a collection of links, the question presented, etc. Ernest Miller has a round-up of pre-decision commentary on Corante. I'll have comments in full here, with short posts over at MobBlog.

McGowan on Speech, Approximately David McGowan (University of San Diego School of Law) has posted Approximately Speech (Minnesota Law Review, Forthcoming) on SSRN. Here is the abstract:
    This essay comments on papers presented at the University of Minnesota by Lillian BeVier and Frederick Schauer. The essay argues that the proper subject of free speech analysis is the thing or things that make speech different from other, more freely regulated activities. From the well-accepted premise that it takes more than the presence of expression (which is present in almost all cases) to make a case a free speech case, the essay argues that conventional forms of free speech analysis, including content neutrality and scrutiny of regulatory motives, are imperfect proxies used to identify combinations of expressive costs and benefits. A case becomes a free speech case when the ratio of expressive gains to costs is high enough to justify courts in believing that expressive activity should be protected as speech. Neither content neutrality nor motive analysis can explain what should count as a cost or benefit, however, nor determine when they should be applied. These tools therefore must be treated as tools - proxies - for normative analysis, rather than as the end of analysis. To exemplify these claims, the essay defends the result in Bartnicki v. Vopper and the claim that non-traditional speakers, such as bloggers, should receive no less free speech protection than traditional media businesses.
I always learn from McGowan's work. Highly recommended!

Goldfarb on Ethics, Feminism, and Clinical Education Phyllis Goldfarb (Boston College - Law School) has posted A Theory-Practice Spiral: The Ethics of Feminism and Clinical Education (Minnesota Law Review, Vol. 75, 1599-1699, 1990) on SSRN. Here is the abstract:
    Should law school classes cultivate professional skills or should they advance a broad intellectual agenda? This Article examines the relationship between theory and practice from the standpoint of two movements within law’s academy: clinical education and feminist jurisprudence, the fundamental methodological similarity of these two movements, and the problematic nature of the theory-practice label. This Article also examines the ethical impulse that sparks clinical education and feminism, suggesting that each movement’s perceptions of the theory-practice relationship are embedded in ethical concerns and have far-reaching ethical implications. Section I of this Article begins with the reading of a text from the perspective of each of these intellectual movements. Section II examines more specifically the respective ways that feminist and clinical educators describe and justify their choice of methods. Section III surveys the similarities and differences between the methodologies of the two movements. This leads, in Section IV, to an analysis of the implications for clinical education if it drew more explicitly from feminist methods, the implications for feminist jurisprudence if it drew more explicitly from clinical methods, and the implications for legal education if it drew more explicitly from each. Section IV concludes with an exploration of how recasting our understanding of the theory-practice relationship also recasts our understanding of ethics and ethical inquiry.

Borgen on Treaty Conflicts Christopher Borgen (St. John's University - School of Law) has posted Resolving Treaty Conflicts (George Washington International Law Review, Vol. 37, 2005) on SSRN. Here is the abstract:
    If treaties do not hang together, then the international legal system will fall apart. But as treaties proliferate, they increasingly overlap and frustrate each other's goals. This article assesses whether there are effective rules and tools that policymakers can use to avert or resolve treaty conflicts in general, regardless of the specific policy-areas at issue. It finds that the Vienna Convention on the Law of Treaties (VCLT) neither addresses the types of conflicts that are relevant today nor is it successful in resolving certain types of conflicts it does address. In order to be a more coherent system, international law needs rules and procedures that assist States in avoiding or resolving treaty conflicts in a principled manner, without necessarily having to resort to international tribunals. This requires more than just a revision of the VCLT but rather a more rigorous approach to envisioning how treaties affect one another, a practice of drafting treaty clauses that takes this interplay into account, a purposive method of treaty interpretation that will better spot potential conflicts, and the acceptance and application of "default rules" in cases of conflict. The article has five main parts. Part I sets out a typology of treaty conflicts. Part II describes classic methods of conflict avoidance and resolution and how these norms have not been applied consistently and are of little use in the types of conflicts that are now common. Part III assesses the VCLT and shows, through recent examples from a variety of substantive areas, that the VCLT is not equipped for the types of problems States face today. Part IV reconsiders what one may learn from analogies to contractual and legislative conflicts. Part V draws from the previous sections to suggest options in addressing treaty conflicts and considers the implications of the preceding discussion on the status of international law as a coherent legal system.

Petit on Freedom in the Market Philip N. Pettit (Princeton University - Department of Politics) has posted Freedom in the Market (Philosophy, Politics and Economics, Forthcoming) on SSRN. Here is the abstract:
    This essay looks at how the market appears from the republican perspective of freedom as non-domination. It outlines this conception of freedom, identifying some relevant aspects of the approach and then goes on to look at three features of the market - property, exchange, and regulation - and at the ways in which they appear from within the republican perspective. The republican conception of freedom argues for important normative constraints on these arrangements without supporting rhetorical extremes to the effect that 'property is theft' or 'taxation is theft' or anything of that kind.

Sunday, June 26, 2005
Legal Theory Lexicon: Libertarian Theories of Law
    Introduction The dominant approaches to normative legal theory in the American legal academy converge on a fairly robust role for the state and government, subject to the constraints imposed by an equally robust set of individual rights. Normative legal theorists of all stripes--conservatives and liberals, welfarists and deontologists—tend to agree that the institution of law is fundamentally legitimate and that legal regulation has a large role to play. There is, however, a counter-tradition in legal theory that challenges the legitimacy of law and contends that the role of law should be narrowly confined. This entry in the Legal Theory Lexicon will examine libertarian theories of law. As always, the Lexicon is aimed at law students—especially first year law students—with an interest in legal theory.
    The libertarian tradition of social, political, and legal thought is rich and varied; no brief summary can do it justice. So the usual caveats apply. This is a brief introduction to libertarian thought with an emphasis on its role in normative legal theory. Debates about the true meaning of the term “libertarian” will largely be ignored, as will disputes over the advantages of “liberalism,” “classical liberalism,” and “libertarianism” as the best label for libertarian ideas. Enough with the caveats, here we go!
    Historical Roots of Contemporary Libertarianism One good way to approach contemporary libertarian legal theory is via its historical roots. A good place to begin is with John Locke’s conception of the social contract.
      John Locke and the Social Contract The idea of a “social contract,” by which individuals in a state of nature contract with each other (or with a sovereign) to enter a “civil society,” is one of the most important in all of political philosophy. Hobbes, Rousseau, and Locke all have distinctive theories of the social contract, but Locke’s version is important—both to libertarian theory and American constitutionalism. For the purposes of this discussion, the important idea is that a legitimate (or perhaps just) civil society has authority that is limited to those powers that the citizens-to-be would agree to delegate to the government in a social contract. Locke himself argued that the inconveniences of the state of nature would motivate a social contract that delegated to the government the power to protect property—understood in a broad sense that encompasses personal security and liberty—and the power to resolve disputes. But the Lockean social contract would not authorize government to restrict fundamental liberties or to take property from one citizen and transfer it to another. Of course, there is much more to say about Locke, but we are concerned here only with getting the gist of those Lockean ideas that are historically important to libertarian theory. Kant and Spheres of Autonomy Kant also made an important contribution to libertarian theory via his ideas of autonomy. There is no good way to summarize Kant’s theory of autonomy in a sentence or two, but the gist of his notion is that humans, as rational beings, have an interest in being autonomous in the sense of “self governing.” The role of law is to protect individual “spheres of autonomy” or “zones of liberty” in which individuals can act without interference from others. Suppose then, that our theory of proper legislation was that the laws should create maximum equal liberties for each, consistent with the same liberty for all.
These two Kantian ideas—autonomy and maximum equal liberty—have played an important role in libertarian thinking about law.
      John Stuart Mill and the Harm Principle John Stuart Mill was a liberal utilitarian, and so, in a sense, it is odd that he is also the author of one of the most important works in the libertarian tradition, On Liberty, a rich, complex, and easily misunderstood work. I am afraid I may be contributing to the misunderstanding by emphasizing just one idea from On Liberty--the so-called “harm principle.” Here is how Mill states the principle:
        . . . the sole end for which mankind are warranted, individually or collectively, in interfering with the liberty of action of any of their number, is self-protection. That the only purpose for which power can be rightfully exercised over any member of a civilised community, against his will, is to prevent harm to others. His own good, either physical or moral, is not sufficient warrant. He cannot rightfully be compelled to do or forbear because it will be better for him to do so, because it will make him happier, because, in the opinion of others, to do so would be wise, or even right...The only part of the conduct of anyone, for which he is amenable to society, is that which concerns others. In the part which merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.
      The harm principle is almost as controversial as it is famous. In particular, there is a persistent worry about the problem of the baseline against which “harm” as opposed to “lack of advantage” might be measured.
    Theoretical Foundations of Libertarianism This very brief introduction to the historical roots of libertarianism in Locke, Kant, and Mill prepares the way for a discussion of the theoretical roots of libertarian legal theory. Libertarianism operates at the level of political theory: it is a view about questions like “What is the proper role of government?” and “When is coercive legislation legitimate?” Theories at this level of abstraction need foundations of some sort, either deep foundations in comprehensive moral theories like utilitarianism or shallow foundations that explain why deeper foundations are unnecessary. Let’s take a look at both sorts of foundations for libertarian legal theories.
      Consequentialist Foundations The consequentialist case for libertarianism is contingent—it depends on empirical and theoretical questions about the effects that various legal regimes have. Consequentialist libertarians believe that minimum government interference with individual liberty and free markets produces better consequences than extensive government regulation or redistribution of income. Historically, both John Stuart Mill and Adam Smith are associated with both libertarianism and consequentialism.
      There are many different flavors of consequentialism, but in the legal academy, the most prominent strands of consequentialist thinking are associated with law and economics and assume a preference-satisfaction (or “welfarist”) notion of utility. Even among theorists who accept welfarism, there are major disagreements about how much and when government should regulate. But the general idea behind the consequentialist case for libertarianism is that markets are more efficient than regulation. This conclusion follows from fairly straightforward ideas in neoclassical microeconomics. Markets facilitate Pareto-efficient (welfare enhancing) transactions; regulations thwart such transactions.
      Markets may lead to substantial disparities in wealth and income, but from the consequentialist perspective, such inequalities may not justify legislation that redistributes wealth and income. First, for a strict utilitarian, the distribution of utility itself is of no moral significance: classical utilitarians believe that the sum of utilities should be maximized, even if that means that some will be very well off and others very poor. Of course, there is a well-known utilitarian argument for the redistribution of wealth and income based on the idea of diminishing marginal utility, but this argument might be outweighed by the massive utility losses caused by redistributive programs—providing a utilitarian argument against government-mandated redistribution of wealth and income. Second, even consequentialists who believe in some form of egalitarianism might believe that the worst off members of society will be better served by a libertarian regime than by a social-welfare state. We are already on a tangent, so I’m going to leave the topic of redistribution—noting that this is an issue upon which consequentialists themselves may differ in a variety of ways.
      In contemporary legal theory, Richard Epstein is the “libertarian” thinker who is most strongly associated with consequentialist foundations. Because he is a consequentialist, Epstein may not be a pure libertarian, but on a variety of issues (e.g. antidiscrimination laws), Epstein takes strongly libertarian positions.
      Deontological Foundations Although some libertarians are consequentialists, many others look to deontological moral theory for the foundations of their libertarianism. There are many different strategies for arguing for libertarianism based on deontological premises. One method starts with the idea of self-ownership or autonomy. Each of us has a moral right to control our own bodies, free of wrongful interference by others. This might imply that each individual has a right against theft, battery, false-imprisonment, enslavement, and so forth. Of course, these rights might justify a certain kind of government—one that protects us against invasions of our rights. But when government goes beyond the protection of these rights, then government itself operates through force or threats of force. For example, the redistribution of income might be accomplished by taxing income to finance a welfare system. Taxes are not voluntary; tax payments are “coerced” via threats of violence and imprisonment. Without consent, it might be argued, these threats are wrongful actions.
      To my mind, the deontological approach to the foundations of libertarian political theory is most strongly associated with the late Robert Nozick and his magnificent book, Anarchy, State, and Utopia (see reference below).
      Pluralist Foundations There is an obvious problem with locating the foundations of a political theory, like libertarianism, in a deeper moral theory, such as some form of deontology or consequentialism. In a pluralist society, it seems very unlikely that any one view about morality will ever become the dominant view. Instead, modern pluralist societies are usually characterized by persistent disagreements about deep moral questions. If a particular form of libertarianism rests on deep moral foundations, then most of us will reject that form of libertarianism, because we reject the foundations. One alternative would be to try to argue for libertarianism on the basis of all the different moral theories, but that is obviously a very time-consuming and difficult task. Another approach would be to articulate shallow foundations for libertarianism—foundations that are “modular” in the sense that they could be incorporated into many different comprehensive theories of morality. This general strategy was pioneered by the liberal political philosopher, John Rawls—himself, of course, no libertarian.
      One contemporary libertarian legal theorist who has pursued the pluralist strategy is Randy Barnett. In his book, The Structure of Liberty, Barnett argues that anyone who wishes to pursue their own interests—whatever those might be—has good reasons to affirm a generally libertarian framework for government. Barnett’s case for libertarianism is complex, but his basic idea is that human nature and circumstances are such that the law must establish and protect property rights and liberty of contract. The key to Barnett’s argument is his identification of what he calls the problems of knowledge, interest, and power. For example, the problems of knowledge include the fact that each individual has knowledge of his or her circumstances that is relevant to how resources can best be utilized. This fact, combined with others, makes decentralized control of resources through a private property regime superior to a centralized command and control system. For our purposes, it is not the details of Barnett’s argument, but his general strategy that is important: Barnett attempts to create a case for libertarianism that does not depend on either consequentialist or deontological moral theory.
    Libertarian Agendas for Legal Reform (or Revolution!) Even though this is “Legal Theory Blog,” we should say something about the practical agendas of various libertarian legal theories. Let’s begin with modest libertarianism and proceed to its most radical (anarchist) forms.
      Modest Libertarian Reforms: Deregulation, Privatization, and Legalization At the very least, libertarians favor less government—as measured against the baseline of the current legal order in the United States. So, libertarians are likely to be in favor of more reliance on markets and less reliance on government. Hence, libertarians are likely to support programs of deregulation and privatization. Deregulation might include measures like abolition of consumer product safety regulations and the elimination of rent control laws. Privatization might include the federal government selling off the national park system or the Tennessee Valley Authority.
      A libertarian reform agenda might also include the legalization of various forms of conduct that are currently prohibited. Examples of this kind of reform might include the legalization of recreational drugs, the end of prohibitions on various consensual sexual activities, and the elimination of restrictions on gambling and prostitution.
      Comprehensive Libertarian Reform: The Night-Watchman State A more ambitious libertarian agenda might be the establishment of what has been called the night-watchman state. The idea is that government would limit its role to the protection of individual liberty. Government would continue to provide police protection, national defense, and a court system for the vindication of private rights (property, tort, and contract rights, for example), but nothing else. In other words, the function of law would be limited to those activities that are necessary for the protection of private property and liberty.
      The difference between the advocacy of modest and comprehensive libertarian reform may be more a matter of tactics than of principle. One might believe that there is no realistic chance of a transition to a night-watchman state. Those who advocate such comprehensive reform may undermine their own political effectiveness by sounding “radical.” So as a matter of practical politics, it may be that libertarians are most effective when they advocate marginal reforms that move the system incrementally in libertarian directions.
      Libertarian Revolutions: Anarchy and Polycentric Constitutional Orders Some libertarians advocate an agenda that is even more radical than the night-watchman state. One might question whether there is a need for the nation state at all. One version of this more radical approach is pure anarchism—the view that no government is necessary because individuals can coexist and cooperate without any need for state action. Another variation of this idea is sometimes called a “polycentric constitutional order.” The idea is that individuals could subscribe to private firms that would provide the police and adjudication functions of the night watchman state. Such a society would have entities that functioned like governments in some ways—with the important exception that individuals would enter into voluntary agreements for their services.
    The Rivals of Libertarian Legal Theory Libertarian theory can be criticized in a variety of ways. Sometimes the disagreement is mostly empirical: libertarians believe that life without the state would be better, and anti-libertarians believe it would be worse. But sometimes the critics of libertarianism have a radically different vision of the fundamental purposes of government. One such rival is egalitarianism—the view that distributive justice requires that goods (let’s leave the definition of good at the abstract level) should be divided equally, and that the creation of social equality is the primary aim of government. Some libertarians might accept this goal, but argue that maximum liberty is the best way to achieve it. Other libertarians might argue that liberty is the good that should be equally divided. But many libertarians see equality as the wrong goal for government. That is, sometimes libertarians and egalitarians differ fundamentally over the purpose of government.
    Another rival to libertarianism is the view that legislation should aim at the promotion of virtue in the citizenry. If one believes that the aim of government is to make humans into better people, then one is likely to see a variety of restrictions on liberty as justified. (Let’s call views that see virtue as the end of government “aretaic political theories.”)
    Aretaic political theorists are likely to disagree with libertarians over what might be called “moral legislation.” For instance, one might believe that legal prohibitions on gambling, drugs, and prostitution are justified because they help promote a moral climate where most citizens don’t want to engage in these activities. Many libertarians would say it is simply not the business of government to decide that a taste for gambling is a bad thing; whereas many virtue theorists are likely to say that this is precisely the sort of work that governments should be doing.
    Conclusion Libertarian legal theory is interesting on the merits—as one of the most significant normative theories of law. But there is another important reason for legal theorists to be interested in libertarianism even if they ultimately reject it. Libertarian legal theories call into question the very purpose of law and government. A really careful evaluation of libertarianism requires that one form views about the function of law and the purposes of government, and confront a variety of criticisms of conventional views about those topics. For that reason, thinking about libertarian legal theory is an excellent way of thinking about the most fundamental questions in normative legal theory.
    Once again, this entry is a bit too long, but I hope that I’ve provided a good starting point for your investigations of libertarianism. I’ve provided a very brief set of references for further exploration.

Saturday, June 25, 2005
Legal Theory Bookworm And speaking of Philip Pettit, the Legal Theory Bookworm recommends Republicanism: A Theory of Freedom and Government by Philip Pettit. The short-lived "republican revival" in American constitutional theory was what spurred my interest in this very rewarding book. Here's a blurb:
    This is the first full-length presentation of a republican alternative to the liberal and communitarian theories that have dominated political philosophy in recent years. Professor Pettit's eloquent and compelling account opens with an examination of the traditional republican conception of freedom as non-domination, contrasting this with established negative and positive views of liberty. The first part of the book traces the rise and decline of this conception, displays its many attractions and makes a case for why it should still be regarded as a central political ideal. The second part of the book looks at what the implementation of the ideal would imply for substantive policy-making, constitutional and democratic design, regulatory control and the relation between state and civil society.
Highly recommended!

Download of the Week The Download of the Week is Rawls's Peoples (Rex Martin and David Reidy eds, ENVISIONING A NEW INTERNATIONAL ORDER: ESSAYS ON RAWL'S LAW OF PEOPLES, Blackwell, Oxford, 2005) by Philip Pettit. Here is the abstract:
    Social ontology does not drive political theory as axioms drive a theorem, but it can have an important shaping or constraining effect; this fits with Rawls's idea that our views on normative and related topics should be in 'wide reflective equilibrium'. This paper tries to document the shaping effect of Rawls's social ontology on his theory of international justice. It begins with a characterization of Rawls’s rejection of cosmopolitanism. It reviews the claims that he makes about peoples and tries to articulate the ontology of peoples that they support. And then in the final section it shows how that ontology helps to explain his position on cosmopolitanism.
Download it while it's hot!

Friday, June 24, 2005
Whittington on Pickerill Keith E. Whittington (Princeton University - Department of Politics) has posted James Madison has Left the Building: A Review of J. Mitchell Pickerill, Constitutional Deliberation in Congress (University of Chicago Law Review, Vol. 72, No. 3, Summer 2005) on SSRN. Here is the abstract:
    Empirical work on judicial and legislative politics sheds valuable light on the importance of judicial review and the ways in which constitutional limitations are most effectively maintained. Mitchell Pickerill's examination of constitutional deliberation in Congress in the latter half of the twentieth century helps us understand the limited policy impact of the Supreme Court's constitutional rulings, which in turn begins to explain the political sustainability of the power of judicial review. It also suggests the ways in which the judiciary and the legislature can complement one another in recognizing, debating, and implementing constitutional values and commitments, while cautioning us against overly optimistic conclusions about the possibilities of legislative constitutionalism.

Pettit on Rawls's Law of Peoples Philip N. Pettit (Princeton University - Department of Politics) has posted Rawls's Peoples (Rex Martin and David Reidy eds, ENVISIONING A NEW INTERNATIONAL ORDER: ESSAYS ON RAWL'S LAW OF PEOPLES, Blackwell, Oxford, 2005) on SSRN. Here is the abstract:
    Social ontology does not drive political theory as axioms drive a theorem, but it can have an important shaping or constraining effect; this fits with Rawls's idea that our views on normative and related topics should be in 'wide reflective equilibrium'. This paper tries to document the shaping effect of Rawls's social ontology on his theory of international justice. It begins with a characterization of Rawls’s rejection of cosmopolitanism. It reviews the claims that he makes about peoples and tries to articulate the ontology of peoples that they support. And then in the final section it shows how that ontology helps to explain his position on cosmopolitanism.
A must read for those interested in theories of international justice. Pettit is superb!

Thursday, June 23, 2005
Originalism in the Blogosphere Brian Leiter recently had the following to say about originalism:
    Why is it even remotely relevant what those words meant when the Constitution was adopted? The right has been pushing this non-sequitur for a couple of decades now, but they still have no answers to the simplest questions about the legal or moral relevance of the "original meaning" or "original intent" of Constitutional provisions. Those who produced the "original" meanings have no claim of democratically sanctioned authority over us; they have no claim of special moral expertise or insight; to make the meaning of Constitutional provisions turn on historical details invisible in the text itself undermines rule of law values like the need for public and intelligible legal standards; and so on.
And Michael Rappaport replies:
    Laws that must pass under a strict supermajority rule are apt to be better than laws passed by majority rule. While the specific effects of supermajority rules depend on the type of laws being passed, the circumstances, and the model of the legislative process that one employs, one can make certain generalizations. First, that supermajority rules require the approval of a greater percentage of the legislature operates to protect minority interests from being exploited. Second, the greater support required under supermajority rules also means that laws must in general produce significant public benefits in order to pass. (For other arguments, see the paper.) While supermajority rules don’t make sense in all circumstances, they are desirable when applied to the passage of constitutional norms that will be entrenched against change by ordinary legislative majorities.
And Jack Balkin responds:
    Consider that often when the language of a Constitution is relatively abstract or vague, the language chosen is chosen because it is a compromise that many people with different expectations can agree upon. An example would be the words "privileges or immunities" or the words "equal protection of the laws." Supermajorities may rally around these words not because they limit future interpreters, but precisely because the words do not have clear boundaries of application, and they expect that people will fight out their application later on. Indeed, in particularly contested issues like fundamental rights (or federalism) this vagueness is precisely what is necessary to gain assent from a supermajority with very different substantive views.
    In addition, supermajorities may believe that it is better to speak in abstract or general terms rather than address constitutional provisions to specific problems of their day, because of a desire to allow the language to be applied in new ways to meet the challenges of the future. This seems to be the case with respect to the history of the adoption of the Fourteenth Amendment, to take only one example.
    Although Mike argues that his supramajority argument shows why appeals to original meaning operate as a constraint on judges, it is far from clear why it does so if we understand why abstract and vague constitutional language about rights and powers sometimes commands a supermajority. This language does so because it does not constrain, because it leaves things open for future development.
Yes and No. Yes, I think Balkin is surely right on two scores: (1) the constitution contains language that is general, abstract, and vague, and (2) sometimes these features were likely a product of the need for supermajority support. But no, that does not mean that the language does not constrain. Balkin's own argument turns against him at precisely this point. The need for supermajority support is one of the reasons that the Constitution rarely (perhaps never) is phrased in ways that leave things entirely open--with no constraining force. Imagine how difficult it would have been to get supermajority support for language that gave Congress the power "To regulate in ways that may or may not be limited," or for a clause that stated, "Congress shall not infringe rights of a scope to be determined at a future date." Precisely because supermajorities are required, the constitution is full of constraint. Provisions that provided no constraint at all would never have mustered supermajority support. To the extent that Balkin suggests otherwise, his argument is woefully underdeveloped. But that does not mean that there is no grain of truth to Balkin's point. As Randy Barnett argues in his book, Restoring the Lost Constitution, when the constitution uses general, abstract, and vague language--such as the phrase "privileges and immunities" or "freedom of speech"--there will be much to determine within what Fred Schauer calls a "frame with fuzzy edges." Balkin is right to observe that in such cases, the constitution "underdetermines" outcomes.
Balkin continues:
    Let me distinguish these concepts: Original public meaning asks what did the words used in the Constitution generally mean at the time they became law. Original intention asks what did the persons who had authority to create the law intend to be law (prohibited or permitted) by their use of those words. Original application asks how did people who lived at the time expect that the words of the Constitution, taken in their original meaning, would be applied to various situations?
    In many contexts, original meaning, original intention, and original application converge. However, where the words used in a constitution are relatively abstract, these three ideas tend to come apart. An example is the words "cruel and unusual punishments." Under original public meaning originalism the original meanings of the concepts used (and their meaning in combination with each other) should be preserved, but we are not necessarily bound by either the intentions of the persons who framed the words, or by the general public expectation of how those words would be applied. The concept of cruelty stays the same, but we have to figure out what that concept means in our own time.
    Evidence of how people used words at a certain point in time is evidence of their original public meaning, but it is not conclusive evidence, because original public use conflates both the content of a concept and its expected application. It also conflates the nature of a concept with the particular set of issues before people at the time they considered constitutional language.
Yes and yes! I think Balkin has gotten this exactly right and his distinction between original meaning and original application is quite helpful. Bravo to Rappaport and Balkin!

Wednesday, June 22, 2005
Willis on Predatory Lending Lauren E. Willis (Loyola-LA Law School) has posted Decisionmaking & the Limits of Disclosure: The Problem of Predatory Lending on SSRN. Here is the abstract:
    Despite the importance of the transaction, many Americans are not making optimal home loan decisions, in two important respects. First, many borrowers are not obtaining home loans at optimal price terms, prices that a competitive market of borrowers engaged in effective price-shopping would produce. Second, the home loan decisions of many borrowers are not optimal choices with regard to risk of loss of the home, both in that the benefits of the loan are outweighed by the risk of loss of the home imposed by the loan, and in that borrowers are failing to take advantage of alternatives that are preferable, in cost-benefit terms, to shouldering that risk of loss. The households paying these high prices and facing this high risk of foreclosure are disproportionately African-American, Latino, and low to moderate income, households that already have fewer financial resources and significantly lower homeownership rates. The sale of these overpriced and overly risky home loans constitutes what has come to be known as "predatory lending." From a legal and policy perspective, what is puzzling about this problem is that borrowers are agreeing to these loans against their own self-interest and despite federally-mandated disclosures regarding loan price, and, for some loans, risk of foreclosure. This paper argues that the problem is not so puzzling when the structure of the home loan market and consumer decisionmaking within that market are carefully analyzed. Federal law regarding home lending is based on a rational actor model of borrower decisionmaking, with some allowances for bounded rationality. But borrowers frequently depart from the law's model due to widespread cognitive limitations, heuristics, biases, and emotional coping mechanisms. 
This paper explains how sellers are able to take advantage of these impediments to optimal decisionmaking and the structure of the market to convince significant numbers of borrowers to take loans that are overpriced and overly risky. The paper also makes a number of suggestions for reform.

de Figueiredo on Telecommunications Litigation John M.P. de Figueiredo (Princeton University - Program in Law and Public Affairs) has posted Strategic Plaintiffs and Ideological Judges in Telecommunications Litigation (Journal of Law, Economics and Organization, Forthcoming) on SSRN. Here is the abstract:
    This paper examines the effect of judicial ideology on the selection and outcome of telecommunications regulatory cases. Using a dataset on Federal Communications Commission orders and trials from 1990 to 1995, this paper shows that changes in the make-up of the bench of the D.C. Circuit Court of Appeals affect not only who wins the cases, but also the cases selected for litigation. Specifically, firms are more likely to bring cases when the agency decisions are ideologically distant from the bench than when the two actors are close ideologically. Judges, who are subsequently randomly selected, vote ideologically as the firms' actions predict they will, with Republican judges overturning Democratic agency decisions and vice versa. The effect of judicial ideology on case selection is larger than the effect of judicial ideology on case outcomes. Additionally, the paper shows that plaintiff characteristics have little impact in determining case outcomes, but a statistically significant impact on cases selected for litigation. Finally, the paper provides initial evidence that regulatory uncertainty may lead to more litigation.

Nolan-Haley on Law and Mediation Jacqueline M. Nolan-Haley (Fordham University School of Law) has posted The Merger of Law and Mediation: Lessons from Equity Jurisprudence and Roscoe Pound (Cardozo Journal of Dispute Resolution, Vol. 6, p. 57, 2004) on SSRN. Here is the abstract:
    This article examines Roscoe Pound's concerns with the decline of equity jurisprudence in the American legal system, suggesting that they resonate with those of modern ADR scholars who worry about the effects of blending settlement with adjudication and mediation with the law. It examines court-connected mediation with particular emphasis on the historic parallels between equity and mediation. Both equity and mediation offer a form of "individualized justice" unavailable in the official legal system, and each allow room for mercy in an otherwise rigid, rule-bound justice system. Yet, scholars question whether equity today is still equitable and whether institutionalized mediation offers anything that looks like justice. This article argues that if court-connected mediation is to offer alternatives to traditional rule-bound justice, it must return to its complementary role to litigation and adjudication.

Tuesday, June 21, 2005
Sunstein on Chevron Cass R. Sunstein (University of Chicago Law School) has posted Chevron Step Zero (Virginia Law Review, Forthcoming) on SSRN. Here is the abstract:
    The most famous case in administrative law, Chevron U.S.A. v. Natural Resources Defense Council, Inc., has come to be seen as a counter-Marbury, or even a McCulloch v. Maryland, for the administrative state. But in the last period, new debates have broken out over Chevron Step Zero - the initial inquiry into whether Chevron applies at all. These debates are the contemporary location of a longstanding dispute between Justice Scalia and Justice Breyer over whether Chevron is a revolutionary decision, establishing an across-the-board rule, or instead a mere synthesis of preexisting law, inviting a case-by-case inquiry into congressional instructions on the deference question. In the last decade, Justice Breyer's case-by-case view has enjoyed significant victories. Two trilogies of cases - one explicitly directed to the Step Zero question, another implicitly so directed - suggest that the Chevron framework may not apply (a) to agency decisions not preceded by formal procedures and (b) to agency decisions that involve large-scale questions about agency authority. Both of these trilogies threaten to unsettle the Chevron framework, and to do so in a way that produces unnecessary complexity for judicial review and damaging results for regulatory law. These problems can be reduced through two steps. First, courts should adopt a broader understanding of Chevron's scope. Second, courts should acknowledge that the argument for Chevron deference is strengthened, not weakened, when major questions of statutory structure are involved.
Highly recommended!

Two by Ribstein Larry Ribstein has posted two papers on SSRN:
    Accountability and Responsibility in Corporate Governance:
      Managers' accountability to shareholders and corporations' responsibility to society are two important objectives of corporate governance. Some scholars argue that managers who are accountable to shareholders must neglect society's interest. But loosening this accountability leaves managers free to serve themselves, thereby increasing agency costs. This article makes three main contributions to the debate on the appropriate roles of accountability and responsibility. First, it shows how modern markets cause managers who are accountable to shareholders also to attend to society's interests. Second, it shows that the debate is actually less important than it might first appear because the logistics of publicly held corporations substantially free managers from accountability to shareholders irrespective of whether society's needs should compel that freedom. Third, the paper shows that the debate may be joined over whether partnership-type devices compelling distributions and allowing owner cash-out should be imported into publicly held firms. These devices would provide for more managerial accountability to shareholders, and therefore less flexibility to serve society's interests, than standard corporate governance mechanisms. The main impediment to use of these devices is the double corporate tax, which provides tax benefits for earnings retention and thereby encourages managerial control over corporate earnings. The future of the corporate tax may depend at least in part on the debate over accountability and responsibility in corporate governance.
    Sarbanes-Oxley after Three Years:
      This article reports on the experience with the Sarbanes-Oxley Act of 2002 in the three years since its passage. In general, the costs have been significant and the benefits elusive. This suggests some lessons for future regulation.
    Highly recommended!

Walker on the Problem of Collective Saving David I. Walker (Boston University School of Law) has posted The Social Insurance Crisis and the Problem of Collective Saving: A Commentary on Shaviro's 'Reckless Disregard' (Boston College Law Review, Vol. 45, pp. 1347-1361, 2004) on SSRN. Here is the abstract:
    Long-range Social Security and Medicare spending projections vastly exceed projected program revenues. If left unchecked, the resulting fiscal imbalance (estimated at $40 to $70 trillion in present value terms) would fall primarily on future generations. To avoid generational inequity, and perhaps fiscal meltdown, Professor Daniel N. Shaviro and others propose immediate fiscal austerity. This reply Commentary argues that near-term austerity is unlikely to play a significant role in overcoming the fiscal imbalance, which can be thought of as a balloon payment due mid-twenty-first century. Significant near-term fiscal austerity would eliminate the public debt and replace it with a public surplus. Political economy theory and U.S. public debt history suggest that this path is infeasible. This Commentary also stresses the importance of disaggregating the "Social Security and Medicare" problems. Contrary to popular belief, Medicare is by far the larger problem, and the Medicare imbalance is driven by projected spending increases outpacing overall economic growth indefinitely. These observations suggest that a focus on Medicare cost control, rather than revenue enhancement, is called for.

Guthrie & George on the Futility of Appeal Chris Guthrie and Tracey George (Vanderbilt University - School of Law and Vanderbilt University - School of Law) have posted The Futility of Appeal: Disciplinary Insights into the 'Affirmance Effect' on The United States Court of Appeals (Florida State University Law Review, Symposium Issue, Vol. 32, p. 357, 2005) on SSRN. Here is the abstract:
    In contrast to the Supreme Court, which typically reverses the cases it hears, the United States Courts of Appeals almost always affirm the cases that they hear. We set out to explore this affirmance effect on the U.S. Courts of Appeal by using insights drawn from law and economics (i.e., selection theory), political science (i.e., attitudinal theory and new institutionalism), and cognitive psychology (i.e., heuristics and biases, including the status quo and omission biases).

Parisi, Palmer and Bussani on Pure Economic Loss Francesco Parisi, Vernon V. Palmer and Mauro Bussani (George Mason University School of Law, Tulane Law School and University of Trieste School of Law) have posted The Comparative Law and Economics of Pure Economic Loss on SSRN. Here is the abstract:
    Law and economics shows that a key factor in determining the optimal economic loss rule is found in the relationship between pure economic loss and social loss. Economic loss should be compensable in torts only to the extent that it corresponds to a socially relevant loss. In this paper we undertake a comparative evaluation of the economic loss rule to verify whether modern legal systems, although not formally adopting the economic criterion, define the exclusionary rule in light of efficiency considerations. The comparative analysis reveals that the substantive applications of the economic loss rule in European jurisdictions are consistent with the predicates of economic analysis.

Monday, June 20, 2005
Stadler on Law School Teaching Sara K Stadler (Emory University - School of Law) has posted The Bulls and Bears of Law Teaching (Washington and Lee Law Review, 2006) on SSRN. Here is the abstract:
    This Essay provides readers with a unique perspective on the world of law teaching: Employing a quirky methodology, Professor Stadler predicts which subjects are likely to be most (and least) in demand among faculties looking to hire new professors in future - rating those subjects, like so many stocks, from "strong buy" to "weak buy" to "weak sell" to "strong sell." To generate the data on which her methodology is based, Professor Stadler catalogued, by subject, almost every Article, Book Review, Booknote, Comment, Essay, Note, Recent Case, Recent Publication, and Recent Statute published in the Harvard Law Review between and including the years 1946 and 2003. In the end, she found an interesting (and, she thinks, predictive) relationship between the subjects on which faculty choose to write and the subjects on which students choose to write.
I love this essay! Download it right away! Read it. Think about it! To whet your appetite, here are some of Stadler's recommendations:
    Strong Buys
      Bankruptcy Law
      Education Law
      Energy Law
      Family & Gender Law
      Health Law
      Labor & Employment Law
      Tax Law
    Weak Buys
      Alternative Dispute Resolution
      First Amendment Law
      Intellectual Property Law
      International and Comparative Law
      International Trade
      Law and . . .
      Media Law
    Weak Sells
      Civil Procedure and Evidence
      Contract Law
      Criminal Law & Procedure
      Election Law
      Legal History
      Property Law
      Tort Law
    Strong Sells
      Administrative Law
      Antitrust Law
      Commercial Law
      Constitutional Law
      Environmental Law
      Admiralty Law & Trusts and Estates
Download it while it's hot hot hot!

Two by Yoo Christopher Yoo (Vanderbilt) has posted two papers on SSRN:
    Rethinking the Commitment to Free, Local Television: A Public Goods Analysis (Emory Law Journal, Vol. 52, Fall 2003):
      One of the most enduring tenets of U.S. television policy has been the commitment to localism. I suggest that the FCC's localism policy can be disaggregated into four, more specific commitments: (1) the preference for locally oriented over nationally oriented programming, (2) the preference for free (i.e., advertising-supported) over pay television, (3) the preference for single-channel over multi-channel television technologies, and (4) the preference for incumbents over new entrants and new technologies. I then analyze each of these commitments in light of what is perhaps the most distinctive feature of the television industry, which is the fact that its cost structure gives television programming many of the qualities of a public good, and conclude that each of these four commitments is fundamentally flawed. I then employ the public goods analysis I develop to critique the manner in which policy makers are regulating conventional television broadcasting, cable television, direct broadcast satellite systems (DBS), digital television, and third-generation wireless devices (3G).
    Architectural Censorship and the FCC (Southern California Law Review, Vol. 78, March 2005):
      Most First Amendment analyses of U.S. media policy have focused predominantly on behavioral regulation, which either prohibits the transmission of disfavored content (such as indecent programming) or mandates the dissemination of preferred content (such as children's educational programming and political speech). In so doing, commentators have largely overlooked how program content is also affected by structural regulation, which focuses primarily on increasing the economic competitiveness of the media industries. In this symposium contribution, Professor Christopher Yoo employs economic analysis to demonstrate how structural regulation represents a form of architectural censorship that has the unintended consequence of reducing the quantity, quality, and diversity of media content. The specific examples analyzed include: (1) efforts to foster and preserve free television and radio, (2) rate regulation of cable television, (3) horizontal restrictions on the number of outlets one entity can own in a local market, and (4) regulations limiting vertical integration in television and radio. Unfortunately, current First Amendment doctrine effectively immunizes architectural censorship from meaningful constitutional scrutiny. As a result, Congress and the FCC must bear the primary responsibility for safeguarding free speech values against these dangers.
    I always learn from Yoo's work. Both papers are recommended!

Rubenstein on Private Attorneys General William B. Rubenstein (University of California, Los Angeles - School of Law) has posted On What a "Private Attorney General" is - And Why it Matters (Vanderbilt Law Review, Vol. 57, No. 6, p. 2129, November 2004) on SSRN. Here is the abstract:
    Although the phrase private attorney general is commonly employed in American law, its meaning remains elusive. The concept generally serves as a placeholder for any person who mixes public and private features in the adjudicative arena. Yet there are so many players who mix public and private functions in so many different ways that the idea holds the place for a motley cast of disparate characters. My goal in this Article is to map these mixes - to distill from the singular private attorney general concept a range of distinct private attorneys general - and then to show why this new taxonomy is a helpful heuristic device. Specifically, I argue that the new taxonomy illuminates a weakness in the governing model of the class case. Scholars loosely associated with the law and economics movement have helpfully described class action lawsuits as presenting a classic agency problem: class action attorneys (agents) pursue the interests of their class member clients (principals) with little oversight or control. Consequently, class action scholarship has focused on identifying ways to better align the interests of the agents with those of their principals. This obsession with agent incentives assumed, without significant investigation, that there existed a stable group of principals with easily-identifiable interests. My typology demonstrates that different types of private attorneys general serve different types of principals, each of which combine public and private interests in different ways. If the goal of class action law is to align the attorneys' interests with those of their clients, it is necessary to identify clearly the precise nature of these underlying principals. That is the contribution of this piece.

Richman on Communities Creating Economic Advantage Barak D. Richman (Duke University School of Law) has posted How Communities Create Economic Advantage: Jewish Diamond Merchants in New York on SSRN. Here is the abstract:
    This paper argues that Jewish merchants have dominated the diamond industry because of their ability to enforce diamond credit sales. Diamonds are portable, easily concealable, and extremely valuable, thereby rendering courts powerless in policing diamond theft and credibly enforcing diamond credit sales. Since credit sales are highly preferable to simultaneous exchange, success in the industry requires an ability to enforce executory agreements that are beyond the reach of public courts. Relying on a reputation mechanism that is supported by a distinctive set of industry, family, and community institutions, Jewish diamond merchants have been able to enforce such contracts and have thus maintained industry leadership for several centuries. An industry arbitration system publicizes instances where promises are not kept. Intergenerational legacies induce merchants to deal honestly through their very last transaction, so that their children may inherit valuable livelihoods. And ultra-Orthodox Jews, for whom participation in their communities is paramount, provide important value-added services to the industry without posing the threat of theft and flight.

Sunday, June 19, 2005
Legal Theory Lexicon: The Counter-Majoritarian Difficulty
    Introduction The counter-majoritarian difficulty may be the best known problem in constitutional theory. The phrase is attributed to Alexander Bickel—a Yale Law School Professor—who is said to have introduced it in his famous book The Least Dangerous Branch. Whatever Bickel actually meant by the phrase, it has now taken on a life of its own. The counter-majoritarian difficulty states a problem with the legitimacy of the institution of judicial review: when unelected judges use the power of judicial review to nullify the actions of elected executives or legislators, they act contrary to “majority will” as expressed by representative institutions. If one believes that democratic majoritarianism is a very great political value, then this feature of judicial review is problematic. For at least two or three decades after Bickel’s naming of this problem, it dominated constitutional theory.
    This entry in the Legal Theory Lexicon explores the counter-majoritarian difficulty, efforts to solve the problem and to dissolve it. As always, the Lexicon is aimed at law students, especially first-year law students, with an interest in legal theory. As is frequently the case with the Lexicon, we will explore a very big topic in just a few paragraphs. Many articles and books have been written about the counter-majoritarian difficulty; we will only scratch its surface. Moreover, any really deep discussion of the counter-majoritarian difficulty would lead (sooner or later) to almost every other topic in constitutional theory. The Lexicon is “quick and dirty,” and definitely not deep, comprehensive, or authoritative.
    Democracy and Majoritarianism The counter-majoritarian difficulty is rooted in ideas about the relationship between democracy and legitimacy (see the Legal Theory Lexicon entry on Legitimacy ). We all know the basic story: the actions of government are legitimate because of their democratic pedigree, and democratic legitimacy requires “majority rule.” Of course, it isn’t that simple. Among the complexities are the following:
    • There are many different theories of democratic legitimacy, and only some of them emphasize “majoritarianism” as the key factor.
    • Some theories of democratic legitimacy rely on the idea of “consent of the governed,” but it is very difficult to mount an argument for actual consent to existing majoritarian institutions or their actions.
    • The idea of “legitimacy” is itself deeply controversial and might even be called obscure. What legitimacy is and why it is important are themselves deep and controversial questions.
    Despite these complexities, most of us have a rough and ready appreciation for the idea that actions by democratic majorities have some kind of legitimacy that is lacking in the actions of unelected judges. At any rate, that idea is the normative foundation of the counter-majoritarian difficulty.
    Constitutional Limits on Majoritarianism The counter-majoritarian difficulty is sometimes characterized as a problem with the institution of judicial review, but it could also be understood as a difficulty for any constitution that constrains majority will. Of course, there could be constitutions that impose no limits at all on the will of democratically elected legislatures. For example, a regime of unicameral parliamentary supremacy might be said to have a constitution that allows a parliamentary majority to pass any legislation that it pleases and to override the courts or executive whenever the legislature is in disagreement with their actions. Of course, even this simple constitution might constrain the legislature in a certain sense. For example, legislation that attempts to constrain the action of a future legislature might be “unconstitutional.” Another example might be legislation that abolishes elections and substitutes a system of self-perpetuating appointments. Similarly, a legislature might pass a “bill of rights” that purports to bind future legislatures, even in the absence of an institution of judicial review.
    The Institution of Judicial Review Even though the counter-majoritarian difficulty might be a feature of any system with a binding constitution, the difficulty is especially acute for a regime that combines the institution of judicial review with judicial supremacy. In the United States, for example, the courts have the power to declare that acts of Congress are unconstitutional, and if the Supreme Court so declares, the Congress does not have the power to override its decision.
    The institution of judicial review is counter-majoritarian in part because federal judges are not elected and they serve life terms. Presidents are elected every four years; members of the House of Representatives every two years; and Senators serve staggered six year terms. Of course, judges and justices are nominated by the President and confirmed by the Senate and these features create some degree of democratic control of the judiciary. Nonetheless, on the surface, it certainly looks like judicial review is an antidemocratic institution. Unelected judges strike down legislation enacted by elected legislators: that is certainly antidemocratic and antimajoritarian in some sense.
    The counter-majoritarian difficulty is compounded by the nature of judicial review as it has been practiced by the modern Supreme Court. If the Supreme Court limited itself to enforcing the separation of powers between the President and Congress or to the enforcement of the relatively determinate provisions of the constitution that establish the "rules of the game" for the political branches, then the counter-majoritarian difficulty might not amount to much. But the modern Supreme Court has been involved in the enforcement of constitutional provisions that are general, abstract, and seemingly value-laden--examples include the freedom of speech, the equal protection clause, and the due process clause of the constitution. The counter-majoritarian difficulty seems particularly acute when it comes to so-called "implied fundamental rights," like the right to privacy at issue in cases like Griswold v. Connecticut and Roe v. Wade.
    Answering the Countermajoritarian Difficulty How have constitutional theorists attempted to answer the counter-majoritarian difficulty? The problem with answering that question is that there are so many answers that it is difficult to single out three or four for illustrative purposes. So remember, the “answers” that are discussed here are arbitrary selections from a much longer list.
      Discrete and Insular Minorities One famous answer to the counter-majoritarian difficulty focuses on the idea of “discrete and insular minorities.” The background to this answer is the premise that in the long run, most individuals win some and lose some in the process of democratic decision making. Shifting coalitions among various interest groups “spread the wealth” and the pain—no one wins all the time or loses all the time. Or rather, normally, wins and losses are spread across the many different groups that constitute a given political society. However, there may be some groups that are excluded from the give and take of democratic politics. Some groups may be so unpopular (or the victims of such extreme prejudice) that they almost always are the losers in the democratic process. The famous “Footnote Four” of the United States Supreme Court’s decision in the Carolene Products case can serve as the germ of an answer to the counter-majoritarian difficulty. Judicial review is arguably legitimate when it serves to protect the interests of “discrete and insular minorities” against oppressive actions by democratic majorities.
      Anti-Democratic Political Theory Another answer to the counter-majoritarian difficulty admits that judicial review is antidemocratic but seeks to justify this feature by appeal to some value that trumps democratic legitimacy. This isn’t really just one answer to the difficulty—it is a whole lot of answers that share a common feature—the appeal to anti-democratic political values. For example, it might be argued that “liberty” is a higher value than “democracy” and hence that judicial review to protect liberty is justified. Or it might be argued that “equality” is a higher value, or “privacy,” or something else. Obviously, there is a lot more to be said about this kind of answer to the counter-majoritarian difficulty, but for the purposes of this Lexicon entry, this incredibly terse explanation will have to suffice.
      Dualism and High Politics Yet a third approach to the counter-majoritarian difficulty attempts to turn the problem upside down—arguing that judicial review is actually a democratic institution that checks the antidemocratic actions of elected officials. Whoa Nelly! How does that work? This third approach is strongly associated with the work of Bruce Ackerman—perhaps the most influential constitutional theorist since Alexander Bickel. Ackerman’s views deserve at least a whole Lexicon entry, but the gist of his theory can be stated briefly. Ackerman argues for a view that can be called “dualism,” because it distinguishes between two kinds of politics—“ordinary politics” (the kind practiced every day by legislators and bureaucrats) and “constitutional politics.” What is “constitutional politics”? And how is it different from “ordinary politics”? Ackerman’s answers to these questions begin with the idea that ordinary politics isn’t very democratic. Why not? We all know the answer to that question. Ordinary politics are dominated by self-interested politicians and manipulative special interest groups. The people (or “We the People” as Ackerman likes to say) don’t really get involved in ordinary politics, and therefore, ordinary politics are not really very democratic. Constitutional politics, by way of contrast, involve extraordinary issues that actually “get the attention” of the people. For example, the ratification of the Constitution of 1789 caught the attention of ordinary citizens, as did the Reconstruction Amendments (the 13th, 14th, and 15th) following the Civil War. When “We the People” become engaged in constitutional politics, we are giving commands to our agents—Congress and the President—and the Courts are merely enforcing our will when they engage in judicial review—so long as they are faithful to our commands.
      Whew! That was a lot of “We the People” talk. I need a break from channeling Ackerman, before I can finish this entry! OK. I’m back!
      Ackerman’s theory emphasizes the idea of distinct regimes that resulted from “constitutional moments”—periods of intense popular involvement in constitutional politics. Recently, Jack Balkin and Sandy Levinson have advanced a similar theory—one that emphasizes the idea of “high politics”—the great popular movements that seek to influence the decisions of the Supreme Court on issues like abortion or affirmative action. I can’t do justice to their theory here, but the idea is that the Supreme Court may be responding to democratic pressures when it makes the really big constitutional decisions.
    Dissolving the Counter-Majoritarian Difficulty So far, I’ve been discussing responses to the counter-majoritarian difficulty that operate within normative constitutional theory. There is another important line of attack, however. The counter-majoritarian difficulty rests on a positive (factual) assumption—that the Supreme Court does, in fact, act contrary to political majorities. Some political scientists have argued that this positive assumption is incorrect—that the Supreme Court rarely, if ever, acts contrary to the wishes of the dominant political faction. There could be many reasons for that—one of them being the Supreme Court’s awareness that if it were to buck Congress and the President, it would be vulnerable to a variety of political reprisals. Congress might strip the Court of jurisdiction. Ultimately, the President might simply refuse to cooperate with the Court’s decisions.
    There is another side to this story. There may be reasons why elected politicians prefer for the Supreme Court to “take the heat” for some decisions that are controversial. When the Supreme Court acts, politicians may be able to say, “It wasn’t me. It was that darn Supreme Court.” And in fact, the Supreme Court’s involvement in some hot button issues may actually help political parties to mobilize their base: “Give us money, so that we can [confirm/defeat] the President’s nominee to the Supreme Court, who may cast the crucial vote on [abortion, affirmative action, school prayer, etc.].” In other words, what appears to be counter-majoritarian may actually have been welcomed by the political branches that, on the surface, appear to have been thwarted.
    Conclusion Once again, I’ve gone on for too long. I hope you will forgive me, and I hope that this Lexicon entry has given you food for thought about the counter-majoritarian difficulty. Below, I’ve included a list of references to articles that focus on the difficulty itself and also to some of the authors who have attempted to give answers to Bickel’s famous problem.
    References This is a very incomplete list, emphasizing the works that are focused on “the counter-majoritarian difficulty” in particular and omitting many important works of constitutional theory that deal with the counter-majoritarian difficulty as part of a larger enterprise.
      Bruce Ackerman, We the People: Foundations (1993) & We the People: Transformations (1998).
      Jack M. Balkin & Sanford Levinson, Understanding the Constitutional Revolution, 87 Va. L. Rev. 1045 (2001).
      Alexander Bickel, The Least Dangerous Branch: The Supreme Court at the Bar of Politics 16-18 (2d ed. 1986).
      Steven G. Calabresi, Textualism and the Countermajoritarian Difficulty, 66 Geo. Wash. L. Rev. 1373 (1998).
      Barry Friedman, The Counter-Majoritarian Problem and the Pathology of Constitutional Scholarship, 95 Nw. U. L. Rev. 933 (2001).
      Barry Friedman, The History of the Countermajoritarian Difficulty, Part One: The Road to Judicial Supremacy, 73 N.Y.U. L. Rev. 333, 334 (1998).
      Barry Friedman, The History of the Countermajoritarian Difficulty, Part II: Reconstruction's Political Court, 91 Geo. L.J. 1 (2002).
      Barry Friedman, The History of the Countermajoritarian Difficulty, Part Three: The Lesson of Lochner, 76 N.Y.U. L. Rev. 1383 (2001).
      Barry Friedman, The History of the Countermajoritarian Difficulty, Part Four: Law's Politics, 148 U. Pa. L. Rev. 971 (2000).
      Barry Friedman, The Birth of an Academic Obsession: The History of the Countermajoritarian Difficulty, Part Five, 112 Yale L.J. 153 (2002).
      Ilya Somin, Political Ignorance and the Countermajoritarian Difficulty: A New Perspective on the Central Obsession of Constitutional Theory, 89 Iowa L. Rev. 1287 (2004).
      Mark Tushnet, Policy Distortion and Democratic Debilitation: Comparative Illumination of the Countermajoritarian Difficulty, 94 Mich. L. Rev. 245 (1995).