The Laboratorium
February 2013

This is an archive page. What you are looking at was posted sometime between 2000 and 2014. For more recent material, see the main blog at http://laboratorium.net

HathiTrust Appeal: The Authors Guild’s Opening Brief


The Authors Guild and its co-plaintiffs have filed their opening brief appealing from their decisive loss in the district court. Most of the arguments should be familiar if you’ve been following the case, so I’m going to mention only the significantly new or modified points, along with a few details I found striking.

  • The brief starts by comparing Google’s mass digitization of books to “an exercise in eminent domain.” The metaphor is confused, perhaps deliberately so. Eminent domain involves the government taking ownership of property. Google’s digitization, even if it is an unauthorized and infringing use of the plaintiffs’ copyrights, does not affect their ownership. If any of the plaintiffs’ attorneys need a refresher, I would be happy to welcome them to my first-year Property class when we cover eminent domain later this semester. (UPDATE: On further reflection, I think the metaphor works, but in a different respect. Google and the libraries are assembling a large number of smaller individual holdings into a larger parcel that they believe will be more valuable and better serve the public interest. The mechanism and effect on property rights are different, but this does resemble one typical use of eminent domain. So my initial snark was uncalled for.)

  • The brief makes much of the defendants’ sovereign immunity to lawsuits for damages. In theory, it is an irrelevant point, as this is a lawsuit for declaratory and injunctive relief, not damages. But I think the plaintiffs were right to emphasize the unavailability of damages. For one thing, sovereign immunity under the Eleventh Amendment is a bit of a crock; the states use it to engage in all kinds of skulduggery. For another, sovereign immunity gives the plaintiffs’ ripeness arguments a particular sharpness: since compensation ex post isn’t available, the court might be more concerned to reach the legal issues and define the parties’ respective rights ex ante.

  • The brief does an effective job portraying HathiTrust’s dancing around the Orphan Works Project as a whipsaw. The libraries announced the Orphan Works Project, then suspended it, and say that if they relaunch it, then and only then would it be ripe for adjudication. The brief points out that the libraries could also re-suspend the project if challenged a second time, perpetually evading review. In one sense, this isn’t a problem for copyright holders: if the project never actually launches, nothing has been lost. But the brief calls this “an expensive game of ‘Whac-a-Mole,’” an effective turn of phrase that shows why it’s unfair to deny the authors a ruling on the Orphan Works Project as it was announced and almost implemented. If the libraries want to avoid that ruling, they really ought to be prepared to make a stronger commitment that the project will not come back in a similar form.

  • The plaintiffs renew their security argument, saying:

    Each copy is connected to a “campus network,” and the primary and mirror HathiTrust sites include World Wide Web servers, compounding the risks of exposure. Furthermore, HathiTrust grants remote access to the complete image and text files to nearly one hundred HathiTrust administrators and researchers located throughout the country.

    They made a similar argument at the district court, and it’s spectacularly bad. If keeping documents on a network with remote users and publicly accessible servers is an inherent security risk, then Frankfurt Kurnit Klein and Selz is committing malpractice by keeping client information on its office computers. (See Rule 1.6(c) of the New York Rules of Professional Conduct.)

  • To make their security argument, the plaintiffs rely on Ben Edelman’s expert declaration. He’s a smart guy who’s done important work on Web privacy and online marketing, but he’s not remotely qualified as a network security expert, nor does his report pass the Rule 702 standard that it be “based on sufficient facts or data.” Continued citation of the report on appeal is a classic sunk-cost mistake: we spent a lot of money on this report, so we’d better cite it to get our money’s worth. (The same goes for Daniel Gervais’s report predicting that collective licensing spontaneously arises whenever there is the vital heat of a market need for it.)

  • At the district court, the plaintiffs argued that fair use under section 107 was never available to go beyond section 108’s codified privileges for libraries. (They wrote, “Congress included these rules to carefully delineate the boundaries of fair use in the context of library copying.”) On appeal, that argument is gone, replaced by the weaker claim that exceeding section 108 “should weigh heavily against a finding of fair use.” That view depends on a reading of the history of section 108 that the defendants will of course dispute.

  • The plaintiffs’ brief tries to disaggregate the different uses for the scans: even if searching is a fair use, there’s no need to retain numerous electronic copies of the full texts of the works. Judge Baer’s opinion anticipated this particular objection: “Not to mention that it would be a tremendous waste of resources to destroy the electronic copies once they had been made for search purposes, both from the perspective of the provision of access for print-disabled individuals and from the perspective of protecting fragile paper works from future deterioration.” The plaintiffs respond that they don’t want to destroy the digital files, “but rather to have them taken offline and stored under lock and key.”

  • Judge Baer found both that the digital collection was transformative (factor one) and that it had no market impact (factor four). The plaintiffs challenge both findings. I find their argument on the first factor plausible — at the very least, the idea of “transformative” use has been pushed beyond what the word will bear if it is “transformative” to provide books to the print-disabled. Their argument on the fourth factor doesn’t do much for me: they continue to cling to the self-refuting argument that “Each Unlicensed Digital Copy Represents a Lost Sale.” But there are numerous books in the collection that are not available for sale or license for the activities in suit, on any terms, at any price; a copy of a book that cannot be bought is not a lost sale.

  • In trying to distinguish the cases finding search engine indexes of webpages to be fair use, the brief argues,

    The Ninth Circuit cases cited by the District Court that address the legality of copying web pages for the purpose of creating a search index, Perfect 10, Inc. v. Amazon.com, Inc., 508 F.3d 1146 (9th Cir. 2007) and Kelly v. Arriba Soft Corp., 336 F.3d 811 (9th Cir. 2003), are distinguishable because copyright owners who publish material on the Internet do so because they want their content to be found and viewed. For all intents and purposes, such owners have provided an implied license to search engines to copy and index their contents. Moreover, unlike HathiTrust’s perpetual storage of high resolution image files and text files of every book, the Web pages copied by a search engine are incidental to the search function. As noted by one court, after copying full size images onto its server for the purpose of creating “thumbnails,” the search engine deleted the original copy from its server. See Kelly, 336 F.3d at 815. Thus, even if ingesting a copyrighted work into a search engine is transformative, it does not follow that the permanent storage of the original content is also transformative.

    There are some subtle bits of misdirection in there. First, Perfect 10 wasn’t plausibly an implied-license case: Google there was making a search index of infringing websites, so whatever authorization those websites gave Google was worthless. The case rose or fell on fair use, specifically the transformativeness of a search index. Second, as for deleting the files, a search engine can get away with deleting originals only because it regularly crawls the Web and can fetch a fresh copy whenever it needs one; a rough sketch of the difference follows. Digitizing a book, on the other hand, is not something you want to do every two weeks.
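    To make the contrast concrete, here is a minimal sketch of a crawl-and-index loop (in Python; the function names, the two-week interval, and the overall design are my own illustrative assumptions, not anything drawn from the briefs or the record). The indexer can throw away each page’s full text as soon as the page is indexed, because re-fetching is part of its ordinary operation; a one-time book scan has no comparably cheap “re-crawl” step.

        import time
        import urllib.request

        # Toy inverted index: word -> set of URLs whose text contains it.
        index: dict[str, set[str]] = {}

        def index_page(url: str) -> None:
            """Fetch a page, index its words, then discard the full text."""
            with urllib.request.urlopen(url) as response:
                text = response.read().decode("utf-8", errors="replace")
            for word in text.lower().split():
                index.setdefault(word, set()).add(url)
            # The "original" is not retained: a fresh copy is always a
            # single HTTP request away.

        def crawl_forever(urls: list[str], interval_days: float = 14) -> None:
            """Re-crawl on a schedule -- the step with no analogue for a
            one-time scan of a physical book."""
            while True:
                for url in urls:
                    index_page(url)
                time.sleep(interval_days * 86_400)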

  • The associational standing claim is a bit of a procedural backwater at this point, but the plaintiffs’ brief does an excellent job of explaining why it makes sense to let associations sue to vindicate their members’ rights. Its legal analysis here is clear, succinct, and persuasive.

  • The brief also makes a clear — but oddly rushed — argument that the libraries do not qualify as “authorized entities” under section 121 of the Copyright Act for purposes of making accessible editions for the print-disabled. The section ends with the following:

    The Authors do not seek a remedy that would foreclose the print-disabled from gaining access to the digital library, but one that would require any access to be facilitated in accordance with the statutory scheme established by Congress.

    I expect the NFB to have some choice words about this particular argument in their response brief.

Overall, I found the brief well written but not compelling; it does little to undermine the authority of Judge Baer’s opinion. I am adjusting upwards my estimate of the likelihood that the Second Circuit will affirm.

The Illegal Process


By pure coincidence, I have another paper to announce: The Illegal Process: Basic Problems in the Making and Application of Censorship. It’s a shortish (13-page) essay in the University of Chicago Law Review’s online supplement, Dialogue. In it, I respond to a provocative proposal by Derek Bambauer that the United States enact a statute permitting Internet filtering but setting strict procedural limits on its use. This idea sounds disconcertingly close to promoting censorship, but Bambauer’s point is that the U.S. already engages in Internet filtering through less accountable means, and it would be better to bring the process under legal control.

This is a process-oriented argument: it deliberately focuses on the legitimacy of the procedures used to install filters, rather than on the substance of what’s filtered and what isn’t. So I had the idea of critiquing the argument using the tools of the Legal Process school of jurisprudence, which tried to use process-oriented arguments to understand rigorously the roles of courts and legislatures in a democracy under the rule of law, and to develop a clearer understanding of when various procedural devices are appropriately employed. My title is a play on the bible of the Legal Process school, Hart and Sacks’ The Legal Process: Basic Problems in the Making and Application of Law.

My format is a play on Hart and Sacks, as well. The Legal Process is notable for its barrage of questions to the reader, questions which range from the subtle to the sublime. The Illegal Process consists of a long series of “Notes and Queries” on Bambauer’s article — over a hundred and fifty questions in all. Some are pointed, some are cheeky, some are gently leading; all of them, I hope, help to bring out the implications of his argument and the challenges of trying to build legal bulwarks against would-be censors. Here’s a sample:

Professor Bambauer refers to his criteria as a “process-based methodology” and defends them as being “compatible with divergent views on what material should be banned.” How far can procedural criteria go in settling questions about censorship? Does it follow that because procedurally regular censorship is more legitimate than procedurally irregular censorship, it is legitimate in an absolute as well as a relative sense? Is this a question that can be settled in the abstract, without reference to the material to be censored? Is it right that whether Winston Smith shall be permitted to read The Theory and Practice of Oligarchical Computation should turn only on the process Comrade O’Brien follows and not on the contents of the book? But if it is necessary to make normative judgments about whether particular material can appropriately be censored, is it possible to say anything about global censorship that does not rest on contested moral and social values? Is Professor Bambauer’s theory an attempt to apply a quintessentially liberal methodology—procedural justice—to a quintessentially illiberal subject—censorship?

Future Conduct and the Limits of Class-Action Settlements


My latest article, Future Conduct and the Limits of Class-Action Settlements, has just been published in the North Carolina Law Review. I’ve been working on this one for a long time—two and a half years—and have been struggling with the ideas for even longer—nearly five. I’ve kept it under wraps until now because I wanted to be sure I had the details right.

This is my fullest and strongest argument against the late Google Books settlement. In the course of studying it, I came to realize that it was only the most visible example of a new and deeply worrying trend in class-action law. I found half a dozen other settlements, from antitrust to real estate, that used the same dirty trick the Google Books settlement did: giving the defendant a release for the future that would allow it to inflict new and unprecedented harms on the members of the class suing it. This article is my attempt to make sure that no one ever tries such a thing again—and that if anyone does try, the courts are ready to stand guard against it. It’s a sustained (nearly 90 pages) explanation of how these releases work, why they’re deeply dangerous to class members, and why they’re fundamentally illegal. As I said at a conference, “The Google Books settlement is dead; I would like you to come with me to the graveyard with pitchforks and stakes.”

Here’s an example, to give a sense of the kinds of unearthly forces from the outer darkness the Google Books settlement was trying to summon. Imagine that in 2003, BP had had a minor oil spill from a well in the Gulf of Mexico: a few thousand barrels. Now imagine a class action supposedly on behalf of all the residents of the Gulf states, and imagine a “settlement” of that class action that released BP from all liability not just from this past spill, but from all future spills. If such a settlement had been in place at the time of the Deepwater Horizon explosion, Tony Hayward could have stayed on his yacht all spring and summer without lifting a finger to stop the spill.

The courts should not be in the business of handing out these unprecedented future-conduct releases in class actions. The article is a careful explanation of why. Here’s the abstract:

This Article identifies a new and previously unrecognized trend in class-action settlements: releases for the defendant’s future conduct. Such releases, which hold the defendant harmless for wrongs it will commit in the future, are unusually dangerous to class members and to the public. Even more than the “future claims” familiar to class-action scholars, future-conduct releases pose severe informational problems for class members and for courts. Worse, they create moral hazard for the defendant, give it concentrated power, and thrust courts into a prospective planning role they are ill-equipped to handle.

Courts should guard against the dangers of future-conduct releases with a standard and a rule. The standard is heightened scrutiny for all settlements containing such releases; the Article describes the warning signs courts must be alert to and the safeguards courts should insist on. The rule is parity of preclusion: a class-action settlement may release future-conduct claims if and only if they could have been lost in litigation. Parity of preclusion elegantly harmonizes a wide range of case law while directly addressing the normative problems with future-conduct releases. The Article concludes by applying its recommendations to seven actual future-conduct settlements, in each case yielding a better result or clearer explanation than the court was able to provide.