IPSC and the Future of Legal Scholarship

Last week I attended the 14th edition of the “Intellectual Property Scholars Conference,” or IPSC. I came back to Pittsburgh inspired, challenged, and a little … well, down. Did I see into the scholarly soul of the discipline(s) that we call intellectual property law? Into the future of legal scholarship in general? If so, I came away with mixed feelings. I’ve been away from this blog; now back to the blog I go.

To set the stage a bit, consider this. IPSC is a working papers conference. It was launched in 2001 by senior faculty at Cardozo, DePaul, and Berkeley (then Boalt Hall) law schools as a way for a small number of senior scholars to give thoughtful feedback to an equally small number of emerging juniors. That feedback consisted partly of comments on papers; it also included broader mentorship built into the structure of the conference and the relationships that emerged from it.

IPSC changes. Each year, the conference host tweaks the format a bit. The conference rotation has expanded to include Stanford as well as the original three host schools. But the biggest change is that the conference has steadily expanded in size. The original two conferences were really workshops; all attendees fit into a single (large) conference room. The 2014 edition featured more than 200 attendees and more than 150 paper presentations. There were talks by senior scholars as well as by grad students, postdocs, and junior faculty members, and at times there were as many as six concurrent tracks. Rebecca Tushnet provided a nearly real-time account of many of the papers that she heard, but even her record of events, amazing as it is, portrays only a small portion of the IPSC landscape. I probably heard 20 papers, read abstracts for all 150+ and downloaded a fair number of them for later consumption, and interacted socially with several dozen people. Everyone and everything else was essentially invisible to me. On the printed program, I saw the names of many friends and colleagues whom I never saw in the flesh.

What do we make of this?

First, the good news:

1/ Legal scholarship is going global in a big way.

For the first time in my memory, a significant number of presenters at IPSC were scholars from outside the US, including Asia (China in particular) and South America as well as Canada and Europe. (Several of the presenters based at European and UK universities are natives of South American and Asian countries.) Much of the research on offer from our non-US colleagues was of a type and style, conceptual and/or empirical, that only a few years ago we might have stereotyped as “American,” in contrast with a stereotypically duller, less ambitious European doctrinalism.

Question: Is there any conceivable sense in which this is a bad thing?

Next, the bad news:

2/ Plus ça change.

Several years ago I came back from an earlier edition of IPSC discouraged by what I felt was a lack of historical sensitivity among my IP colleagues, particularly (but not entirely) the junior ones. I wrote about that on the blog, here, and later tried to address the problem in part with a series of posts that I titled “Lost Classics of IP.” I’ve now combined and reshaped those posts into a paper that I posted to SSRN recently.

My views have not changed.

Question: Is there anything meaningful that can be done about this?

Finally, the so-so news:

3/ The purpose(s) of working papers conferences.

IP scholars joke that IPSC has become a cocktail party, or a form of intellectual speed dating (or both; choose your own metaphor). And they’re right about the metaphors, though the metaphors do more than punch up a joke. In truth, the conference is modestly useful at introducing junior scholars to some senior scholars and to the norms of IP scholarship. It is very useful at enabling mid-level and senior scholars to meet and hang out with their friends. It is not useful at all with respect to its original purpose, which is feedback and mentorship. Mentorship is a high-bandwidth activity, which means that it doesn’t scale, least of all across six concurrent tracks and 20 minutes per presentation (including Q&A).

For years, IP cultivated a reputation as the welcoming discipline. The field suffered from little of the hierarchy and sense of exclusion that (I hear from friends) defines other fields. Junior people were (on the whole) welcomed, supported, mentored, and encouraged. They became (on the whole) welcoming, supportive, mentoring, encouraging senior people. And the scholarship that came out of the field was, in my view and on the whole, as ambitious, thoughtful, and challenging as the scholarship in any legal domain.

I look at the junior people in the field today, and I wonder: At 20 minutes a presentation, how welcoming and supportive can the field really be? Sure, virtually everyone who asks gets a presentation slot, which guarantees access for junior scholars. But that also means that a slot, by itself, signifies little. If mentoring is happening, where and when is it happening? (In smaller, more private, and less IP-specific settings, if it’s happening at all.) How will today’s junior scholars behave when (if) they become senior scholars? And what kind of scholarship is this dynamic producing? On that last question, my tentative answer is this: IP is supporting a lot of “normal science” research that is asking, or re-asking, versions of questions that have been asked before.

Remember, this is the FIRST TIME that anyone has asked these important questions, although they are virtually indistinguishable from questions that people senior to me, who signal what’s important in the field and what’s safe to argue, have asked many times before.

Question: Is IP eating its seed corn? Put differently: Is IP, which is relatively young by scholarly standards, maturing into a typical academic discipline, with hierarchies and implicit norms and “right” and “wrong” sorts of scholarship?

4/ What about our students?

Last but by no means least, I came away from IPSC wondering, as never before, whether any of the scholarship on display has any bearing on how we teach our students. Lots of presentations had explicit or implicit “hooks” with respect to public policy and advocacy; on the whole, that’s a good thing. But very few presentations suggested to me, even implicitly, that the scholarship at hand either emerged from the challenges of teaching law students today or would affect how we teach law students today. This may be what troubled me most about my experience last week: the sense that I was wearing a “scholar’s hat” detached from my “ordinary” (but changing) role as a law teacher, and more detached than it has been for a long time, given the “normal science” style of scholarship that I witnessed. The legal profession and law schools are confronting some extraordinary challenges. There was little sense at IPSC that those challenges are affecting scholarly practice.

Questions: Is this distinction, between modes of legal scholarship and modes of law teaching and the practices of the legal profession, sustainable? If it’s not, what synthesis (or more likely, syntheses) of scholarship and teaching are likely to take its place?

The Supreme Court Considers Google Street View

All of the interest in the Supreme Court tomorrow is likely to be focused on Hobby Lobby and, to a lesser extent, Harris v. Quinn. But I’ll be watching something that happens before either of those decisions is announced. I’ll be looking to see whether the Supreme Court has granted cert in the StreetView case. I hope the answer is no.

The StreetView case — Google v. Joffe — is one that I’ve blogged about extensively over the past year. (See Part I and Part II; see also my coverage of the Ninth Circuit opinion, Google’s petition for rehearing, and the filing of Google’s cert. petition.) Briefly, Google’s StreetView cars intercepted the contents of transmissions from residential wi-fi routers whose owners had not turned on encryption. A number of class actions have been filed claiming that the interceptions were violations of the federal Wiretap Act. Google moved to dismiss them, arguing that radio communications (like wi-fi) basically have to be encrypted to be protected by the Wiretap Act. The district court and the Ninth Circuit disagreed, holding that the exception Google points to applies only to traditional AM/FM radio broadcasts.

Although I disagree with the Ninth Circuit’s reasoning and would find it professionally advantageous if the Supreme Court decided to take the case, I hope it denies cert. Here’s why.

Is Hachette Being Hoisted by Its Own DRM Petard?

Rebecca Tushnet points to this column by Cory Doctorow arguing that Hachette is being held hostage in its fight with Amazon over e-book versions of its books because of its “single-minded insistence on DRM”: “It’s likely that every Hachette ebook ever sold has been locked with some company’s proprietary DRM, and therein lies the rub.” Doctorow argues that because of the DMCA, Hachette can no longer get access to, or authorize others to get access to, its own books:

Under US law (the 1998 Digital Millennium Copyright Act) and its global counterparts (such as the EUCD), only the company that put the DRM on a copyrighted work can remove it. Although you can learn how to remove Amazon’s DRM with literally a single, three-word search, it is nevertheless illegal to do so, unless you’re Amazon. So while it’s technical child’s play to release a Hachette app that converts your Kindle library to work with Apple’s Ibooks or Google’s Play Store, such a move is illegal.

It is an own-goal masterstroke.

Everyone loves irony, but I can’t figure out how to make Doctorow’s argument work. First, I can’t figure out what the anticircumvention problem would be. Second, I can’t figure out why Hachette wouldn’t be able to provide other distributors with e-book versions of its books.

Oracle v. Google Reversed – Why Framing Matters

Two years to the day since my last blog post on this subject, the Federal Circuit has reversed Judge Alsup’s ruling that the Java API (the list of function and variable, a/k/a parameter, names) is uncopyrightable. The Federal Circuit held that the structure, sequence, and organization of the APIs renders them sufficiently original and non-functional to be copyrightable. As such, the case is remanded to determine whether Google’s wholesale use of the APIs to make Android was fair use. For more background, see my prior post.
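To make concrete what was (and was not) copied, here is a minimal sketch of the difference between declaring code and implementing code. The class, method, and values are my own hypothetical illustrations, not the actual Java SE source at issue in the case:

```java
// Hypothetical illustration -- not the actual Java SE source code.
public class MathUtil {
    // The declaration (name, parameter types, return type) is the kind of
    // "declaring code" whose copyrightability the Federal Circuit addressed.
    public static int max(int x, int y) {
        // The implementing code inside the braces is the part Google wrote
        // independently; it was never alleged to be copied.
        return (x >= y) ? x : y;
    }

    public static void main(String[] args) {
        System.out.println(MathUtil.max(3, 7)); // prints 7
    }
}
```

A developer who calls `MathUtil.max(3, 7)` depends only on the declaration, not on the body; that dependence is the interoperability interest at stake.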

The problem with this ruling is twofold. First, it is surely correct. Second, it is surely wrong. Why is it correct? Because structure, sequence, and organization can be creative. This has long been true, and well it should be. I won’t relitigate that here, but holding that these APIs were simply not copyrightable was a stretch under Ninth Circuit law, and the Federal Circuit is correct to say so.

Why is it wrong? Because Google should surely be privileged to do what it did without having to resort to fair use. The court says: “We disagree with Google’s suggestion that Sony and Sega created an ‘interoperability exception’ to copyrightability.”

It is here that framing is important. The court’s statement is accurate; we don’t get rid of copyrightability just to allow interoperability. But Sega is crystal clear that we do allow interoperability reuse: “To the extent that a work is functional or factual, it may be copied, Baker v. Selden, as may those expressive elements of the work that ‘must necessarily be used as incident to’ expression of the underlying ideas, functional concepts, or facts….” This is not the merger doctrine that the court applied, but rather a defense to infringement.

In short, this should have been an abstraction-filtration-comparison case, and the Federal Circuit makes clear that Judge Alsup did not perform that analysis. The appeals court also makes clear that where the APIs are taken verbatim, one can jump straight to filtration; but that does not mean the APIs must be held uncopyrightable in order to filter them out in the infringement analysis. Instead, Oracle gets its copyright, and Google gets interoperability. It is here that the appellate decision misses the boat.

I hate to be critical after the fact, but this case should never have gone to the jury. It should have been resolved on summary judgment of non-infringement before trial, with Oracle keeping its copyright but infringement denied as a matter of law due to functional reuse. Maybe that would have been reversed, too, but at least the framing would have been right, the better to support affirmance.

May 12, 2014 update: Two commenters have gone opposite ways on Sega, so I thought I would expand that discussion a bit:

Sega is about intermediate copying fair use, yes. But that intermediate copying was to get to the underlying interoperability. And I quote the key sentence from Sega above – even if that functionality is bound up with expression (as it is in this case), we still consider that a privileged use (and thus a worthy end to intermediate copying, which is not a privileged use).

Now, in this case, we don’t need to get to the intermediate copying part, because the interoperability information was published. But the privileged use that allowed the intermediate copying didn’t suddenly go away simply because Google didn’t have to reverse engineer to expose it. So, to say that Sega doesn’t apply because it is a fair use case completely misunderstands Sega. The fair use there was not about use of the APIs; that use was allowed with a simple hand wave. The fair use was about copying the whole program to get to those APIs, something that is not relevant here. So sending this case back for a fair use determination is odd.

That said, Sega pretty clearly makes the use a defense to infringement, rather than a § 102(b) ruling that there can be no copyright.

The Supreme Court heard Alice v. CLS Bank – in 1976

As patent system followers eagerly await the outcome of Alice v. CLS Bank, it occurred to me that the Court has already heard this exact case – back in the 1970s. My prior discussion of Alice is here as background.

In Dann v. Johnston, the applicant sought a patent on software that allowed banks to report account spending by category (e.g., rent, utilities) rather than having customers calculate this themselves. In concept, this patent is little different from the patent in Alice, which covers software that allows banks to reconcile transactions through the use of “shadow” accounts kept in data records.
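The claimed scheme in Dann v. Johnston can be sketched roughly as follows. This is a hypothetical illustration in Java; the class, method, and codes are my own, not the patent’s:

```java
// Hypothetical sketch of the Dann v. Johnston "category code" idea:
// appending a category number to the customer's regular account number
// creates, in effect, a series of distinct sub-account numbers, so the
// bank can report spending by category on a single statement.
public class CategoryLedger {
    static String subAccount(String accountNumber, int categoryCode) {
        // e.g., account "12345" + category 01 (rent) -> "1234501"
        return accountNumber + String.format("%02d", categoryCode);
    }

    public static void main(String[] args) {
        System.out.println(subAccount("12345", 1)); // a rent sub-account
        System.out.println(subAccount("12345", 2)); // a utilities sub-account
    }
}
```

The point of the sketch is how thin the computerization is: the same bookkeeping could be, and long was, done on paper, which is what made obviousness an easy route for the Court.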

The Court took the case up on two questions: subject matter and obviousness. In the end, the Court dodged the subject matter question and instead ruled that the patent was obvious, in large part because it simply implemented something that already existed on paper:

Under respondent’s system, what might previously have been separate accounts are treated as a single account, and the customer can see on a single statement the status and progress of each of his “sub-accounts.” Respondent’s “category code” scheme, see supra, at 221, is, we think, closely analogous to a bank’s offering its customers multiple accounts from which to choose for making a deposit or writing a check. Indeed, as noted by the Board, the addition of a category number, varying with the nature of the transaction, to the end of a bank customer’s regular account number, creates “in effect, a series of different and distinct account numbers. . . .” Pet. for Cert. 34A. Moreover, we note that banks have long segregated debits attributable to service charges within any given separate account and have rendered their customers subtotals for those charges.

The utilization of automatic data processing equipment in the traditional separate account system is, of course, somewhat different from the system encompassed by respondent’s invention. As the CCPA noted, respondent’s invention does something other than “provide a customer with . . . a summary sheet consisting of net totals of plural separate accounts which a customer may have at a bank.” 502 F. 2d, at 771. However, it must be remembered that the “obviousness” test of § 103 is not one which turns on whether an invention is equivalent to some element in the prior art but rather whether the difference between the prior art and the subject matter in question “is a difference sufficient to render the claimed subject matter unobvious to one skilled in the applicable art. . . .” Id., at 772 (Markey, C. J., dissenting).

You could cut something like this out and plop it right into the Alice opinion. Except now, obviousness is not on the table. The question is whether the Court got it wrong by not ruling on subject matter in 1976. I don’t think so, but I do think that Dann v. Johnston is the most underused Supreme Court opinion in the software area.