Google Files Cert. Petition in Street View Case

I noted back in October that Google had hired “noted Supreme Court advocate Seth Waxman” as it was preparing its petition for rehearing in the Street View case, “indicating perhaps how far they intend to take this.” (For background, see my earlier posts Part I, Part II, after the panel decision, and on the petition for rehearing.) My suspicions were accurate — after losing again at the rehearing stage in late December, Google has now filed a petition for certiorari, asking the Supreme Court to reverse the Ninth Circuit.

Google’s petition primarily makes the same substantive arguments it made in its petition for rehearing. The Ninth Circuit in the decision below adopted what I’ve called the “radio means radio” approach — “radio communications” in the Wiretap Act means only communications that you can receive with, you know, an ordinary AM/FM radio. I’ve argued that that is mistaken, and Google unsurprisingly agrees with me. Google provides three reasons why the Ninth Circuit’s interpretation cannot be sustained.

Schneier on the NSA, Google, and Facebook Connection: But What About Phones?

Bruce Schneier argues that we should not be fooled by Google, Facebook, and other companies that decry the recent NSA data grabs, because the nature of the Internet is surveillance; but what about phone companies? The press has jumped on the Obama administration’s forthcoming plan that

would end its systematic collection of data about Americans’ calling habits. The bulk records would stay in the hands of phone companies, which would not be required to retain the data for any longer than they normally would. And the N.S.A. could obtain specific records only with permission from a judge, using a new kind of court order.

The details are to come, but Schneier’s point about the structure of the system applies to phone companies too: “The biggest Internet companies don’t offer real security because the U.S. government won’t permit it.”

There are a few things to parse here. OK, there are many things to parse, but a blog post has limits. First, Schneier’s point about Internet companies is different from his point about the government. His point is that yes, many companies have stepped up security to prevent some government spying, but because Google, Microsoft, Facebook, Yahoo, Apple, and almost every other online company need access to user data to run their businesses and make money, they have all built a “massive security vulnerability” “into [their] services by design.” When a company does that, “by extension, the U.S. government, still has access to your communications.” Second, as Schneier points out, even if a company tried to plug the holes, the government won’t let that happen. Microsoft’s Skype service has built-in holes. The government has demanded encryption keys. And so it goes. And that brings us to the phone problem.

The proposed changes may solve little, because so far the government has been able to grab data both through formal procedure and through sheer spying outside any procedure. The key will be what procedures are required and what penalties follow for failing to follow them. That said, as I argued regarding data security in January 2013, fixing data security (and by extension the phone problems) will require several changes:

A key hurdle is identifying when any government may demand data. Transparent policies and possibly treaties could help better identify and govern under what circumstances a country may demand data from another. Countries might work with local industry to create data security and data breach laws with real teeth as a way to signal that poor data security has consequences. Countries should also provide more room for companies to challenge requests and reveal them so the global market has a better sense of what is being sought, which countries respect data protection laws, and which do not. Such changes would allow companies to compete based not only on their security systems but their willingness to defend customer interests. In return companies and computer scientists will likely have to design systems with an eye toward the ability to respond to government requests when those requests are proper. Such solutions may involve ways to tag data as coming from a citizen of a particular country. Here, issues of privacy and freedom arise, because the more one can tag and trace data, the more one can use it for surveillance. This possibility shows why increased transparency is needed, for at the very least it would allow citizens to object to pacts between governments and companies that tread on individual rights.

And here is the crux of Schneier’s ire: companies saying your data is safe are trying to protect their business, but as he sees it:

A more accurate statement might be, “Your data is safe from governments, except for the ways we don’t know about and the ways we cannot tell you about. And, of course, we still have complete access to it all, and can sell it at will to whomever we want.” That’s a lousy marketing pitch, but as long as the NSA is allowed to operate using secret court orders based on secret interpretations of secret law, it’ll never be any different.

In that sense, he thinks companies should lean on the government and openly state that real security is not available for now. He knows no company can say that, but the idea that we should all acknowledge the problem and go after the government to change the game is correct.

The point is correct for Internet companies and for phone companies. We should not over-focus on phones and forget the other ways we can be watched.

Public Service Announcement for Google Glass Team

The Google Glass team has a post about the so-called myths about Google Glass, but the post fails to see what is happening around Glass. That is sad. Instead of addressing the issues head on, the post preaches to the faithful (just read the comments). As Nate Swanner put it, “We’re not sure posting something to the tech-centric Google+ crowd is really fixing the issues though.” Google and other tech companies trying to do something new will always face challenges, fear, and distrust. The sad part for me is when all sides line up and fail to engage with the real issues. Some have asked what I did when I was at Google. Part of the job was to present the technology, address concerns, and then see where all of us saw new, deep issues to come. I loved it, because I knew the technology was driven by high standards. The problems flowed from not explaining the tech. This post is an example of talking past each other. Furthermore, the truly wonderful advances that might be possible with Glass are not discussed. That distresses me, as no one really wins with that approach. So I will walk through what is not great about the post, as a public service announcement for the Glass team and others in the tech space.

First, the post sets an absurd tone. It starts with “Mr. Rogers was a Navy SEAL. A tooth placed in soda will dissolve in 24 hours. Gators roam the sewers of big cities and Walt Disney is cryogenically frozen. These are just some of the most common and — let’s admit it — awesome urban myths out there.” Message: Glass critics are crazy people who buy into extreme, outlying beliefs, not truth. And if you think I am incorrect, just look at the next statement: “Myths can be fun, but they can also be confusing or unsettling. And if spoken enough, they can morph into something that resembles fact. (Side note: did you know that people used to think that traveling too quickly on a train would damage the human body?).” Hah! We must be idiots who fear the future.

That said, maybe there are some myths that should be addressed. Having worked at Google, I can say that while I was there, technology was not developed on a whim. I love that about the company, and yes, the Glass team fits here too. Furthermore, as those who study the history of technology know, even electricity faced myths (sometimes propagated by oil barons) as it took hold. Most of the Glass myths seem to turn on cultural fears about further disconnection from the world, an always-on or plugged-in life, and so on. But the post contradicts itself, or assumes no one can tell when its myth-busting is self-serving or non-responsive.

On the “Glass is elitist” issue, the post’s answer is that Glass is for everyone, but high priced, and not ready for prime time. Huh? Look, if you want to say don’t panic, few people have it, that is OK and may be true. But when you also argue that it is not elitist because a range of people (not just tech-worshiping geeks) use Glass, and that the $1,500 price tag is not about privilege because “In some cases, their work has paid for it. Others have raised money on Kickstarter and Indiegogo. And for some, it’s been a gift,” the argument is absurd. That a few, select people have found creative ways to obtain funds for Glass does not belie the elite pricing; it shows it.

The surveillance and privacy responses reveal a deeper issue. Yes, Glass is designed to signal when it is on. And yes, that may limit surveillance, but barely. So too for the privacy issue. Check this one in full:

Myth 10 – Glass marks the end of privacy
When cameras first hit the consumer market in the late 19th century, people declared an end to privacy. Cameras were banned in parks, at national monuments and on beaches. People feared the same when the first cell phone cameras came out. Today, there are more cameras than ever before. In ten years there will be even more cameras, with or without Glass. 150+ years of cameras and eight years of YouTube are a good indicator of the kinds of photos and videos people capture–from our favorite cat videos to dramatic, perspective-changing looks at environmental destruction, government crackdowns, and everyday human miracles. 

ACH!!! Cameras proliferated and we have all sorts of great, new pictures so privacy is not harmed?!?!?! Swanner hits this one dead on:

Google suggests the same privacy fears brought up with Glass have been posed when both regular cameras and cell phone cameras were introduced in their day. What they don’t address is that it’s pretty easy to tell when someone is pointing a device they’re holding up at you; it’s much harder to tell when you’re being video taped while someone looks in your general direction. In a more intimate setting — say a bar — it’s pretty clear when someone is taping you. In an open space? Not so much.

So tech evangelists, I beg you, remember your fans are myriad and smart. Engage us fairly and you will often receive the love and support you seek. Insult people’s intelligence, and you are no better than those you would call Luddites.

How Is Privacy Not a Class at All Law Schools?

Privacy law does not exist, but it should be taught at every law school. There is no one law of privacy. That is why I love teaching Information Privacy (Solove and Schwartz (Aspen) is the text I use). The class requires students to reengage with and apply torts, Constitutional law (the First and Fourth Amendments at least), and statutory interpretation. It also lends itself to learning about sectoral approaches to regulation in health, finance, commerce, and education. Given that the idea and problems of privacy are everywhere, there are jobs in them thar hills. Yet schools often see the course as a luxury or as somehow part of IP. That is a mistake.

Schools should not pander to skills and job-training demands, but sensitivity to areas of practice that have large needs is not pandering. Much of the “skills” and “ready-to-practice” rot comes from a small segment of legal practice (i.e., big firms with huge profits that are not willing to pay to train their employees). That said, law schools tend to use the same playbook. For example, the rarified world of public corporation law is a standard part of business associations course materials. Yet according to the Economist, the number of public companies peaked at around 7,888 in 1997. Of course folks will say “Don’t teach to the bar.” Amen, brothers and sisters, but why teach for a tiny portion of students in a core course? To be clear, I love teaching business associations and think it is useful, because agency and limited liability forms are so important. They are important because being able to compare and contrast the forms for a client makes the attorney worth her pay. Grasping the beauty and nuances of the system unlocks the ability to be a true counselor. There are many, many businesses that are not, and may never become, public and that could benefit from having an attorney set up their project from the start. Privacy is similar. It reaches across many aspects of our lives and businesses.

Privacy issues come up in such a large range of practice that the course allows one to address doctrinal mastery while also moving students beyond the silo approach of the first year of law school. Seeing how property and trespass ideals reappear in criminal procedure, how assumption of risk permeates the issues, and so on, shows students that the theories behind the law work in not-so-mysterious, but perhaps unstated, ways. The arguments and counter-arguments come faster once you know the core idea at stake. That is the think-like-a-lawyer approach working well. It does not hurt that along the way students pick up knowledge of an area such as HIPAA or criminal procedure and technology that will make them a little more comfortable telling an employer or future client, “Yes, I know that area and here’s how I’d approach it.”

It’s About Data Hoards – My New Paper Explains Why Data Escrow Won’t Protect Privacy

A core issue in U.S. v. Jones has nothing to do with connecting “trivial” bits of data to see a mosaic; it is about the simple ability to have a perfect map of everywhere we go, with whom we meet, what we read, and more. It is about the ability to look backward and see all that information with little to no oversight and, in a way, forever. That is why calls to shift the vast information grabs to a third party are useless. The move changes little given the way the government already demands information from private data hoards. Yes, not having immediate access to the information is a start. It might mitigate mischief. But clear procedures are needed before that separation can be meaningful. That is why telecom and tech giants should be wary of “The central pillar of Obama’s plan to overhaul the surveillance programs [which] calls for shifting storage of Americans’ phone data from the government to telecom companies or an independent third party.” It does not solve the problem of data hoards.

As I argue in my new article Constitutional Limits on Surveillance: Associational Freedom in the Age of Data Hoarding:

Put differently, the tremendous power of the state to compel action combined with what the state can do with technology and data creates a moral hazard. It is too easy to harvest, analyze, and hoard data and then step far beyond law enforcement goals into acts that threaten civil liberties. The amount of data available to law enforcement creates a type of honey pot—a trap that lures and tempts government to use data without limits. Once the government has obtained data, it is easy and inexpensive to store and search when compared to storing the same data in an analog format. The data is not deleted or destroyed; it is hoarded. That vat of temptation never goes away. The lack of rules on law enforcement’s use of the data explains why it has an incentive to gather data, keep it, and increase its stores. After government has its data hoard, the barriers to dragnet and general searches—ordinarily unconstitutional—are gone. If someone wishes to dive into the data and see whether embarrassing, or even blackmail worthy, data is available, they can do so at its discretion; and in some cases law enforcement has said they should pursue such tactics. These temptations are precisely why we must rethink how we protect associational freedom in the age of data hoarding. By understanding what associational freedom is, what threatens it, and how we have protected it in the past, we will find that there is a way to protect it now and in the future.