My Appearance Before The ETHI

On June 12th, I was called to appear before the Standing Committee on Ethics, Accountability & Privacy in a hearing on privacy and social media. I told the Committee that ‘big data’ is the ‘new sugar’ and that we stand at the precipice of what one might call the ‘late onset diabetes’ of the information age.

I made a number of recommendations and will submit a detailed brief to the Committee later this summer.


Ian Kerr

Canada Research Chair in Ethics, Law and Technology

University of Ottawa, Faculty of Law

Appearance before the Standing Committee on

Ethics, Accountability & Privacy

June 12, 2012

Almost exactly one year ago, I was sitting in a boardroom like this one — only much fancier. The daylong meeting was at 1601 South California Avenue, Palo Alto, CA.

If the addy isn’t familiar to you, it is the Facebook campus. A guy called Mark Zuckerberg works there. And it is spectacular. Vibrant. Pounding with energy. Everyone jacked into headphones. I felt like a kid in a candy store.

Because I was required to sign a non-disclosure agreement upon arrival, I cannot tell you many of the interesting things I learned about Facebook. Apparently Zuck’s FB tagline (which reads: “I’m trying to make the world a more open place by helping people connect and share”) does not apply to FB’s business operations.

There is one thing I will disclose, however. I got sick to my stomach that day from eating way too many Sour Patch Kids. The roof of my mouth was practically torn to shreds. Imagine an extremely well-stocked candy store — Sugar Mountain, or the Bulk Barn — but in seemingly endless supply, at every coffee station throughout the entire FB campus.

In defence of my gluttony, let me just say that I was not the only one. What I witnessed that day was 25 of the world’s most important privacy scholars and advocates stuffing their faces, lining their pockets and filling their knapsacks with candy. Grown adults, earning six-figure salaries.

We were not stealing. Excessive and free consumption was encouraged. We were simply reacting to the offer of a ubiquitous, abundant and highly addictive form of fuel.

Why have I wasted 3 of my precious 10 minutes talking to ETHI about eating Sour Patch Kids at the Facebook campus?

Because information is the new sugar. Big data, big sugar. Get candy, get candy, get candy.

Just as health practitioners urge us to consume fewer refined sugars and to develop policies to safeguard Canadians from increasingly unhealthy consumption habits, I appear before you today, as a privacy practitioner, urging you to safeguard Canadian citizens and global corporations from the complex and increasingly unmanageable desire to collect, use and disclose more and more personal information.

Because — big data is like big sugar. The more ubiquitous, abundant, pleasurable, efficient, and profitable it is, the more we want it. And, sometimes, the more we want it, the more blinded we are by its consequences.

We stand at the precipice of what one might call the “late onset diabetes” of the information age. And, we should be doing much more to prevent it.

You have already heard excellent submissions from two fantastic Commissioners: Ann Cavoukian and Elizabeth Denham – as well as my hugely talented University of Ottawa colleagues, Professors Scassa, Geist and Steeves. They have overlapped on a number of crucial recommendations that must be followed. To recap four key points:

1. You need to finish what you started. You are way behind on a number of necessary legislative reforms to PIPEDA. Studying social media may grab headlines, but ETHI should focus first on the PIPEDA Review. I learned as a kid — leave the drum solos till later. It’s not as sexy, but the rudiments must come first.

2. Perhaps most rudimentary, the Privacy Commissioner needs much greater powers, including the ability to make orders, award damages and issue penalties. These enforcement powers must have serious teeth.

3. Also rudimentary are mandatory notification requirements for certain kinds of security breaches.

4. Another privacy basic is the need to mandate far greater transparency — not only about the collection of personal information, but about how it is being used and to whom it is being disclosed… We need this both at the front and back end of social media transactions. To be clear: this is not just about tweaking privacy policies or making notice provisions more understandable. It’s about legislating “mandatory minimum” standards for privacy transparency, requiring that they be embedded into technologies and social techniques. We don’t sell cars without speedometers, odometers, fuel or pressure gauges; likewise, our social media should be required to have feedback mechanisms that allow us to look under the hood and warn us when conditions are no longer safe.

I have two further submissions of my own.

The first concerns privacy’s default settings. In his appearance, Professor Geist generously referred to my work titled, “The Devil is in the Defaults”.

In short, the architecture of every technology includes a number of design choices. Some design choices create default positions. For example, a car’s default position is “stop”. When we enter a car and turn it on, the car is in “park.” For safety’s sake, its design requires that we consciously put it in gear in order to go. Although it would be possible to design things the other way around, we recognize the danger of a car that defaults to “go” rather than “stop”. And we have regulated against it.

The same should be true for privacy — but it isn’t. For example, following a lengthy investigation of FB in 2008-2009, the Privacy Commissioner found that FB needed more privacy safeguards. Responding with a complete overhaul of its so-called “privacy architecture”, FB offered new settings to its then nearly 500 million users. Although deemed a “privacy U-turn” by major media at the time, the net effect of these new settings was, ironically, a massive and unprecedented information grab by FB. In a rather subtle and ingenious move, FB very politely gave our Privacy Commissioner the new settings that she wanted. But when FB gaveth, it also swiftly tooketh away.

In choosing to create default settings that collect more information than ever before, FB knew that 80-92 percent of its users would never change those settings. Behavioural economics made it clear that, like a bad sugar habit, FB could get away with nudging us further towards unhealthy information consumption habits.

Currently, the Privacy Commissioner is powerless to do anything about this. Without changes to our laws, Canadian legislators are allowing social media sites to build vehicles that default to “go” rather than “stop”. Zuckerberg knows how unsafe this is; that is why he has rejigged his own settings. He knows that FB’s defaults are dangerous.

So, why isn’t what is good enough for the geese, good for the gangster?

The devil is in the defaults. We need to fix this through legislation that contemplates settings with privacy as the default. So, while I agree with Professor Geist that Twitter should be commended for its Do Not Track support, Google for its privacy dashboard, etc., I would take it one step further. We need legislation that would make some of these amazing features of our online experience non-optional. They should be factory-built and installed, with privacy as the default.

I will make my second submission much more succinctly since it is similar to testimony I offered during the first PIPEDA Review.

The biggest threat to privacy is not social networks, or surveillance cameras, or wireless mobile, or databases, or GPS tracking devices. It is the standard form contract. Under current law, almost all of the safeguards built into privacy legislation can be easily circumvented by anyone who provides goods or services by way of standard form agreements. By requiring users to click “I agree” to their terms on a take-it-or-leave-it basis, companies can use contract law to sidestep privacy obligations. In short, the problem stems from a mistaken approach to the issue of consent. In my written submission, I offer detailed legislative reforms that would help prevent companies from doing an end-run around the protections set out in Canadian privacy legislation.

Thank you for your consideration in these matters. I hope during the question period, Committee members will give me the opportunity to expand on my 3 main recommendations: (i) “mandatory minimums” for privacy transparency; (ii) mandatory privacy-default settings; and (iii) mechanisms that prevent contracting-out of privacy through standard form agreements.