This is the final part of a three part series based on a talk given for the Institute of Practical Philosophy at Vancouver Island University in April 2015.
The views expressed in this series are my own.
Unpacking (some of) the issues …
In the second part of this series, we talked about examples of how big data can be used, from credit card fraud to government surveillance. In this third and final part, let’s change gears again and look at some of the issues surrounding big data and the surveillance of everything.
Living with big data
“A surveillance society is not only inevitable, it’s worse. It’s irresistible”–Jeff Jonas
Jonas made this remark in 2011, and it is already stale. In 2015, we are talking about a future that has already happened, and we are perhaps only now realizing that what has happened is much more invasive than we could have anticipated. The examples given here would not be possible at this scale without big data technology, and they aren’t futuristic examples; they exist here and now.
Technology is changing the world, perhaps much faster than you and I can adapt. Change in this case is not about new gadgets but about a fundamental change to many facets of our lives. And yet, many of our privacy laws were written before the big data revolution. Just because you can get the data legally does not mean it is ethical to do so when outdated privacy laws lag behind the technology. The example of Target’s targeted marketing suggests that even the company itself realized at some point that what it was doing was bordering on creepy. Legal, yes, but also borderline Orwellian.
So why is a surveillance society irresistible? Simple:
Companies know if they can extract more insight from data faster than their competitors, they’re going to win–Bill McColl
These companies might counter any claims about becoming Big Brother by saying that they are really just creating a better internet experience for you, the consumer who prefers relevant ads related to your interests. Similarly, for governments, there is a competitive national security advantage if they gain better insights before other nations or hostile groups.
Data collection or surveillance?
Data collection and surveillance are tautologically related, even if perhaps only trivially so. If you record all data, you are in some way already engaging in surveillance.
You might not consider data collection surveillance until you do something with the data–much like crude oil it is not useful until you refine it–but what if you compare this activity to the increasing prevalence of closed-circuit television cameras (CCTV)? Is that surveillance from the moment you record passers-by on video or only when someone starts analyzing the footage?
The suggestion here, of course, is that these CCTVs are called “surveillance cameras” for a reason, and these surveillance cameras, too, have become part of the big data continuum, with stream processing of footage to make connections where none could be made previously. For example, systems are already available that can identify criminals based on processing of on-scene camera footage and comparing it against other readily available information on social media, such as pictures on Facebook as well as existing police databases. Similarly, CCTVs are already used in places such as the United Kingdom to create a “ring of steel” around some geographic areas, performing automated number plate recognition and cataloguing pictures of all vehicles and their passengers entering or exiting an area in order to correlate vehicle movements with criminal activity (Source).
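To make the “ring of steel” idea concrete, here is a minimal sketch of the correlation step such a system relies on: every plate sighting is retained and indexed, so that any plate can later be matched against a watchlist and its movements replayed. This is an illustrative toy model only, not any real deployment; the plate numbers, camera names, and watchlist are invented.

```python
from collections import defaultdict

# Invented sample data: (plate, timestamp, camera) tuples as a real
# ANPR system might log them for every passing vehicle.
sightings = [
    ("ABC 123", "2015-04-01 08:02", "camera_north"),
    ("XYZ 789", "2015-04-01 08:05", "camera_north"),
    ("ABC 123", "2015-04-01 17:41", "camera_south"),
]

watchlist = {"ABC 123"}  # hypothetical plates of interest

def index_by_plate(sightings):
    """Group all sightings by plate so movements can be replayed later."""
    movements = defaultdict(list)
    for plate, when, where in sightings:
        movements[plate].append((when, where))
    return movements

movements = index_by_plate(sightings)
flagged = {plate: movements[plate] for plate in watchlist if plate in movements}

# Note that every recorded vehicle is retained, watchlisted or not --
# that is the point of the "collect everything" model.
```

The sinister part is not the watchlist lookup but the `movements` index itself: it catalogues everyone, and any future query can be run against it retroactively.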
Without the data, no analysis is possible. So, where does data collection end and surveillance start? The point is by and large moot: Few collect data passively without purpose or reason.
Do I participate … or abstain?
In the case of CCTVs recording all activity of vehicles entering or exiting an area, your information is swept up along with everyone else’s, as there can be no expectation of privacy when you are in public. But what about other areas, such as participation in social media?
As it turns out, it does not actually matter whether you actively participate or abstain: your life is very likely touched by big data. Consider the example of Facebook mapping the network of all human relationships by way of social network analysis (SNA). We know that Facebook indirectly tracks individuals who are not registered Facebook users as the company creates complex models of how people relate to each other. If you are not a registered Facebook user, you are simply the network node that has not yet popped up and that is greyed out until you do register.
This kind of tracking behaviour is corroborated by examples like the Reddit user easyjet, who signed up to Facebook in March 2015 without providing any tangible details, who had the same name as over 500 other people, and who remarked: “Just signed up for Facebook with a rarely used email address. No phone number. / They know all of my friends immediately” (Source). In this case, easyjet might have been the missing node in the social network of human relationships until he joined Facebook. He might also have been given away by a work email address that someone in his social network had, which allowed Facebook to make the connection because this work email appeared in conjunction with the rarely used private email he used to sign up. Whatever the exact method by which Facebook makes the connection, it is somewhat akin to having a user profile without actually being a registered user: Damned if you do use Facebook, and damned all the same if you don’t, because you are increasingly affected by your online relationships with others whose behaviour you cannot control.
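The mechanism behind such “shadow” nodes can be sketched in a few lines: contact lists uploaded by registered users are enough to create graph nodes for people who never signed up. This is an illustrative model of the general SNA idea, not Facebook’s actual system; all names and addresses are invented.

```python
# Addresses of people who actually registered with the (hypothetical) service.
registered = {"alice@example.com", "bob@example.com"}

# Contact lists that registered users uploaded from their address books.
uploaded_contacts = {
    "alice@example.com": ["bob@example.com", "carol@work.example"],
    "bob@example.com": ["carol@work.example"],
}

# Build the relationship graph: every address seen becomes a node,
# whether its owner registered or not.
edges = set()
for owner, contacts in uploaded_contacts.items():
    for contact in contacts:
        edges.add(frozenset((owner, contact)))

nodes = {addr for edge in edges for addr in edge}
shadow_nodes = nodes - registered  # people profiled without ever signing up

# When carol@work.example finally registers, her friend list can be
# pre-populated from the edges that already point at her node --
# which is consistent with easyjet's "they know all of my friends" experience.
```

The greyed-out node from the paragraph above is exactly a member of `shadow_nodes`: a profile that exists in the graph before its owner ever creates it.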
There is also a more sinister Facebook example from the EU recently: “Facebook tracks computers of users without their consent, whether they are logged in to Facebook or not, and even if they are not registered users of the site or explicitly opt out in Europe. … The issue revolves around Facebook’s use of its social plugins such as the “Like” button, which has been placed on more than 13m sites including health and government sites. / Facebook places tracking cookies on users’ computers if they visit any page on the facebook.com domain, including fan pages or other pages that do not require a Facebook account to visit” (Source). The short version is that if you so much as visit a page with a Facebook Like button on it, which is exceedingly common, you might get tracked under this scenario, either by way of a cookie or as a visitor to that page through Facebook’s social plugins. Facebook now claims that this cookie-placing behaviour for European Union users was a bug and is being fixed.
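The general third-party-cookie mechanism the report describes can be sketched as a toy model: every page embedding the plugin triggers a request to the tracker’s server, which sets a cookie on first contact and thereafter sees the same identifier on every embedding site. This is an illustration of the mechanism only, not Facebook’s code; the cookie value and site names are invented.

```python
import itertools

_ids = itertools.count(1)
browsing_history = {}  # cookie id -> list of sites where the plugin fired

def serve_social_plugin(cookie, site):
    """Simulate the tracker's server handling a plugin request.

    The request carries whatever cookie the browser already holds for the
    tracker's domain; if there is none yet, a new one is issued.
    """
    if cookie is None:
        cookie = f"trk-{next(_ids)}"  # set once, sent back on every visit
    browsing_history.setdefault(cookie, []).append(site)
    return cookie

# One browser visiting three unrelated sites that all embed the plugin:
cookie = None
for site in ["news.example", "health.example", "gov.example"]:
    cookie = serve_social_plugin(cookie, site)

# The tracker now holds a cross-site profile keyed by one cookie,
# whether or not the visitor has an account with the tracker.
```

This is why the “health and government sites” detail in the quote matters: the profile is keyed by the browser, not by any account, so no registration or login is needed for the visits to be linked.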
“Almost Orwellian” (Bill C-51 edition)
“Almost Orwellian” was the description that a U.S. federal district judge gave to the National Security Agency (NSA) programme keeping records of all American phone calls when the programme was first revealed (Source). In Canada, we are currently faced with a bill known as Bill C-51 that might grant some very sweeping powers to security agencies. This bill has passed the House of Commons and is now sitting before the Senate; if it passes the Senate, it will become law.
You have already seen earlier that projects such as Levitation exist to monitor and trace file upload activities of all Canadians, regardless of whether they are suspected of criminal activity or not. Let me tell you what the Privacy Commissioner of Canada, Daniel Therrien, wrote to the Standing Committee on Public Safety and National Security about Part 1 of Bill C-51, which would establish a new Security of Canada Information Sharing Act (SCISA):
“We are moving very quickly into the world of Big Data … to spot trends, predict behaviours and make connections before any specific investigation is initiated or any particular individual is suspected of anything. … 17 government institutions … would have virtually limitless powers to monitor and, with the assistance of Big Data analytics, to profile ordinary Canadians, with a view to identifying security threats among them.
In a country governed by the rule of law, it should not be left for national security agencies to determine the limits of their powers.” —Daniel Therrien
Your immediate reaction might be to express appreciation for a national privacy commissioner who gets big data and calls it out specifically. More seriously, the concerns the privacy commissioner had with Bill C-51 were, verbatim, that it:
- “sets the threshold for sharing Canadians’ personal information far too low, and broadens the scope of information sharing far too much.”
- “is far too permissive with respect to how shared information is handled. It sets no clear limits on how long information is to be kept.”
- “fails to require that information sharing be subject to written agreements.”
- “exacerbates serious gaps in existing oversight and review mechanisms, and does not facilitate sharing between review bodies. As for affected individuals, the privacy regime provides no judicial recourse for improper collection, use or disclosure of their personal information.”
Note that this is the same Privacy Commissioner who was not invited to the recent hearings on Bill C-51, nor were any of his provincial or territorial counterparts.
Some smaller amendments were made, especially to the definition of what counts as an activity that undermines the security of Canada, in order to avoid scooping up simple civil disobedience along with terrorism. By and large, these amendments do not change the bill dramatically: the powers granted remain broad and will infringe on your privacy.
So ask yourself: Do we want our children to live out their lives in a state with no reasonable expectation of privacy? This is a dialogue not dictated by the passage of any one set of laws, and it is a dialogue we have only just begun. As for the bill and similar legislation around the world, let’s call this approach predictive Orwellianism, where you can be profiled and investigated programmatically by a piece of software ahead of time, without ever having done anything wrong.
If Canada is moving towards the predictive Orwellianism of a big data surveillance society, then what is it like to live here? To frame this discussion, let’s talk for a bit about Jeremy Bentham’s eighteenth century panopticon. Often cited, the concept of the panopticon is still a useful starting point.
In its original design, the panopticon is a blueprint for institutions that is elegant and sinister at the same time. It was initially conceived as a prison: a watch tower at the centre, radially surrounded by cells containing the inmates. The watch tower is equipped with blinds so that guards can freely observe the prisoners, but the prisoners can never see the guards. At any one time you might be watched, or there might not be anyone in the tower at all; you just never know.
The idea behind the panopticon is to create a prison of the mind that is much more effective than any direct supervision by guards could ever be: If you are unsure whether or not you are being watched, you assume that you are being watched and start modifying your behaviour accordingly.
Michel Foucault scales up the idea of the panopticon to apply to many of our cultural institutions more generally, where the panopticism that is derived from the prison design is “a type of power that is applied to individuals in the form of continuous individual supervision, in the form of control, punishment and compensation, and in the form of correction, that is, the molding and transformation of individuals in terms of certain norms. This threefold aspect of panopticism–supervision, control, correction–seems to be a fundamental and characteristic dimension of the power relations that exist in our society” (Source).
Or, as Glenn Greenwald puts it much more succinctly,
A society in which everyone can be monitored at all times is a society that breeds conformity, obedience, and submission.–Glenn Greenwald
Now, you might not immediately buy into the idea that our modern day societies are essentially oversize prison designs, but let’s look at a possible modern-day reinterpretation of Bentham’s idea.
One of the aspects that makes the panopticon so sinister is that it aims for surveillance–observation by the guards–to be as “discreet, unobtrusive, camouflaged, unverifiable” as possible (Bogard, The Simulation of Surveillance, 1996). If you can invent a form of panopticism that is more effective without the building than with it, then you can get rid of the requirement for a physical prison altogether. If you then take this to the next step, it is not hard to see a modern-day panopticism in the clandestine big data surveillance that governments engage in, as it meets all of these criteria: Surveillance is discreet, unobtrusive, camouflaged, and unverifiable. When bills such as C-51 become law, the Canadian government succeeds in pushing big data surveillance deeper and deeper into the shadows, where it becomes virtually impossible to detect when you are being watched.
The evolving concept of privacy: On ‘nothing to hide’
Related directly to our discussion of panopticism and clandestine surveillance, here are three brief commentaries on the evolving concept of privacy. Two of these commentaries might strike you as funny, but one of these is decidedly not funny.
Mark Zuckerberg: In a 2010 TechCrunch interview, Facebook’s Mark Zuckerberg said that people have become more comfortable sharing information about themselves online, to the point that Facebook changed its users’ privacy settings so that things you post are publicly viewable by default. According to Zuckerberg, Facebook was able to make this change because privacy as a social norm has evolved over time and is becoming weaker.
Before we go on, let’s mention that this is the same Zuckerberg who famously purchased not just his own house but also four houses around his house to prevent being monitored by his neighbours and to ensure his own privacy. There is a clue for you. But it is true, people want to share information about themselves online at a scale and level of detail like never before–think previously private family pictures of your children–and they wish to have the freedom to interact with each other like never before. Any updated notion of privacy must accommodate this general shift in expectation and behaviour.
Eric Schmidt: The then-Google CEO Eric Schmidt suggested that “if you are doing something you don’t want someone else to know, maybe you shouldn’t be doing it in the first place” (Source). Schmidt then blacklisted CNet News for a year after they published personal information about his salary, his neighbourhood, his hobbies, and his political donations. The source of this personal information? Google’s own search engine.
This is a case where the Google CEO says that privacy does not matter unless you have something to hide and then blacklists CNet News for violating his privacy with the very tools that Google makes freely available (Source). The response from CNet News was that “if you don’t want us to know how much money you make, where you live, and what you do with your spare time, maybe you shouldn’t have a house, earn a salary, or have any hobbies, right?” (ibid).
Glenn Greenwald: In his TED talk, Greenwald debunks the argument that surveillance does no real harm if you have nothing to hide. This is the same Glenn Greenwald who helped Edward Snowden reveal the mass surveillance by the NSA.
To Greenwald, everyone who says that they have nothing to hide falls into the same category: They say these things but they do not mean them for themselves. Witness the actions of Mark Zuckerberg and Eric Schmidt when they felt their privacy threatened, for example. All of us have something to hide; we make judgements about what we want the world to know and we act accordingly, to the effect that, even though humans are social animals, we also display a need for private space.
The effect of not having privacy mimics the effect of the Thought Police in George Orwell’s Nineteen Eighty-Four:
There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. … You had to live–did live, from habit that became instinct–in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized. (Source)
To Greenwald, there are two very destructive lessons implied by this kind of Orwellian habit that becomes instinct:
- The only people who care about privacy are those who are by definition bad people, for example, terrorists. Under this scheme, objecting to surveillance itself makes you a suspect person, which is something that was in fact raised against witnesses at the Bill C-51 hearings.
- There is an implicit bargain such that if you are willing to render yourself sufficiently harmless, then you can be free of the dangers of surveillance. Again, only a dissident would have something to worry about.
According to Greenwald, surveillance suppresses freedom in all sorts of ways, and we need to oppose it. There is a collective good, he argues, rendered by dissidents who are willing to resist those in power, including preserving the essence of human freedom. Under this view, the measure of freedom in a society is not the freedom we give to those who agree but to those who disagree and resist orthodoxy.
Government surveillance versus companies
At this point, we do need to draw a distinction between what governments do and what companies do. Panopticism might be a metaphor for government surveillance, but it does not extend to what companies do. Companies do not want to stop you from doing something; if anything, they want to help you do more of what you like to do, preferably by selling you something or making it easier for you to engage. So there is a difference. To those companies, we live in technological glass houses and they are watching us, forever ready to sell us a bit more of what we might desire.
The riot is one night … but metadata lasts forever
This image off imgur (Source) captures some of the sentiments about big data and government surveillance perfectly.
We may complain about living in a world of panopticons or dystopian glass houses. We think of government surveillance and we think of companies trying their darndest to sell us something … But how can we deal with the glut of privacy issues in the age of big data?
It would be presumptuous and frankly a little bit silly to assume that we can solve the issues surrounding big data in a single blog post. It is just not going to happen; there are simply too many threads in this conversation that we have not considered. Instead, let’s come full circle with the earlier overview of what big data is and remind ourselves that underneath it all, underneath all the hyperbole, the sensationalism, the public protests against Bill C-51 and the complaints about companies like Facebook and Google, there’s really just … data. A lot of it, mind you, but just data. The tools we use to process this data can be subjected, and in many companies typically are subjected, to standard, generally accepted practices that ensure the data is used in compliance with all given rules. This does not mean that every government or company is committed to following such rules, but the practical solutions are really rather simple, centred around the idea that privacy means you can exert some level of control over your personal information.
There are a number of guiding principles that can help:
Data governance: Some of the data collected today has no best-before date and may stick around indefinitely. As Yahoo admitted, “Our approach is not to throw any data away” (Source), and the same is very likely true for most government surveillance data. Simply put, you need to have the right to be forgotten, not just by companies but by government data repositories as well. Data about an event you attended in your twenties without having done anything criminal should not be part of your record in your eighties. Good data governance means you follow data lifecycle practices that dictate how you deal with data for the span of time that you have it, and at some point that data needs to be deleted and destroyed.
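What such a lifecycle rule amounts to in practice can be sketched very simply: every record carries a collection date, and anything older than the retention window is purged rather than kept forever. This is an illustrative sketch only; the seven-year retention period and the record fields are invented for the example.

```python
from datetime import date, timedelta

# Hypothetical retention policy: nothing older than seven years survives.
RETENTION = timedelta(days=365 * 7)

# Invented sample records of the "event you attended in your twenties" kind.
records = [
    {"subject": "attendee-1", "event": "rally", "collected": date(2005, 6, 1)},
    {"subject": "attendee-2", "event": "rally", "collected": date(2014, 6, 1)},
]

def purge_expired(records, today):
    """Keep only records still inside the retention window.

    Everything outside the window is dropped -- the opposite of the
    "not to throw any data away" approach quoted above.
    """
    return [r for r in records if today - r["collected"] <= RETENTION]

kept = purge_expired(records, today=date(2015, 4, 1))
```

The design point is that deletion is the default outcome of time passing, enforced by the system itself, rather than something an individual must request and hope for.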
Oversight: There has to be some robust and effective level of control that is independent of the national security agencies themselves. The idea that security agencies would self-regulate their behaviour when they are charged with spying on your lives as effectively as possible is frankly absurd. It did not work in the United States and it is unlikely to work in Canada. Some sort of independent oversight mechanism has to be in place and has to be respected.
Oversight continues to be one of the Achilles heels of Bill C-51: “14 of the 17 agencies listed … that will receive information for national security purposes are not subject to dedicated independent review or oversight. To fill that gap, the jurisdiction of one or more of the existing review bodies should be extended to include the 14, or a new expert review body with horizontal jurisdiction should be created to review the lawfulness and reasonableness of national security activities” (Source).
Transparency: The response to the uncovering of government surveillance programmes might well be to try to hide these programmes deeper in the shadows of legislation. We need to reverse this trend, drag the panopticons out of the shadows, and reaffirm some level of control over our own personal information. You should have the ability to see who has access to your data and you should have the right to limit access to your data within reasonable, law-governed confines. To put this into some perspective, in the IT industry we talk about dark shops quite a bit. These are IT departments processing data that are hermetically shielded from the outside world, so as to avoid security risks. We do not want government surveillance programmes to be a series of hidden, dark shops that operate with autonomy. In private industry, dark shops are exceedingly well-regulated, and you know exactly what goes in or out.
Where can you turn? What can you do?
If you want to complain about companies invading your privacy, where do you turn? Typically, to your government. In the case of big data and the surveillance of everything, doing so may prove to be difficult:
“Had lunch with a former student a few years ago. He now works at the National Security Agency, which has grown dramatically since 9/11. He told me there is a big revolving door, with many employees leaving NSA to join Facebook and Google, and vice versa. I asked, ‘Why is that?’ and he replied without a trace of irony, ‘Well, we’re all doing more or less the same thing.’” (Source)
If your government is pretty much doing the same thing, then what? What is left are independent non-governmental organizations like the Electronic Frontier Foundation and OPENMedia.ca. During the Bill C-51 hearings, OPENMedia.ca delivered a petition with 100 000 signatures, including mine.
Bill C-51 is very likely to become law in Canada, but the passage of any one law does not end the public discussion of an important topic. If anything, the balancing act between reasonable data collection through big data and the surveillance of everything is part of a discussion about individual privacy which we have only just begun. Take action to educate yourself, learn how to protect your own privacy and become involved! And support the people who support you!