Darker Side of Digital: Human Rights Implications of Technology in Canada & Abroad

Good evening, everyone, and welcome to our panel. Thank you so much for being here; it's wonderful to see so many people here tonight. Normally we start these events by acknowledging the land on which the University of Toronto operates as the traditional land of the Huron-Wendat, the Seneca, and the Mississaugas of the Credit River. But I'm also mindful that acknowledging specific Indigenous communities that have claims to certain territories is itself a colonial practice, since such statements generally acknowledge ownership of land, which does not reflect Indigenous ways of understanding relationships with ancestral territories. So instead I'd like to acknowledge the various Indigenous communities that have lived here before us and keep living here now, and to say how grateful and how privileged I am to have the opportunity to live and work on their land.

My name is Samer Muscati, and I'm the director of the International Human Rights Program (IHRP) at the Faculty of Law. I've been here for two years, and it's been such a privilege to work with some of the law school's most thoughtful students, faculty, and staff, inspiring alumni, and a host of partner organizations in Canada and abroad. Our fundamental priority at the IHRP is impact: we choose all events and projects with careful consideration of their potential positive impact on our students' legal education, as well as on advancing human rights issues. Tonight we're able to do just that. The IHRP, along with our partner Human Rights Watch, one of the world's leading research and advocacy organizations, has come together this evening to present a timely discussion on the darker side of the technology revolution, which has permeated so many aspects of our lives. On a personal note, it's perhaps with some irony that I'm introducing this event, given my ineptitude with technology, including
the one time I was tricked into clicking on some malware, resulting in my former employer being hacked by the Syrian Electronic Army. It's a true story; apparently, for the tech lovers in the group, it's called spear phishing. Who knew? I want to thank Human Rights Watch for not firing me after that unfortunate episode, but also for the collaboration and partnership. Our esteemed panelists: Lisa Austin, from the U of T Faculty of Law; Lex Gill, who is a fellow at the Citizen Lab; and Felix Horne, who's a researcher at Human Rights Watch, all of whom we'll hear more about in a moment. And our team of volunteers, Allison, Daniela, Cara, Gabriela, Lauren, and Petra: this would not be possible without you all. Finally, as this panel is about the digital, we want to be as online as possible, so a little housekeeping: the event is currently being streamed, perhaps somewhat ironically, on Facebook, and we're live-tweeting using the hashtag #DarkTO. For the question-and-answer period, I encourage you to tweet your questions using the hashtag, to Human Rights Watch Canada, or as comments on the Facebook live stream.

Now I'd like to welcome our moderator for the evening, Stephen Northfield. Stephen is the digital director for Human Rights Watch and is responsible for the digital operations of the organization. With his journalism background at the Globe and Mail and his current role, Stephen is uniquely placed to guide us through the human rights implications of digital technology. Please give him a hand and welcome Stephen. Thank you.

Thank you, Samer; thanks for the kind words. I'm very excited to be here, and honoured to be on a panel with such esteemed experts in this field. I think we're going to spend a lot of time delving, as we've said, into the sort of darker side of technologies, but I thought that I would first jump
into the deep dark end of that subject matter and take a look, just very quickly, at some of the ways in which technology can actually be a boon for the defense of human rights around the world, and some of the ways in which my organization, Human Rights Watch, is using it. The core of our methodology is a researcher like Felix going into the field gathering evidence: physical evidence and witness testimonies are at the core of it, hearing those stories from victims and from witnesses of crimes, then corroborating, confirming, and publicizing them in order to effect change. That's always going to be the core of what we do at Human Rights Watch. But I think increasingly what we're seeing is that technology can be a real help to us in being able to investigate these crimes and effect change, and so, in the very short amount of time that I have, I want to touch very briefly on some of those technologies and what we're doing with them.

I think one of the highest-profile ones right now is satellite imagery: the use of satellite imagery to investigate human rights abuses. Probably the best recent example of that is the crisis in Burma with the Rohingya. We have a full-time satellite imagery analyst, and we were able to use satellite technology to document the destruction, through ethnic cleansing by the Burmese army, of Rohingya villages in Rakhine State. This technology is very effective in a couple of ways. One is that it's quite irrefutable: when you show a satellite image of a village from last year that is fine, and then a picture from two weeks ago that shows nothing but burn scars from one end to the other, that's very compelling and irrefutable evidence, and we've found it to be very powerful with policymakers in changing people's minds. But it's also turned out to be an amazing storytelling tool, because part of our job is to reach the world, a general audience, and seeing the evidence so stark makes a lot of these crimes very concrete. So satellite imagery has been a very powerful tool for us. One
of the other things that we're doing right now, which is really nascent but I think is also very interesting, is the use of drones. Human Rights Watch has a little mini drone fleet now; we have three drones that were donated to us by somebody who wanted us to pursue this tool in our work. The first application was a very interesting project that we did this year in Lebanon, where we used the drones to document about 800 trash sites across Lebanon where they were burning garbage, with a really profound health impact on local residents. Being able to use this tool to show those things from the sky has been a very powerful way for us to push some of our advocacy. And we can see all sorts of applications for this: everything from monitoring police conduct at protests, to getting into secret detention camps to monitor them, to tracking the size and movements of displaced people and refugees in the crises we've seen around the world. Drones can help us gather evidence in all of those, but I think what they're really going to help us do is get into places that we actually can't get access to, or that we can't get safe access to. So it's an exciting area for us.

One of the other things we do a lot of work in is social media: being able to use that pool of data. We see thousands of social media posts, pictures, testimonies, and a lot of video every day; in conflicts like the Syrian conflict there are thousands of posts every day, and we have been able to successfully harvest that. Obviously it's fraught with problems in terms of being able to verify some of that information, but we've developed a certain forensic capacity in being able to trawl through social media, and we've used it in investigations of protester killings in Egypt, unlawful
airstrikes in Syria, and to document some of the facts around the killing of Muammar Gaddafi in Libya, ISIS mass killings, and others, and it has turned out to be, again, another very effective tool. The final area I wanted to mention is big data. It's a real buzzword, but it has become one that we have found increasingly useful: taking large datasets from various sources and analyzing them for trends, or using them to establish certain facts. In some of the most recent work we've done there, we did an interesting analysis of Chinese government job listings to document the gender bias that was inherent in those. We also recently scraped a bunch of data from Medicare and Medicaid to establish the prevalence of the use of antipsychotic drugs in U.S. nursing homes; in fact, we put out a report on that this week, and we established a lot of that research using big data analysis. We've also been looking at a lot of U.S. immigration data, in light of what's happened under the Trump administration, to show some of the abusive policies directed towards immigrants in that country. So with that I'll leave it; I'm sure we'll get back to some of these, and I will turn things over to my colleague Felix.

Thanks, Stephen. My name is Felix Horne; I am the Ethiopia and Eritrea researcher for Human Rights Watch. My adventures with surveillance, I guess, began in 2012. Ethiopia is a surveillance state with pervasive surveillance, or at least that's the perception, and we decided we wanted to spend a considerable amount of research time trying to assess the actual surveillance capabilities of the government of Ethiopia. If you talk to any Ethiopian, they have this sense that all telephone calls, all emails, all Facebook posts are monitored by the government, so we wanted to get beyond that perception and see what the reality was. The other thing was that surveillance is a very abstract concept. You all might have the sense that we are being watched by governments or others, but there was a sense that, at least in Ethiopia, the surveillance was not abstract: a number of individuals were being arrested based on what they were saying in these communications, or who they were saying it to. So we wanted to delve into that.

Now, the report, which ultimately came out in 2014, is called "They Know Everything We Do," and we'll talk a little bit about specifics later, but it was intended to look at, as I said, telecom surveillance, but
also at how they controlled the flow of information: things like the jamming of radio stations, including Voice of America, Deutsche Welle, and various diaspora stations; the jamming of television stations, often based in the diaspora and carrying messages independent of the government; and the blocking of numerous independent websites. Also, because there's only one telecom provider in Ethiopia, and that's the government, they frequently just turn off mobile communications and the internet completely when it suits them. For example, for large periods of this past December, when there was a bit of tension in the country, they cut off mobile communications completely outside of the capital, Addis Ababa.

What we do with our research is talk to victims, in this case of surveillance; we also talk to witnesses, and hopefully to perpetrators. It was surprising to us how many former Ethiopian security force and intelligence officials came out to talk about their surveillance practices; it was not difficult to get information from them. We were looking for information on the high-tech tools they were using, but most of them would say to us: look, we have these tools, we can use them, but we don't need to use them. We know everything that's happening in every single community. We know what people are thinking before they're thinking it. We heard this repeatedly from different officials.

So Ethiopia is a country where this grassroots system of surveillance dovetails with these more high-tech approaches to surveillance. In Ethiopia there's a system called "one-to-five," where one person is responsible for monitoring the activities of five other people, and any sort of anti-government, deviant behaviour is reported up the line; it's sort of a pyramid. Large parts of the country are covered by this, so you don't know if the person you're going out for coffee with is one of the five, or the one, or another one within the five. So there's this perception that everybody is watching everybody.

In doing this research, and I don't have a tech background or a surveillance background, it was fascinating. My perception of surveillance was: I'm the person, I'm the researcher, I'm researching this in Ethiopia. But I started to realize that no, I'm also being surveilled; everybody's being surveilled. The diaspora is being surveilled: there are various bits of spyware, which we'll talk about later, that are being used by the Ethiopian government to surveil people here in Toronto and elsewhere outside of Ethiopia.

There was one former official with Ethio Telecom, which is the telecom company, who, outside of Ethiopia, was telling me how easy it was to monitor everybody's phone calls. I had given him my business card, and he looked up the phone number. He had his laptop there, and he logged on to the customer database, from, again, outside of Ethiopia. He was no longer an employee of Ethio Telecom, he had fled, but he still had access. He put in my phone number, and he showed me all the phone calls and messages I had made over the previous three years, which
isn't many, because I'm very careful with how I communicate, but it was still very easy for him to access, including the locations of the people I was speaking to, and their names and ethnicity. So yeah, really, really frightening stuff. Maybe I'll leave it there for now, and we'll talk more about the specifics of our findings later. Thank you.

Can people hear me all right? Yeah? Okay. Thank you. Okay, well, first of all, thank you for the invitation to be here, and to the organizers, and thanks to all of you for being here. I'm really grateful and excited to share this time with all of you. Maybe I'll start by talking a little bit about the Citizen Lab, which, if you're not familiar, is an interdisciplinary research laboratory based here at the Munk School at the University of Toronto. We do technical research and development, and high-level strategic and policy engagement, around a range of issues relating to technology, global security, and human rights. What that means in practice is work around targeted threats against civil society; filtering and internet censorship; privacy and security of mobile applications; and transparency of governments, both at home and abroad.

My background is a little bit weird and eclectic. I have a law degree from McGill, but my first love is kind of economics. I was, until recently, at the Canadian Civil Liberties Association, and I just noticed Brenda right there, which is very cool; you're all in very good company. I was working on national security there. I've been at the Canadian Internet Policy and Public Interest Clinic and the Berkman Klein Center as well. So it's kind of a weird mix of technology and human rights work that, when asked, I basically just describe as science-fiction law. And that's more or less what I've been up to. It's
difficult to whittle down my work into a specific project, but I thought of three things I've been really proud to work on lately. The first is a lot of work around encryption and anonymity tools, which hopefully will culminate in some public reporting from the lab quite soon. These are technologies that are really fundamental to the protection of human rights and global security, but they are also very frequent targets for government criminalization and censorship. So my work there is everything from when a cop can force you to hand over your password, to whether the government can tell a software company how to write its code.

I also was recently part of a team involved in producing a submission to the Special Rapporteur on violence against women, who recently put out a call on online violence against women, which is a sort of contested term. We were really lucky to have the support of the International Human Rights Program here at U of T, and Chelsea, who is our student this semester on that project. The Citizen Lab's interest in engaging in that conversation is really because narratives about women being vulnerable and being victims are often used to justify new kinds of surveillance and censorship powers. So we really wanted to introduce a kind of counter-narrative, based on this radical idea that women might have other rights beyond being safe, like privacy or freedom of expression. That project is now emerging into a legal analysis around stalkerware, which we can talk about a little later.

I've also done a bunch of work on signals intelligence recently. If folks are familiar, there's a new national security bill called C-59. People might be familiar with the debate around Bill C-51, the very controversial anti-terrorism law from 2015; C-59 is sort of the current government's answer to some of those issues. In reality, it improves some things, makes a lot of things more complicated, and makes some things quite a bit worse. We can talk about that after, but we've particularly focused on the role of signals intelligence agencies, and in particular the Communications Security Establishment, which is sort of like the Canadian equivalent of the NSA.

But instead of talking about any of those specific things in more detail, I just wanted to share a couple of small reflections to help set the table for this talk, on this idea of the darker side of digital. The first is that I'm not quite sure there are two sides, a light side and a dark side; instead there are maybe multiplicities, or complexities. Technology law and policy is always very difficult because the boundary lines change
and shift; what is offline and what is online is not always clear. The law students in the room will understand that there are tremendous issues around jurisdiction, and that the issues around competing rights get very muddy. And also, just to say that technological change is rarely good or bad, but it does have certain tendencies. One is that it deterritorializes and it disembodies; it has a sort of exponential effect; it increases the complexity and scale and velocity of problems. There's a cryptographer, Phil Rogaway, who says that cryptography rearranges power. I think that's actually true of all technology, that technology rearranges power, and that's a useful lens for thinking about the work that we do.

And I guess the last thing I would want to say is that I recently heard this really powerful interview with Rebecca Solnit, who's written a lot on this idea of darkness. She talked about wanting to rescue the idea of darkness from the pejorative, for all kinds of reasons, and she wants to characterize it not as something necessarily nefarious or malevolent or evil, but as a kind of unknown: a sense of possibility, or uncertainty, or unknowability. I think that's maybe a more fertile environment to work in for this conversation; it's where I try to situate my work: to see the future, and probably parts of the present, as having this dark side that it's a little easier to see seeds of hope and possibility and change in. So yeah, I think that's all I want to share for now. Thanks.

Thank you. I wanted to also say thank you to the organizers for inviting me. I'm Lisa Austin, one of the law professors here at the University of Toronto, and I work a lot on privacy issues. I was thinking about the variety of projects I've done over the years and tried to identify some of the themes that have obsessed
me, because I tend to get drawn to projects thinking, oh, this is another example of this theme that obsesses me. One of the themes that obsesses me, in thinking about privacy and technology generally, is: what happens, what gets lost in translation, when we shift from what I sometimes like to call the analog world to the digital world? If we're walking down the street in the analog world, you understand what it is to walk down the street, and you understand that the person who passes you can see you, and you have a social understanding of what that might mean. You can anticipate that it's probably unlikely that some weird person is hiding behind a tree, but maybe they are; and you know that the person behind the window over there might peer out at you casually. You have a sense of, you can anticipate, your likely audience, and you can understand what it is to walk in public. But what happens when you shift to a digital world, a world where you have pervasive cameras everywhere, and you don't really understand anymore what it is to walk in that public space? You don't know who's watching you, you don't know what's going to happen, you don't know who the audience is, you don't know if they're going to keep that information, you don't know if someone's even got the camera turned on. What does it do to our understanding of what it is to be in public when you add these technologies?

And now think about what it's going to mean when we have smart cities, like the Sidewalk Toronto development, with widespread sensors everywhere in our city space. What does public space turn into? When we think about privacy, a lot of our models of privacy are built on an analog world, and it's very easy, in the legal doctrines and theories, to say: oh, while you're in public, you don't have a privacy interest. That made a certain amount of sense in the analog world, but in the digital world it doesn't make a lot of sense; it's just an argument to permit fairly broad and widespread surveillance. So a lot of what I try to do is to look at what's lost in translation from the analog to the digital.

I think the metadata debates that have been going on are another place where this happens. Metadata about communications is information about our communication. So what I say to Lex would be the content, but that I had a conversation with Lex would be the metadata: the information about that conversation. In the traditional, analog world we think, well, it's the content that is so important, that's where our privacy interest is, and there's a lesser interest in that other information; it's simply not as revealing. But in the digital world, when you start collecting that information, it's often structured in a way that allows you to collect it, stockpile it, and analyze it in ways that are highly revealing and unanticipated. So the idea that it's not as private as content just doesn't work anymore.
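[The point above, that aggregated metadata can be as revealing as content, can be made concrete with a small sketch. Everything here is invented for illustration: the record format, the contact labels, and the numbers are assumptions, not real data. The idea is simply that who you call, when, and for how long can tell a sensitive story even though no content is ever captured.]

```python
from collections import Counter

# Invented call-metadata records: no message content, only who was
# called and for how long (all values are illustrative).
call_records = [
    {"to": "oncology_clinic", "day": "Mon", "seconds": 310},
    {"to": "oncology_clinic", "day": "Mon", "seconds": 290},
    {"to": "insurance_line",  "day": "Mon", "seconds": 1200},
    {"to": "family_member",   "day": "Tue", "seconds": 45},
]

def metadata_profile(records):
    """Aggregate bare metadata into a crude behavioural profile."""
    call_counts = Counter(r["to"] for r in records)
    talk_time = Counter()
    for r in records:
        talk_time[r["to"]] += r["seconds"]
    return {
        "most_called": call_counts.most_common(1)[0][0],
        "total_seconds": dict(talk_time),
    }

profile = metadata_profile(call_records)
# Repeated calls to a clinic followed by a long call to an insurer
# suggest a health event, with no content intercepted at all.
```

[Real-world metadata analysis operates at vastly larger scale, but the mechanism is the same: structure plus aggregation turns "less private" records into a revealing profile.]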
It's not that how we thought about it in the analog world was wrong; it's that we're not in the analog world anymore, and we need to think about these things differently. That's a lot of what I try to mull over, but I think it's a much broader conversation than privacy. It's a broader conversation we have to have about technology more generally, and it's something I try to think about these days in relation to how we regulate, and our models of regulation. Because as lawyers, you know, we have a lot of ways of regulating that are really built on understandings of human decision-making: the frailties involved in human decision-making, and how we keep human decision-making accountable and transparent. It's built around how we use words and how we interpret texts. But how we regulate the digital world isn't necessarily going to involve those things or those skills. And that's where I'd maybe switch from the dark side of technology to maybe Stephen's story: maybe we need to enlist technology in creating new modes of accountability in this digital space, and that's some of the work I'm moving towards doing as well. I'll leave it at that general level, but I know we're going to dive into some specifics in the questions.

Thanks, Lisa. So it is a very big topic area, very broad, very complex, but before we dive into some of the detailed work, I wanted to frame this with more of a reflection on where we're at in terms of how we relate to the idea of privacy. From my own personal experience, I find that the people I know fall into two camps. There's the camp that is completely blasé about what corporations and governments know about them; they tend, I think, to be younger, like my three teenage children, who have lived in this world all their lives. They're not worried about it; they just accept it; they see no way of pushing back against the invasion of privacy. So there's that side of the people I know, and then there's the other, completely paranoid side, who feel like we're on a slippery slope towards a sort of Orwellian dystopia, like it's the end of the world, like a bad episode of Black Mirror. So how worried should we be, is what I'm asking. Should we be at DEFCON one, or are we at DEFCON five? How concerned should we be, at this moment, about our own personal invasion of privacy from these different sources, just in a very general sense? Either
Lisa or Lex.

So first, I think I want to resist this idea that people don't care. I think there's a difference between being jaded and disenfranchised and worn down, from feeling like you have to participate in this kind of transaction to participate in public life, and thinking that that's actually a fair deal. And also to point out that there is a kind of reasonable, evidence-based paranoia you can have; maybe we can come back to you on that, since it's certainly something you've experienced in your own work. But on this idea of an Orwellian world: there's that saying, the future is already here, it's just not evenly distributed. I think that dystopia is already here, it's just not evenly distributed. And I mean that both abroad, where, if you look at how governments like Ethiopia's or the Chinese government operate, and in countless countries worldwide, we already see the trappings of Orwellian surveillance states, that infrastructure is already there; but I think it also exists to some degree at home, in North America. So, how much should you worry? Well, the answer to that question is: it depends on who you are. Because certainly in Canada, as in the United States, people of colour, Indigenous people, poor people, people interfacing with the criminal justice system: these are people who are subject not only to disproportionate surveillance but also to disproportionate consequences for that surveillance. And I think it's really important to think about privacy not just as a sort of isolated right, but contextualized in this conversation about freedom and expression. So anyway, I think I just want to put that on the table to start with: that there is a healthy spectrum of paranoia, through an inequality lens, that we should talk about. I
don't know if you have other reflections on that.

I agree with that; that's an excellent comment. I would just add that sometimes, when people say people don't care about privacy, they have this kind of almost quantitative notion of privacy: that if I share information with other people, then somehow I don't care about my privacy in relation to that information. But privacy is highly contextual.

It depends, right? You might want to share information with someone for a particular purpose, but you don't want it shared with somebody else. That you shared it in one context doesn't at all mean that you don't have any privacy interests in relation to a different kind of audience. I don't care if my Presto card collects information about me if it's going to improve the efficiency of Toronto transit; I would applaud that. I care a lot if they're handing it over to the police without a warrant, as Presto has been reported in the media as doing, and I think a lot of people would care about that as well. So this sense of "look what people are posting on Facebook, they don't care about privacy" is a very simplistic way of thinking about what are actually the complexities of privacy. The other point I would make is that a lot of people just feel they have no meaningful choice. The idea that, okay, I've done this, or I've consented, I clicked "I agree" in order to participate in a particular thing, doesn't necessarily mean that you had a real, meaningful choice to do so in a way that was more privacy-protective. Informed consent and meaningful choice are just not the same kind of thing.

I think this is a good segue back to some of the work that Felix was flagging that he's been doing, and a look at a community that has been profoundly affected by it: relative to our experience, obviously, a pretty extreme example. So, Felix, it would be helpful if you could go back and talk about where the Ethiopia work came from, why Ethiopia, and a little bit of how you did the research. I was particularly interested
in going back to what you were saying about how this affects individuals, because I think that's what we're here to talk about: how does this affect individuals, both within Ethiopia and outside as well. So if you could just walk us through a little bit of that.

Yeah, sure. I mean, as I said, there's been this perception of pervasive surveillance in Ethiopia, but there's very little hard evidence to actually indicate that it was happening. I would interview Ethiopians: so why do you think every phone call is monitored? "It's obvious, every call is monitored." It's almost a sort of paranoia. But trying to understand where that came from, as you dig a little bit deeper, you understand: well, everybody knows somebody who had been talking to their relatives in America on the phone, and somebody comes on the line and says, stop talking about that, why are you talking about the government in a bad way. Or their neighbour was talking to a cousin in Kenya, across the border, and very shortly thereafter was arrested based on that telephone call. So everybody has these sorts of experiences, which they're not terribly willing to share because, again, they think that somebody is watching or somebody's listening; but everybody has this type of experience.

As we began to dig deeper into this, we talked to dozens and dozens of individuals who had been arrested, either for talking to an individual the government didn't like, usually someone outside of the country, or because of the content of the phone calls. Now, Ethiopians don't often speak openly on the phone, so everything is in code words and analogies. I had a call with an Ethiopian the other day where we talked about football for 30 minutes, soccer for 30 minutes; I didn't know if we were talking about soccer or about human rights abuses. Maybe it was completely useless. But the government will interpret these sorts of phone calls however it wants to.

We talked to a lot of people, and there's one person in particular who has really stuck in my mind. He was a young journalist, and in 2012 he wanted to attend a protest; at that time in Ethiopia, protests were quite rare. As soon as he got there and pulled out a camera, he was arrested by intelligence. His camera was smashed and his phone was destroyed. Then he was taken to the local police station, where he was hung up by his wrists, stripped naked, and whipped while they questioned him about all the phone calls he had been making over the previous three weeks: to activists, to a Muslim rights activist outside of the country who is Ethiopian, and to several international human rights organizations.
We were not one of them, but several international rights organizations. "Why are you talking to these people? These people are against the government." So he spent three months in that police station. After he was brought down from his whipping, he was put in solitary confinement. I thought about him this week as we were preparing this talk; it would have been good to get in touch with him. But when I interviewed him, which was in Kenya about a year after he had had this awful experience, he had made the decision that he didn't want a phone; he was shutting himself off from that world. So here in Canada I obviously had no way of getting hold of him on short notice. His experience of telecom surveillance, as a well-connected journalist, was such that he would no longer participate in that world. And Kenya's world is very interconnected too: you do everything by telephone, you move money, everything. So it had a massive impact not only on his life at the time but on how he viewed his place in the digital world into the future.

Now, there are lots of legal processes in Ethiopia; the government is fond of pointing this out: you need court warrants to facilitate surveillance. But in practice that just doesn't happen. Intelligence officials just walk in the front door of Ethio Telecom and access all the information they want.

The other thing: I live in Ottawa, so I hear the discussions of privacy and surveillance here in Canada or in the United States. There are basic protections that we have; we can debate around the edges whether those are enough or not, but there are some protections. One of the challenges in Ethiopia is that there are virtually no protections on freedom of expression, freedom of association, freedom of assembly, on who you talk to. There are so many cases of Ethiopians who were charged with terrorism because they talked to somebody on the phone the government didn't like: a member of a legally registered opposition party, or a journalist, local or international. They were not engaged in anything that would remotely be considered something that should get you charged with terrorism. And then they get convicted and spend long periods of time in prison, because there's very little independence in the judiciary. So those protections that are in place in many countries are not in place in Ethiopia, which makes these surveillance practices extremely worrying. I was very surprised at how widespread it was.

And what does it do at a societal level, to the relationships between people? The way you describe it, it sounds like it tears at the social fabric of community: paranoia about everything, not being able to trust anybody.

It does. It's incredible. When I do interviews I often have a translator, depending on the language I'm doing the interviews in, and if the person doesn't know the translator, they're just not going to say anything. If there's somebody outside the place where we're doing the interviews, standing on the street corner, whom they haven't seen before: paranoia at every single level. It really affects people's ability to talk about issues, to do business, to go to the market to sell produce. It impacts every facet of life. I think some of those things have been breaking down in the last couple of years; social media has become more of a thing in Ethiopia. It's interesting, on the concept of privacy, the perception versus the reality: there's still a perception that Facebook is a relatively private space. That's changing, there's more of a perception now that it's not a private space, but people will say things on Facebook they would never dream of saying in person or through any other medium, which is fascinating.

But yeah, it really has a very profound impact, and it's not just the activists and the journalists, those you might perhaps expect to be paranoid; government officials are spying on each other, everyone's spying on each other. It really impacts people's ability to live their daily lives.

We can understand the notion of a government exercising control within its own domestic borders, but I think the thing that is chilling about Ethiopia is the impact it has through the international diaspora. Tell me a little bit about how they've been able to reach past their borders.

Yeah, it's fascinating. The government has controlled most of the spaces in the country where dissent can be expressed: it has effectively limited independent civil society, media, political parties, et cetera. So for the longest time, and it's changed a little in the last two years, any threats inside the country were contained, and there was a perception of stability within the country. But there are hundreds of thousands of Ethiopians who have left, for economic reasons, fleeing repression, lots of different reasons, who are incredibly engaged in the affairs back home. They're in Canada, in the United States, in many African and European countries; they're all over the place. And the government began to realize that they are a threat: they're the ones starting independent radio and television stations, they're the ones active on social media, on Facebook, and they're not censored there. They're very open; they do not fear the government there.

There were also concerns about opposition parties being funded from abroad, some legal, some not, some armed. So the government began to use spyware to target these members of the diaspora. The simple definition: you get an email, it might have a link or an attachment, you open it, and your computer or device is infected. Then they can access essentially everything on that device: your files, your emails, your Skype conversations. They can log your keystrokes, they can turn on the webcam to see what you're doing, they can turn on the microphone to hear what's being spoken. It's very powerful, and of course these products are very, very difficult to detect. And they have used these products, which are commercially available, to target members of the diaspora in dozens and dozens of countries, including here in Toronto. Citizen Lab are the experts on this; we've partnered with them on some things, and they're the ones who have really done a lot of good work here. Some of this commercially available spyware comes from companies in Italy, in Germany, in the UK, and most recently Israel: there was a report that came out in December from Citizen Lab that looked at some of the spyware used against Ethiopians, purchased from an Israeli company, and many of the targets were in Canada as well. There was one in Toronto: a computer, we don't know who the person is, that connected from York University pretty regularly, and then over Christmas of 2016 went to Eritrea and then came back to York University. We still don't know who that person is; maybe today we'll figure out who it is.

But this has a chilling effect on the diaspora as well. A lot of the diaspora ten years ago thought they were completely free to say what they wanted and to engage in politics how they wanted, and many now are really self-censoring their activities because of this perception that they may be surveilled. And when these Citizen Lab reports come out, which are excellent, we get a flurry of emails from Ethiopians saying, "I think I'm one of them, I think I'm one of them," and most of the time they're not, but it has the same effect: they curtail their activities. If you're interested in reading that report, it's by Bill Marczak and colleagues, really brilliant people; it's called "Champing at the Cyberbit," and it's on our website. It's an excellent report.

I think Ethiopia is a very specific and probably very extreme example of a worldwide phenomenon, which is that advances in technology are available to governments around the world, giving them increasing power to conduct surveillance, directly or indirectly, and to gather data on us and our lives. So I just want to ask Lex and Lisa: what are the implications for average citizens, and who is most impacted by this trend towards the incredible power that governments now have? And we'll obviously talk a little bit about corporations later.

Okay, I'll take a stab at that. In terms of who's most impacted, I sort of tried to get at this a little earlier.

But maybe what's helpful to understand is that there's a sort of trickle-down effect going on with surveillance technology, and censorship technology in particular, where you have spyware, surveillance, and hacking tools that are generally first developed for nation-states to do nation-state stuff, and then invariably end up repackaged and resold as commercial and enterprise solutions on grey markets for workplace surveillance. Eventually we even see those things creep into markets like what we call stalkerware: basically spy tools to monitor your ex or your spouse. So there's a proportionality issue there, but I think it's important to understand that this stuff exists within a market unto itself. There's also a sort of pipeline, sometimes called function creep, from military applications to intelligence, to serious criminal investigations, to everyday neighbourhood policing. That's how you end up seeing, and drone technology is a good example of this, aerial surveillance, satellite surveillance, IMSI catchers, facial recognition. These technologies are deployed first on foreign battlefields, in refugee camps, in places where people are invisible, where their rights are not particularly well respected or accounted for, and as they become normalized and commercialized they bleed into our everyday lives. By the time you've got a middle-class white dude in North America concerned about drones, that's been somebody's life for the last ten years. So I think it's really important to understand that there is an economy and a framework around this. Who is most impacted? Often the people who are most invisible, the people who count the least.

My reflection on that: I think that's true; there are enormously important equality issues around all of this. I just want to add a note, because I'm obsessed with metadata, that there are a lot of new methods of surveillance that catch up people who themselves are not under investigation. You have all these different kinds of bulk surveillance techniques, as simple as, say, a cell tower dump, where you get information about all the users associated with a particular cell tower at a particular time, because you're then going to do a certain kind of analysis to figure out something about a particular individual. When the Snowden revelations came out, people started calling this collecting the haystack in order to find the needle. There are a lot of these techniques, and they're enormously useful, but they've changed the way we think about surveillance, because you catch up a lot of innocent people, people who are by definition innocent, not "I was targeting you and you turned out to be innocent," and we need their data to make some of these methods work. I think that's very different from older methods of targeted surveillance. We have had examples in Canada: CSIS had a database that the Federal Court called them out on last year as being illegal, where they were holding on to metadata about people who had been caught up in investigations. Maybe they had been cleared, maybe they had just been caught up in investigations of other people, but CSIS was retaining that data in order to analyze it in different ways, and these were people not currently under investigation or suspicion. So there are a lot of ways in which the analysis of metadata is really useful, it allows for all sorts of government analytics, but we're using it in relation to people who are not under any suspicion. That's a sea change in how we think about this, and I think it calls for some thinking about how we protect people.

So obviously there are examples, to go back to Ethiopia, of a government using this to consolidate its power, to oppress its people, and to control the country and its citizens. But with some of this surveillance, some of this data gathering, there are legitimate security concerns. We all have a stake in security. There are bad actors out there who want to do bad things, and the government has an obligation to protect its citizens. So, and I know this is a big question: where is the balance between our right to privacy and security, for which there is a legitimate case in the application of some of these techniques?

In Ethiopia, that's what the government says: these people are against the government, they're, quote unquote, terrorists. In all walks of life the government conflates peaceful expressions of dissent with terrorism, so it's a broader issue. But as I said earlier, in places where none of those rights are protected, the line of what is and is not appropriate in terms of surveillance for security reasons gets really messed up. There are clearly cases, and Ethiopia obviously doesn't market this, but it's pretty common knowledge, where they have intercepted communications that tipped them off about potential attacks. The United States, of course, also works with Ethiopia to intercept a lot of signals from the entire Horn of Africa and the Arabian Peninsula, and that information does get back to Ethiopia. So there is a lot of intelligence they get that does prevent these sorts of attacks. But yeah, it's a difficult one. We've had a very hard time pushing governments around the world to push Ethiopia to begin to curb its surveillance powers, because Ethiopia uses the exact same lines about where that line is that all the Western governments use as well.

Sometimes when I answer this question this way it sounds like I'm dodging it, but I'm really not, because I think it's important to reframe, to question, this idea that there is always an appropriate trade-off between privacy or other human rights on one side and security on the other. In many situations in my work it ends up actually being a lose-lose situation, and so it comes down to a bit of a fight about what we mean by the word security and whose security we're talking about. Certainly that's the case, for example, with encryption technology. Governments for the last four decades have been talking about how strong encryption interferes with their ability to investigate crimes and conduct useful intelligence operations, and have proposed all kinds of curious and sometimes extremely problematic solutions to this "encryption problem," the problem that there is information the government can no longer get. What's interesting about that calculation is that it's a very narrow view of what it means to be secure: security is where the government has access to all the information. And I would concede that encryption technology probably results in some cold cases in some situations, but the balance sheet is off, because there's no calculation of the vast benefits, economic and in terms of crime prevention, that encryption technology is doubtlessly involved in, everything from the protection of people against blackmail or identity theft to the broader economic benefits of being able to do things like online banking safely. There's also a failure to calculate the vast human rights benefits that we gain from these types of tools. So to get back to the question: sometimes when we do the privacy-security trade-off, you actually lose it all. Encryption actually keeps us safe, so by undermining it you end up with less of both.

But then there's this other piece. Where the government engages in surveillance, there's been a long-term civil society conversation around it, and there's a set of principles, sometimes called the 13 Principles or the Necessary and Proportionate principles, and without getting too much into the weeds of that world, those are probably the values we want to start with: is it necessary, is it proportionate? I don't think we exist in that climate right now, certainly not here in Canada, and definitely not elsewhere.

I guess all I would add to that is, if I put this in a constitutional framing and keep it in Canada: we don't have a right to privacy; we have a right to be secure against unreasonable search and seizure, which has been understood to protect a reasonable expectation of privacy, which in turn is understood to involve balancing privacy interests against state interests, including law enforcement and national security. So even at the constitutional level, balancing these ideas is at the heart of what we do; that's part of what the liberal democratic framework is. The interesting question is in Canada, not Ethiopia, where it sounds like they're not looking for anything remotely like a balance. We have a balance, but I would go back to saying: we think about this balance as we've constructed it in an analogue world, and then we just kind of assume we can migrate it over into this digital world, and we need to be really worried about what's actually happening when we migrate it over. So for example, in Bill C-59, which Lex mentioned, the Canadian spy agencies are being allowed to look at "publicly available" data on Canadians, and we put big air quotes around "publicly available information." I have a big issue with that, because this idea that there's no privacy interest in information that's in the public sphere in some form is a problematic idea of privacy in this digital space. What it does is augment surveillance power and tilt the balance, not translate the balance. And I think that's what we have to keep our eye on: what's getting lost in translation? Are we really just shifting what we already had and reproducing it, or are surveillance powers growing because we're not paying attention to what's going on as we shift over?

That's interesting. So where are we with C-59? What is its status, and how would you characterize the current protections? For both of you: do we have a lot of work to do? Compared to other Western countries, do we have relatively progressive legislation around protecting privacy? Where would you situate Canada, and where do you see that whole debate going?

Okay, we're going to do some building blocks here first. Bill C-59 is the new national security legislation; it's the government's answer to Bill C-51. Bill C-59 contains all kinds of things, it's 150 pages long, everything from the terrorist entities list and the no-fly list to CSIS disruption powers. To talk about the issue of privacy in particular, I think we'll hone in on the CSE stuff. The way to characterize it is maybe two steps forward, six steps back; you can do the fractions. In some ways the legislation makes some positive changes. There's some good, which is new oversight and accountability functions that have been sorely needed and that academics and civil society have been asking for for a long time. There's the bad, which is that, particularly in the case of the CSE, which is again our signals intelligence agency, like the NSA, it really normalizes a long-standing practice of what they call unselected bulk collection, which you and I would probably just call mass surveillance: unselected mass collection of information about people, not on the basis that they have necessarily done anything wrong or that they're using a particular search term, but just sucking up the whole internet. Bill C-59 normalizes that conduct, which has privacy implications not just for Canadians and people in Canada but also for Canada's international human rights obligations. And then I guess there's also the ugly, which is that, as Lisa mentioned, C-59 creates a whole new exception allowing the CSE to collect what they're calling publicly available information about Canadians and people in Canada. This is, again, a foreign signals intelligence agency.

They're not supposed to be directing their activities domestically, and yet C-59 creates this fantastical exception that allows them, in effect, to collect everything from your credit score to your MySpace page from 2006 to your college radio show from 2003, as well as potentially information obtained through hacks or leaks or breaches; if you think about the Ashley Madison or Equifax breaches, that stuff is included there. And then the other piece, never before discussed in public life, is that the CSE is potentially going to get what they're calling cyber operations powers, which is a new universe of state-sponsored hacking that is in a lot of ways outside of the establishment's previous mandate. It would give the CSE the ability, with very, very minimal oversight, to hack foreign adversaries, to engage in what we might think of as cyber warfare, but also to hack for international-affairs-type reasons, for foreign intelligence gathering. It could allow them to modify the contents of websites; it could allow them to impersonate people online. So this is a big deal. It's something we ought to be having a big public conversation about, and that's tremendously difficult, I think we can all agree, because first I have to tell you that there is a signals intelligence agency in Canada, and then I have to tell you why this is problematic. So conversations like this are important. I don't know if you have other thoughts on C-59; I've been doing this a lot.

I mean, obviously the government will then have the capacity to collect a lot of raw data. Information about me is sitting in some database somewhere, but what powers, what restrictions are there on their ability to access that and use it for whatever purpose they might want? Are there restrictions on the actual use of that data that can give me some comfort about how private those communications and that data are?

Yeah. One of the challenges with C-59 in particular is that the CSE has a set of obligations to introduce privacy-protective measures under the law, but there's no information whatsoever about what those measures will look like; there will be subsequent regulation after the bill is adopted. So it's a bit of a "trust us" framework, which, at least from my perspective, isn't really good enough. And this is not to insinuate that the CSE is necessarily up to no good, but you're talking about one of the most powerful institutions in the world, with extraordinary surveillance capabilities and extraordinary technical abilities. You really want to get it right by limiting the scope of powers to begin with, rather than granting this full range of abilities and saying, "Well, we'll just trust you to use that within the bounds of the law." Maybe something to take away from this conversation is that "trust us" isn't good enough, because democracies are really quite fragile and institutions are fragile. If we look to our neighbours to the south, we see databases that were meant for things like traffic regulation being used to round up undocumented people. So I think we have to take a long-term view of what kind of infrastructure we want to be building.

I would add that one of the roles the CSE plays is what we call an assistance mandate: it can assist the RCMP, for example, so it can assist in domestic investigations. But then its ability to assist is going to be constrained by whatever legal authority constrains, say, the RCMP. And there are all sorts of constraints; this isn't easy. There are all sorts of constraints about, for example, how the RCMP could get access to that information and use it, and this gets very, very complicated, but yes, we're in a world with plenty of legal constraints. I guess the space that worries me in some of this, and I told you I have certain key obsessions, is how we deal with the terms of access to metadata in the Criminal Code and in Canadian law generally. We have a set of protective constraints that govern the terms of that access; are they the right ones? Those are the sorts of questions I would ask. But it's not a legal-constraint-free zone.

It's worrisome enough that our own government has this data; what are the restrictions on them sharing it with, say, a foreign government? What protection do I have as a Canadian that my own government is not going to give information about me to the British, to the Americans, to the Ethiopians?
Yeah, that's again a complicated question. The interesting thing about working on signals intelligence is that most of this happens in secret, so we can generally only speak in generalities. Of course there are a great deal of restrictions around disclosing Canadian data, or even, as Lisa said, using Canadian data or sharing it with foreign governments, allied or otherwise. But to the extent that you're engaging in bulk collection, this is really the business of making the haystack as big as possible, all the hay in the world. And so the issue isn't necessarily, well, it is in some ways, but we're really talking about collateral or incidental impacts on rights that are just as important. When you suck up all the information and try to narrow it down to exclude Canadian data or whatever, there are going to be errors in the process, and those are going to have real impacts on people's lives, particularly where you're disclosing to foreign governments.

So I wanted to talk a little bit about the Snowden revelations. Was that the real game-changer here? It was such a dramatic revealing of this massive aggregation of data, both within the United States and with its allies, that we were largely not aware of before, and...

