FPF Privacy Book Club - Privacy’s Blueprint: The Battle to Control the Design of New Technologies

As long as we can't see it, we're good. Here we go. Okay, we're now broadcasting to all attendees, but we have no attendees reading... oh, here we are. Okay, hi everyone. It takes about a minute for all of the attendees to get in; they've all been on hold waiting for us to start. Welcome, welcome, welcome. We'll give this just a moment before we start the Facebook Live streaming. Okay, Jules, do you want to welcome everybody? Are we rolling?

All right. Hey everybody, this is Jules Polonetsky at the Future of Privacy Forum. Welcome to our first virtual privacy book club. You'll see John Kropf here to my right, maybe your left. John and I were members of, and maybe this is the geeky kind of thing that only exists in certain circles or certain cities, a privacy book club that ran for a year or two, where about a dozen of us would take turns inviting the group to our homes to make sure we chewed through the latest privacy literature. It's become such a complicated world, and there are so many books on policy and law and technology, that anybody who lives in this space needs to be on top of them. It was a great excuse not only to socialize but to actually have some real good conversation. Eventually scheduling became hard, logistics, everybody was busy on the weekends, and we let it go. It was super exciting to see, when I threw this out on LinkedIn or Facebook or wherever you saw it, that there was a real appetite from people all over the world to be able to get together. So tolerate the technology; we'll hopefully make it work with our various live streams and so forth. I'm delighted to have a couple of people in the room as well as our celebrity guest author. Years ago, someone being a privacy celebrity, or a privacy academic celebrity, was almost an oxymoron, but today Professor Woody Hartzog qualifies, not only because of years of high-caliber and sophisticated academic work that is incredibly respected among his colleagues in the academic world, but, I think, with a far broader audience in companies, in civil society, in government, because of his gracious and friendly and humorous personality, and his ability to write in a way that has made some of these ideas accessible and popular, certainly at least within our crowd. So I'm going to let Woody, in his own words, give a little bit of a chat, because this is a book club and you don't usually get the author.

We said to him: set us up, and then the goal will be for the folks in the room here, I'll do a quick intro, and then frankly the rest of you out there in the land, to engage and just have a good chat with each other. Before I do: some housekeeping from Stacey Gray, one of our senior policy counsels here, who's done a lot of work to make sure this works.

Thanks, Jules. This is our first privacy book club, so we're using the Zoom platform, and we should also momentarily be live-streaming to Facebook Live. So if you're joining us via Facebook Live, you should soon find a link to join the Zoom meeting if you're interested in actually participating rather than just listening. For those of us who are here and participating, we've got about 50 people on so far; thank you so much for joining. The way we're going to do this is that you're currently in listen-only mode. If you want to participate, when you want to participate, use the Zoom platform to raise your hand, or send me a chat, and I will turn on your camera and microphone remotely, which is not a privacy violation at all. We will enable you as a panelist and get you linked up here so that you can participate and speak via videoconference or via phone. So again, that's the raise-hand function, or just send me a chat in the same platform anytime you want to jump in.

So let me do some very quick intros, so you know who you're seeing in these windows, both in the room and remote, and then we'll turn to Professor Hartzog. Joe Jerome: Joe was a policy fellow once upon a time at FPF and moved on to the Center for Democracy and Technology, where he's one of the leads on their consumer privacy and technology project. So Joe, welcome, thanks for joining us. (Thanks for inviting me.) You've already met Stacey. I mentioned John Kropf; John is the chief privacy officer at a large global technology contractor, here in his own friendly capacity, and also a scholarly guy who's done some authoring of books of his own. Maybe we'll actually talk about one of John's books, not on privacy though. (I do have one privacy book, but...) All right. Remotely we have, I think she's in Paris today, Gabriela Zanfir-Fortuna. Gabriela used to work for the European Data Protection Supervisor, and she is our lead on helping us understand European privacy law. From Ireland: Kate, you're in Dublin, right? (Yes.) Kate Colleary, did I pronounce that right? Kate practices law in Dublin, is the IAPP representative for the country of Ireland, and is one of the deftest and savviest privacy experts around. And then Jason Cronk. I call Jason Mr. Privacy by Design, so it's particularly great to have him. Jason focuses on privacy engineering, is a consultant, is also one of the leads at the IAPP on their privacy engineering work, and does lots of writing, blogging, and advising. Do I have everybody? I do.

Woody, over to you. Those of you who have followed Woody's career know that he recently relocated from Samford's Cumberland School of Law to Northeastern, and so has moved to Boston and is enjoying being in the city of universities. So Woody, let me turn to you, and let me ask you this before you give us a couple of words about the book.
In my view this has been a best-seller in terms of privacy, and it has certainly gone beyond some of the traditional academic circles. So what has that meant? You've done a lot of scholarship before; has there been a different impact in writing something that's been so well received in the broader community?

Sure, absolutely. Can everyone hear me okay? Great, all right, wonderful. So first of all, let me just say thank you so much, Jules and Stacey and everyone at FPF, for selecting this book. I'm really honored to be the first, the kickoff, for the book club, and I look forward to joining for future book clubs. It's been incredibly rewarding to see the book be received as it has been, because I wrote this book hoping that it would reach beyond traditional academic circles. This is a book that I tried to make accessible, so that anyone with an interest in technology and privacy would be able to follow it and understand

the arguments that I was trying to make; I tried to avoid a lot of jargon. And so to see it actually be taken seriously in industry and in advocacy and in academic circles, and even by my mom's friends, has been incredibly rewarding. So thank you again for having me.

Woody, would it be fair to call it a privacy by design book? Obviously design is in the title, but as I was reviewing it last night, having initially read it, of course, hot off the presses when it came out, I thought that was perhaps limiting. I mean, a lot of people call privacy by design, you know, different things, but in some ways you take on a very specific piece of the problem, perhaps. Did you consider it a privacy by design book, or were you thinking of something broader or different when you set out your range of ideas?

So I think that it's fair to call it a privacy and design book. I originally set out to write a book that tackled the issue of privacy and design, and I knew that the privacy by design movement is part of that, and so I wanted to nestle the book almost adjacent to the privacy by design movement, because privacy by design captures a lot that is close to what I wanted to address. What I hoped to do, actually, is lay forth a general sort of theory of privacy that I've nestled in there, about what's important, and the fact that we need to articulate specific values and articulate boundaries. And then I also wanted to contribute something that I think extends beyond what we might traditionally think of as privacy. So on Twitter over the past few days, and even in the New York Times, in Natasha Singer's excellent article "Just Don't Call It Privacy," there's been this incredible debate about whether what we're talking about with all of this regulation of data is actually privacy or something else: is it algorithmic accountability? And while I don't necessarily get into some of the algorithmic accountability stuff in the book, I hoped to create something that was lasting, that could be used as a framework that might be really relevant in those debates.

Woody, have you gotten pushback, perhaps from those who argue innovation is limited by, you know, restrictive guidance or the like? Have you seen anybody debate that the ideas you raise here would limit the great uses of data that companies might want to propose?

Sure. So I've gotten pushback in the best kind of way, which is to say that I receive critiques from all sides of the spectrum. On one hand, when I started pitching the idea for the book, several NGOs expressed concern that I might be advocating for a pretty heavy-handed, you know, government takeover of tech that would limit innovation. And even after the book has come out, there have been some people that have disagreed with some of the value choices that

I've made. Of course, a lot of what we talk about involves costs, benefits, and trade-offs, and I think that some would calculate those differently than the way I've done it in the book. And then on the other hand, at a few book workshops I've been criticized as being too reasonable and actually not pushing the boundary enough; in some academic circles, some would have me suggest even more restrictive and more robust prescriptions, and some of the prescriptions that I recommend in the book maybe don't go far enough. So I feel like if I can get criticism equally from both of those sides, as being too restrictive and as not restrictive enough, then maybe I'm doing okay.

Jason, do we have any folks from the floor who are looking to speak? If not, I'm going to ask...

Not yet. If anybody wants to send me a chat to make sure this is going through, that would be great. But just to make sure you all know: if you're in the Zoom meeting, use the raise-your-hand function, and we will either unmute your audio or upgrade you to a panelist so you can chime in.

John?

So, Woody, John Kropf. Really fantastic timing on your book, I will say. I have, you know, noticed that with the release of your book, today we had a hearing before the Senate Commerce Committee on what privacy legislation might look like, and yesterday we had a release from the Commerce Department calling for comment on what a proposed privacy framework might look like. So, perfect timing on your part. I'm going to try to represent the practitioner's point of view here, the day-to-day privacy practitioner who is working within an organization to ensure that they're really respecting privacy. I may be rushing to the end of the story here, but I'm very interested to know what some of your proposals might look like in legislation. I know that's the part you deal with in the last part of the book, where you talk about search blockers and de-identifying tools and what you called privacy or obscurity settings. I'd like to hear a little bit more on what you think that might look like in legislation and in practice.

Sure, thank you so much for the question. I think that the timing has been really quite good: my book was published about three days before the Cambridge Analytica scandal came out, and so it's been a wild ride ever since.

So in terms of what some of these prescriptions might look like, I tend to break it into smaller pieces; maybe that's just a pragmatic approach. The simplest, easiest thing that we could do, and I think it would make an incredible difference, would be to simply pass a modification to Section 5 of the Federal Trade Commission Act to empower the FTC with the authority to regulate abusive trade practices. The Consumer Financial Protection Bureau already has this authority; you almost might be able to copy a lot of the wording. If the Federal Trade Commission were equipped with, say, rulemaking authority and the ability to regulate abusive trade practices, that would allow the FTC to target a lot of the practices that I highlight in the book, the ones we might think of now as dark patterns: designs that are meant to leverage people's own limitations against them, to get them to do something they might not otherwise do. And that's a relatively easy, small modification, in terms of changing the law, that could really change things at scale.

Now, in terms of the larger, more fundamental shifts, one of the things I'd like to see is a move towards a trust-based system; some people have used the word fiduciary-based system. In the book I talk a lot about the role of trust and the role of design, and the way in which design shapes relationships. Design gives off signals: it shows us something we might perceive about the relationship, who we're dealing with and what that relationship is about, and then it makes things easier and harder. On the trust side, I would like to see an embrace of that sort of trust-based model. The GDPR, of course, is a FIPs-based model, which has its virtues, but I'm pretty critical in the book, and remain critical, of conceptualizing privacy in terms of control and leveraging the notion of consent to protect people's personal information. And then the final change I would make, and this is perhaps more controversial: my co-author Evan Selinger and I have argued that facial recognition technology is the most dangerous surveillance technology that's ever been created, and we would argue for an outright moratorium or ban on it. That's the harder, more robust critique. But going from moderate to extreme, that would be the spectrum of options I would put on the table.

So, Woody, you know I'm a huge fan of the book. I actually think this is the ideal book for you all to have started with, because it really is forward-looking on what privacy will look like. I just want to highlight three things that really stood out for me after having read the book earlier in the year and skimming it again. First, I love that you say privacy and design, because I really think this book is a great discussion of all that ails privacy by design. When we use that term, the privacy community understands it, but during GDPR you had press people and others saying companies are doing this "privacy by design," and I'd see a consent flow that makes no sense. You're really getting at what the problem is.

I think also the thing I took from the book is just how clumsy we've gotten with terms, and I think this highlights the debate we're having right now. If you watched this morning's hearing, something that stood out to me was on page 118, where you talk about how, if we keep throwing around the word privacy, then to address privacy we're going to have more control, but that's really not what we're talking about. Instead, what we want is autonomy, and that means technology that does not work against or interfere with people's interests. That really speaks to me as a privacy advocate. One thing I'd ask you, and this is me trying to throw some shade at your book: it ends with IoT, and it sort of does so in an abbreviated sense. To follow up on the questions you've been getting so far, speaking as a policy wonk in DC, I don't think it's going to be possible to change Section 5, but I'm wondering how we can get agencies to think about these things through the lens of unfairness or their existing authorities. My challenge to you: earlier this year I filed some comments with the Consumer Product Safety Commission, and frankly I'm embarrassed I didn't cite your work in them, but that agency is incredibly reluctant to think outside the lens of existing authority. If you watched this morning's hearing, nobody's talking about this stuff the way you are, and if you've been watching the recent FTC hearings on consumer protection, nobody's really talking about this either. So how do we inject this? Are you and the people you've worked with on this book going to try to inject yourselves into the debate? Because it's a real missing voice, I think.

I will note that we featured Woody's earlier paper, the paper that was sort of an earlier version of this book, as one of our Privacy Papers for Policymakers, so we put that out a little bit. There's also the work on the failure of the FIPs, the problem with the FIPs, which is also great stuff, and which I know you're getting at in the book. Woody, what's next?

So what's next is to fight. Joe, thank you so much, by the way, for that; I really appreciate it. I think what's next is we have to start pushing back against current conceptualizations of privacy. There's a fundamental pathology that I'm seeing in the way in which we're discussing privacy, and nothing's going to change so long as we keep using the word control, and we keep using the word consent, as the mechanism by which we protect all of this. So I'd like to see a shift to talking about relationships; I'd like to see a shift to talking about possible product liability, which is an existing regime. I tend to share your skepticism about whether we're going to see meaningful change at the legislative level, so instead I'd say: let's shift to the tools that we've got and the theories that we have, amplify them, and take better recognition of privacy, and that starts with changing the way in which we talk about privacy. It also means a local movement. Some of the best efforts that we're seeing in privacy right now are happening at the state level, and actually at the municipal and city level. I have been a critic of the way in which body camera policies have been developed for police
body cameras, for example. That's an area that's still highly in flux; we still don't know what we're doing, and an actual advocate can really matter there, right? You might actually know your city councilman who's working on this. So instead of focusing everything up top: we're never going to get an equivalent, GDPR-level law at the federal level; if anything, it's going to be GDPR-lite, which is going to be insufficient for adequacy purposes. So instead, let's chart our own identity. One of the things I get at, quickly, is the fact that the way in which the U.S. has proceeded, in a patchwork fashion, has sort of left the U.S. identity for privacy a little hollow, and I think there's an opportunity there to take the reins and say: this is a trust-based system, this is a consumer-protection-based system, this is a system based on autonomy, and give it a full-throated embrace. And that can happen at the bottom as well as the top.

So we're going to go to one

of our remote participants who has joined us, Tara Taubman. Tara, thank you for joining us. I'm going to let Tara ask her question and Woody respond; then, Woody, we're going to let you go teach your students and write your next book. Then Jason, Kate, Gabriela, we'll go to you for your reactions, and the rest of you who want to chime in, either use the chat or raise your hand and Stacey will put you on camera. So, Tara, live, impromptu. Tara, you're in Paris?

London. I'm French, I'm based in London. Very interesting book; sadly I did not get to finish it, I joined the book club a little bit late, but I read a good half of it, all very interesting. I was wondering how you see privacy by design being seen by a regulator. The problem that I see is the catch-up of technology and the law: the GDPR came out, and it's already out of date because of blockchain. So anything that would legislate design would always be one step behind, while the GDPR has taken the side of combining transparency with control. Part of the design should be included in transparency, in my view. So I was wondering, what is your view?

Yes, great question, and this is an interesting question that's popped up a fair bit in some of my talks, which is: how can law possibly keep up? There are two questions embedded within that. One of them is, how can legislators have the competency to regulate technology? To which my answer is: we hire more technologists, and then three more technologists after that, and we take seriously the role of technologists in helping shape law and policy. There are some amazing technologists who serve as an example of that; Joe Hall at CDT is someone who comes to the top of my mind. And we need more. Then the second question is, how can they create rules that don't become outdated the moment they're passed? My answer to that is: we avoid technologically specific rules, and rather we create outer boundaries based on process. So we say there are a certain number of steps that you have to follow, and those steps would apply regardless of the technology that's being used. We

see this a lot with data protection impact assessments, and we could adapt that process-based model to several different things. And then the other answer is, we simply shift the risk of loss to industry. In other words, what's happening a lot with our regimes now, when we use the sort of control- and consent-based framework, is that it allows companies to take calculated risks so long as people, data subjects, agree to it. Now, of course, I understand that purpose limitation tends to minimize some of that a little, but in regimes where consent remains a dominant option, we're just transferring risk. I'm a torts professor, and in torts we say all we're doing is shifting the risk of loss. So if we have a broad, technology-neutral requirement, for example: don't engage in designing a product that facilitates abusive behavior, or is designed to facilitate abusive behavior, or materially contributes to abusive behavior, then essentially you're moving the risk of loss to the company, and whatever calculated gamble they decide to make, they can make it, and they can choose to absorb that risk or not. The key component is that we're shifting the risk of loss away from data subjects. And I think that's a way in which we can stabilize the equilibrium here without worrying about creating a very specific, top-down regulation, you know, items A through Z, that might become obsolete by the time it's passed, because in the United States we've experienced that too, with things like the Electronic Communications Privacy Act.

Thanks to Professor Hartzog. Woody, thank you for joining us; really great to have you as part of the conversation. We'll see you soon. Let me turn to some of our remote panelists. Thanks again, Woody. Gabriela, you wrote a paper (is it already out?) analyzing privacy by design by looking through the GDPR and other sources, and asking whether there really is a lot of detail to guide it in the law or not. What's your reaction to the book, and particularly how it plays in a European legal context?

Yeah, thanks for that, Jules. I have to say I really like the underlying idea of the book, because I think that design is kind of our best chance to save privacy. Everybody says that privacy is dead, and we are the group that keeps saying no, it is not. I think design is literally our best chance to have meaningful privacy in the future. Looking at the book from the European perspective, though, I found myself thinking that I wish he would revisit it and embed a bit more of the GDPR into it, for several reasons. One of them is that in the European privacy and data protection legal culture, we make a distinction between what privacy is and what protection of personal data is, and I didn't see any of that taken into account when the EU regime was being discussed in the book. And that's relevant, because a lot of the criticism of the control system that privacy laws currently have comes from the fact that we don't have a workable definition of privacy, and perhaps the way around that in the European Union was to come up with this notion of personal data protection and look at it in a very procedural way. That's one thing. Another issue I spotted was that the GDPR in the book tended to be reduced
to a consent-driven regulation. I do agree consent plays a part in the GDPR, but I think it's about more than that; it's not really consent-driven. We might say it's control-driven, that's true, but in my view control is more than consent. And, you know, getting rid of the idea of control in this type of legal regime will not happen soon in Europe, because we have the Charter of Fundamental Rights, and Article 8, paragraph 2, there puts so much emphasis on access rights, erasure rights, control rights of the person. One last thing: I wish I would have seen a bit more of an analysis of Article 25 of the GDPR. We already have a legal obligation there to do data protection by design. There is a mention in the book of data protection by default, but that's sort of an additional obligation in Article 25.

The focus of the article is on data protection by design, and it already has some meaningful provisions and requirements, and I think it already includes some of the blueprint points that the book proposes. For instance, data minimization: that's literally the centerpiece of Article 25, requiring design around data minimization. And then there's also a heavy penalty, and this is one of the points in the book, that it would be important to have significant penalties for bad design; there is a significant penalty in the GDPR for not inviting data minimization into the design of products. So, sorry for taking so long, I will wrap up by saying this: I really like the idea of the book, I just wish, from the European point of view, that it had embedded more of these ideas from the GDPR.

Okay, let's go to you in Dublin, and welcome to the collection of law students that we see chiming in; it's exciting to see a new generation getting excited about this as a career path. Kate, what's doing in Dublin? I thought I saw you nodding a bit when Gabriela was making some of her points.

Thanks, Jules. Yeah, what Article 25 has done in relation to privacy by design goes some way towards Woody's concepts in the book, and I would have liked to have seen a little bit more time given to that, and maybe some analysis and some crystal-ball gazing as to what we consider Article 25 could achieve and what good design looks like. We had a data summit here in Dublin last week, and we had Julie Brill from Microsoft speaking about Microsoft's voluntary worldwide adoption of some of the GDPR concepts, and there was an interesting discussion about whether we can achieve some sort of global agreement on what good privacy practice looks like worldwide. I think that's going to be a very difficult thing to achieve when we look at Article 25 and some of the privacy laws more generally. So yeah, I would echo what Gabriela said in relation to that, but I found the book very readable, and I think that's laudable: what could be, at heart, what some people might think is quite an academic concept was really, really readable and enjoyable, and I hope others find the same.

Jason, you're somebody with a strong technical background; how did the book resonate for you?

So I've got to tell you, for the privacy geek in me, this was a page-turner. I read it over three solid days; luckily I had three days of vacation and just read it cover to cover. Woody and I have been on the same wavelength for a number of years, but I've only gotten bits and pieces from articles and speeches he's given, so this really brought it all together, and we really think alike. Some of the things I really liked about this book: my original blog was called Privacy in Public, because I had this notion that it wasn't just private or public, that these weren't two opposite extremes but that there was a spectrum, and Woody's obscurity concept really, you know, enlightened me on that. Again, people have seen it before, he's done articles about it, but the book really brought it home. Also his concepts of autonomy and trust: this is something where I've been pushing for people to move away from this kind of data or information proxy
for privacy, which a lot of legislation uses, and towards thinking about privacy in terms of autonomy: do I get to do what I want to do? There's one thing I want to talk about; I could probably talk about this book for four hours, because there's so much in it that I enjoyed, but I want to talk about his transaction costs and signaling. After I read the book, I took what he did and mapped it to what I do, how I do my job, and it fit really well, and the concepts of transaction cost and signaling were really interesting to me. I was just listening to a podcast yesterday that was talking about the robocall industry, and I started thinking about how the ban on automated calls is actually, if you think about it, an increase in transaction cost. So if you look at legislation, you could potentially look for areas, like robocalling, where technology has decreased transaction costs, and then flip that on its side and make the legislative response an increase in transaction cost, by either banning the practice or making it more costly from a regulatory perspective. The second thing is the concept of signaling, and I really liked his idea of the user interface as contract. So potentially you

know, putting in legislation a requirement that judges look at the totality of the interface between the individual and the organization, and not just at a privacy notice that nobody but regulators and lawyers reads. So those were really the fascinating things I took away from his book.

Super. I'm going to ask our group here to keep your microphones open so we can actually chat a bit, and I see some active comments in the gallery. Rishi from the Caribbean, from Trinidad and Tobago, pointing out that the Caribbean doesn't have any data protection law in place, at least in Trinidad and Tobago. Perhaps a future place for a good privacy conference, where we might not only enjoy the locale but have some impact on the local political discussion.

I always go to the Financial Cryptography and Data Security conference every year, and two out of three years it's in the Caribbean. So they've got it right.

Guy Cohen. Guy, are you near a camera? We'd love to bring you in; I noted your comment. Scroll down a bit to Guy's comment there... I'll call it out. Guy, of course, is in the UK, at Privitar. There he is. Guy, you were pointing out, in line with some of the other people commenting, that consent is likely to be a minor legal basis, to the extent that we're making design something built around consent. Do you want to share your thought further?

Right. Sorry, I'm home, so apologies for the poor camera. I guess I would say two things. One, on the consent aspect: I think we've seen a big push towards legitimate interest and the balancing tests and things that go with that, and I think the focus there is essentially that the saturation of decision-making on the individual means they're unable to give meaningful consent given the number of interactions they have each day, and so the data controller is better equipped to at least make an informed decision. For me, that pushes the question onto how you ensure that the incentives of the controller are aligned, and so we might need additional transparency mechanisms to support that shift to legitimate interest. But I wouldn't argue that the GDPR is a consent-focused regulation; in fact, I think the strengthening

of consent and the support for legitimate interest in the guidance, and, you know, the 2014 Article 29 Working Party opinion on legitimate interests, show a move away from consent in the EU model. I'd then also say, on the harms piece: I thought FPF's work on the harms evaluation for automated decision-making was excellent, and it really highlights some of the challenges with appraising these harms. And, you know, I think that's only going to get more complicated and difficult as we get more complex, interoperating systems. So when we talk about legal mechanisms that are based on demonstrating that a certain effect has been realized, and attributing it, I think that's going to get harder and harder to do, and it can become potentially a sort of unenforceable mechanism. Which is why, while I think it's important to have there, in and of itself I'm not sure it would make an effective legal framework.

You know, Gabriela, you and I were part of a conversation the other day with a company looking at trying to create a data privacy impact assessment, and in looking at all the areas where the GDPR calls for us to look at the impact on rights and freedoms, the question was: which rights and freedoms should one be looking at? Is it a sort of very narrow, you know, data protection impact, or is it the rights in the Charter, the rights that the courts have recognized? And those haven't all been clearly tied into law yet. When you're evaluating a legitimate interest, when you're maybe doing a DPIA and you're looking at impact on the individual, do we want DPOs, or the companies, looking at, well, how am I affecting free speech, how am I affecting, you know, a range of these other rights that have been incorporated by law but are still very new in terms of how they would fit into data protection?

Under. The strictly. The gdpr framework. We. Have a. Combination, of, the European Convention of Human Rights in the you Charter, of Fundamental Rights. But. Indeed, there's a long way to. Make, all of them operational. At dpi a level, I agree, with that. So. I guess I would just say that I think this. Discussion and, frankly Woody's book goes back to the notion that maybe, we privacy, is just the wrong word we, are locked into privacy, over and over again and really we're talking about is autonomy, I. Don't have an easy answer here I just I think again having, just gotten out of this hearing this morning there, were a lot of questions that a lot, of the panelists particularly Google unfortunately, for Google would say I'm the privacy expert here I'm not able to to. Speak, intelligently, on various topics that, from, my perspective I think are inexorably, intertwined. With how, data flows work, and I think we're. Seeing a lot of that this year like a lot of the most, recent I wanna say privacy. Scandals, on one side the privacy. Pros and the other like well if that wasn't a privacy, issue some other part, of the team that should have handled it and that, doesn't work, when we're talking I think we really need to broaden the discussion to, something like autonomy, but I don't know how you do that I think we've sort of again locked ourselves into a term that isn't describing, everything or concerned right and I give I actually did two European Velata credit trying, to move towards data protection, is, probably. Better than privacy, so. Let, me tell you I, get. Blamed. Whenever I see, colleagues. Who. From. My days at AOL, there. Was a plan, afoot a number. Of years ago when, I was still at AOL so this is maybe I'm, here 10 years this was three, four, before that AOL. Instant Messenger aim. As. Many. Of us who. Were users, of it in its day it's now done. But. It it looked like a very early version of Facebook they were status, updates and. People. Liked updating. Their status I'm away I'm studying I'm hungry. And, one. Of the business, people, had. An idea that, rather, than going to your, friends. Who are online and, looking at their statuses, maybe. We'd create sort of a feed and. The. Minute you changed, your status, it, would be distributed, to all of your friends now, I objected. To this because. If. I'm leaving my room I say I'm away, and. You happen to later on you know want to talk to me and you see I'm away that's one thing versus instantly. Announcing, that you have just left your room right maybe someone will come and steal. Something or, maybe. I said well I'm sad I just broke up with my girlfriend that's my status it's one thing to put that and it's another thing to blast it to everybody and I. And. You won't wait it was already available, to those people your. And I think it but you're changing you're making it less obscure, now there, were other reasons why this product didn't launch but when I run into the. Product, manager who championed, that product and he, points to its, a major change that Facebook, made years ago from being you know go look at someone's page to. This newsfeed. Where people objected, because it was a big change the, impact of having a feed.

Clearly. Changing. That you know what he talks about obscurity. Lurches, right that was certainly, an obscurity, lurch. If. Facebook. Had asked, people they. Probably would have said no I'd like it the way it is don't i meant, to change it so that my friends would see it when they saw us it's a very different experience it. Turned out to be the. Most significant. Thing now whether we like or don't like it we want to debate all the aspects we. All keep going back to check those feeds it was a very powerful business. Decision, and although there was a lurch and although people probably would have said no I don't want it it turned out to be something that people wanted. And used and so how, do help businesses. Design. To. Put people in the, most careful. Shape. Because, that's how we should design stuff. With. The when. Is it that you are allowed to I, don't want to use the bad word innovate, but when, obviously, if you do things that harm, people or affect, their autonomy, or affect their right but, what's, the boundary, right, between and, whenever I see this guy he's like we a well would still be we, would be you know values, like Facebook instead of you know you. Know pretty, much you know division, of Verizon how, do we tell those of us who got to give advice to businesses and. Who are looking at this and said well where's my growth strategy, you're. Giving me a privacy by design strategy. Well. First. Of all I love that story Jules I'm very surprised I haven't heard it before. That's fantastic, I think my, understanding of, what Woody's response to that would be that it would be that, companies. Can make a calculated, decision on something like that but that they should be the ones who are internalizing. The risk, and. His solutions that. Seemed to be about. Shifting to trust relationships, and product. Liability and, and that seems reasonable. To me but just to chime in and as sort of the United States counterpart, to Gabriella's analysis, this, is really hard in the United States for, a couple of reasons first, being that. For. Any sort of federal action, article, 3 of the Constitution requires. A specific. Tangible. Harm, and of course we have a lot of trouble, identifying. What. Privacy harms are and.

The Other part of that is that it requires a customer, relationship, or a consumer relationship, with, a specific person. Which. Would. Include people like Facebook's ago like Google but, leaves out a large. Part, of the data economy and leaves out a lot of companies that don't necessarily have, that first, party consumer. Relationship, and. So you know, you run the led about this you recently had an op-ed in Slate about about. This exact issue right, the shout-out will appreciate that so it I think. That what he's proposed. Solutions work really really well for a large subset of most of the companies that we're talking in the trust relationships, we're talking about but, but leave out a. Good. Portion where maybe other solutions, or more appropriate, like prohibitions. Or legislation, or, agency. Oversight does. Anybody think there's a new, privacy, by design effort, that the. ISO. Right, one of the leading standards. Body they you, know they're their security, standard. As sort of a default uh people, show that they're you know certified. Against. That they've, got a privacy. By, design. Effort. Going, on now is anybody, to. Your left response from, the UL by, very group for, that in. Fact there's a meeting next week in DC the. First date to face meeting at the US Technical Advisory Group. Yes. Where, do you think that's gonna go we've been debating whether, we've got time to engage, or not there there may be some people on the call, that are but it you know is this, robust. Enough that we can already standardize. On it when we're still sort of debating, whether it's what. Exactly it is and. What. What's your well. So it's really too too, early to tell I'm, certainly, pushing. For a, a, broader, view, of it, but. There, are those who you. Know want the, kind of standard. You. Know fifths in cooperation, and kind of simplistic. View that we've had for the last thirty years by the way I just wanted to comment on, the talk, that was a made. Earlier I definitely. In training. And consulting companies, I take the broad view, looking. At rights. And, privacy. Much broader than just confidentiality. And, data protection, so. So. Definitely, they're definitely people like me and, others out there who are pushing that view not to say it's always, receptive. But I can tell you in training people. Are always aghast, at some of the examples. I give them, that. Are that are not kind of the data breaches, that they're used to so. You think, of target, and the pregnant teenager things like that. Can, I share my theory about target, and the pregnant teenager Joe you've heard this one before, probably. Alright so. The. People at Target are very sophisticated, marketers. And. They. Know that. If, suddenly, you, know you bought something, and then, you started getting stuff related, to that especially. Something. Sensitive, that, that might freak. People out so. The way targets marketing, as I understand, it actually worked is, if, you.

Purchase, Lots of maybe skin creams or something indicating, you you were. Perhaps. Likely. To be pregnant, they, did not take their mailer which, those. Of us in the u.s. gets whether we want it or not you know every, three days it. Just keeps pouring into the mailbox they, did not take it and turn the, twenty different products, that are usually highlighted. Into pregnancy, pregnancy, pregnancy, products, if there was typically one item, out of twenty they were then three. Items or four items out of twenty and that was a pretty good marketing. Strategy so the notion that the. Father of the, you know of this young. Team, would. Have suddenly started getting pregnancy. Catalogs. And. Then run into the store and you, know been surprised that they knew she was pregnant because of the marketing, has, sounded a bit off to, me and I've looked back at some of the people who worked on analytics. A target entirely in pocket here's what happens here's what I'm gonna tell you happens and I've yet to get target to admit that target. Separately. I would like them I'd like to speak office I don't think that the story is that great here's what I think actually happened, target. Does have a separate place where you can actually sign up for, you know a registry, for your baby you can actually say, I want, to get you know stuff, get. Put me on your list for diapers. And all these sorts of things I think. That either she signed up and didn't realize or they accidentally. Started. Sending her what, they send to people who expressly. Sign. Up and get, pregnancy. Stuff, only because they signed up for the brain they accidentally. Put her on that list or maybe she actually somehow signed, up for that list and that's, how all of the marketing suddenly turned into a pregnancy because they certainly know that just because they were predicting, someone was likely to be pregnant because I bought skin creams it could be me and my expanding, belly or someone else who's you, know who's using, skin creams for some other reason, so I think that's the real. If. You're sophisticated marketer. You can get away with this as long as people don't notice. So. I'll say to think I wonder fooled that may be the case I do want to say my understanding, from my. Memory of the event an article by Brad that, was targets response, in terms of they, started, making it less obvious so instead of instead, of 90% of the mail or being pregnancy, now they're gonna include tools. And things so, it's so it's a little more subtle and. That's absolutely not. A potential. Potential. Control to. Reduce the the privacy, risk of you exposing. Or disclosing. This information to. Somebody else in the household so. It's a legitimate it's actually a legitimate control, that I teach I think, there's some other controls that they could have implemented. But but it's absolutely a legitimate, control that could have used. Our. Folks, over the seas before, I turn back to John, here cake Terrell any any quick thoughts. I'll. Bring in John so, I want to go in, total, reverse from my first question, which was entirely, day-to-day. Practical, focus to a very abstract, question. Which is you. Know woody woody talked a bit about the. Fifths and his, book being a bit of a challenge to the fifths and I'm. Going. Back to roots. Here I really want to understand, how, does it challenge the fifth is, he thinking of still. Including, them or, modifying, them or just setting it aside altogether, and coming up with a completely, different framework. 
So I throw that out for sort. Of the last question, of chew on I. Know. I I, think I think we're all struggling, with that I think it's a totally, on point question which is why I was trying to ask him I think it's time frankly, for academics. To be more evolved in this conversation so I think we all saw the, the NTAs request. For comments, it's, very sybian and. Then it's like what other outcomes should we have and I almost. Wanted to hear what he's like so yeah I'm gonna write something into them because I, think I think I think, this book needs to be read by everybody there maybe it already has been but it's, not reflected, in their current request for comment although I think maybe some, of what these points will fit more, easily into the FIPS than maybe we recognize. For example so, one of those examples in the book was about Gizmodo, making, the conscious decision to not log, IP addresses, so, that they would be able to tell their users even. If we get a valid access request. From law enforcement we, can't respond to it to. Me that's data minimization, right. It's. Also a conscious, design choice but I think it fits into that framework that we that we have folks, as we approach the top the our one quick special. Promotion, for those who took, the time to join some.

Folks, as we approach the top of the hour, one quick special promotion for those who took the time to join. Some of you know we're hosting a privacy war game on November 12th, hosted at Cisco in Silicon Valley. For those of you who have signed up to be part of the book club, a special offer: we will both discount the price and give you a copy of the Cambridge Handbook of Consumer Privacy, from Cambridge University Press. Here is a copy of it: $140 on Amazon or at your favorite bookseller, but free with your registration for the November 12 privacy war game, where you can learn from your colleagues and see how you might actually react in a real privacy crisis, match wits, and learn from each other.

Really great to have you all with us. You see this stack of books here; we will send out a poll. These are the books on my bedstand today. I really want to read The Known Citizen next, but I want to make sure we choose something that is really interesting. There's a great book by Kai-Fu Lee, who worked at Microsoft and at Google, that takes a really in-depth look at how, in his view, China is going to, you know, win AI, and some of the complexities there; so for those of us who want to understand the global interplay of data and AI, it might be a good read. But we will let you vote.

Thanks to Kate in Dublin, thanks to Tara and Guy for coming in impromptu, Gabriela coming to us from Paris, John Kropf, Joe Jerome, Stacey, and Michelle Bay on our team who helped set up the logistics. I think it worked well; we had, it looked like, almost 52 people in the Zoom, as well as those of you who joined by phone. So, great to chat, and let's do it again. Thanks, all. Thank you, thank you, thanks everybody.

Comments:

To join the FPF's Privacy Book Club, please sign up here: https://futureofprivacy.us6.list-manage.com/subscribe?u=29435880652ecea8d5a25d45e&id=2a9fb373a1
