Academic Technology Expo: 2018 - The University of Oklahoma, Chris Gilliard

Thank you, Adam, and thank you, everyone, for being here. So, I'm a teacher, and I want to start with a little exercise. Everyone has, yeah, did I hear a groan? Everyone has a sticky note or a sticky pad on their table. Here's what I want you to do. Most people have devices, I'm imagining; your phone will do just fine. I'm going to give everyone a chance to take a sticky note, and most people, I assume, have some kind of writing implement.

So here's what I want you to do. Any educator in the room will recognize this pretty quickly as a think-pair-share. I want you to go to Google (don't go to Bing, don't go to DuckDuckGo, I want you to go to Google), and I want you to begin by typing in "why do" or "why are," and then type in one aspect of how you identify: race, gender, gender identity, ethnicity. I want you to write down the first two autocomplete suggestions, and after you do that we will reconvene. I'm going to give you about 90 seconds.

So I see that some people already violated the spirit of the exercise by sharing before I asked you to, but I'm going to overlook that. Okay, so what I want you to do now is just pick one other person at your table to share with. You read your results to them, and they read their results to you. I'm going to give you a minute for that, and what I also want you to do is think about what it says. How does this make you feel? Let's just put it that way.

So, is there anybody in the room willing to share their results with us? Okay, yeah, I'll just run over, excuse me. "Why do women wear..." Okay, thank you. Anybody else? Good. So I typed in "why do Africans," and it came up with "why do Africans have yellow eyes" and "why do Africans have big lips." Okay. Is there one more person who'd be willing to share? Okay, there we go. Yeah: "why are they poor." Oh. Okay.

So I do this exercise at the beginning of a talk sometimes because I think it's important to set the tone for thinking about information: how we get information, how it comes to us, who makes the decisions about how it comes to us. Google will tell us that they're trying to index the world's information, and they'll give you all kinds of reasons about why the algorithm does certain things. But I think it's important to frame this as an ideological decision, as a design decision, and I've got a couple of examples that will illustrate this a little more deeply.
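
If you wanted to repeat the autocomplete exercise at scale rather than by hand, a minimal sketch might look like the following. It leans on Google's unofficial suggestion endpoint at suggestqueries.google.com, which is undocumented, so the URL, parameters, and response shape used here are assumptions that could change or be rate-limited at any time; this is an illustration, not a supported API.

```python
# Minimal sketch: fetch Google's autocomplete suggestions for a prompt.
# NOTE: suggestqueries.google.com is an unofficial, undocumented endpoint;
# the URL, parameters, and response format are assumptions and may change.
import json
import urllib.parse
import urllib.request

def autocomplete(prefix):
    """Return the autocomplete suggestions Google offers for a query prefix."""
    url = ("https://suggestqueries.google.com/complete/search?client=firefox&q="
           + urllib.parse.quote(prefix))
    with urllib.request.urlopen(url, timeout=10) as resp:
        # Response is assumed to be JSON shaped like [prefix, [suggestion, ...]]
        data = json.loads(resp.read().decode("utf-8", errors="replace"))
    return data[1]

if __name__ == "__main__":
    for prompt in ["why do teachers", "why are students"]:
        print(prompt, "->", autocomplete(prompt)[:2])  # first two suggestions
```
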
So I don't know how many people have Google Homes or Alexas in their house, how many people got one for Christmas, anything like that. But one of the things that happened recently, this is about four or five months old, is that somebody decided to ask Google Home if Obama was planning a coup. Now, I'm happy to provide you with references, but this is actually what happened. I mean, it's not what happens today, you can't go home today and do it, but at one point, if you asked Google Home if Obama was planning a coup, this is what it said. I apologize for the poor audio: "According to details exposed in Western Center for Journalism's exclusive video, not only could Obama be in bed with the communist Chinese, but Obama may in fact be planning a communist coup d'etat at the end of his term in 2016." This is what Google Home said to you.

And it's important to note why they do this. One of the things Google is trying to do with Google Home is make sure they can produce an answer for you, and when questions like this come up, it's hard to index them, because not a lot of people are looking for that answer, and so it reaches for whatever answer it can find. And unlike when you're looking at a webpage, you can't scroll; you can't do that with voice-activated things, or they don't want you to, they want to give you the answer.

So I'll give you another example. Most of you have probably heard of Dylann Roof, who was the man who killed nine people in Charleston, South Carolina. By his own account, part of the way he was radicalized was by his foray down the rabbit hole of white supremacy through Google. Here's what he said: "I kept hearing and seeing Trayvon Martin's name," Roof wrote, "and eventually I decided to look him up." Roof wrote that he read the Wikipedia article about the shooting and came to the conclusion that Zimmerman was not at fault. But he continued: "More importantly, this prompted me to type in the words 'black on white crime' into Google, and I have never been the same since that day."

So we can move from what might seem like somewhat innocuous results (or maybe not, depending on what kind of results people got) to thinking about what happens when people are looking for certain kinds of information. Now, of course, we can't say that he wouldn't have committed that atrocity had he found different kinds of information; we also can't say that he would have. But I think it's really important, and I like to always frame this in terms of our students, to frame these issues in terms of how people access information: how they get it, who gets it to them, how it comes to them, what the filters are that determine how it comes to them, and what the design processes and ideological decisions are that help make that the case.

I very much want to challenge the idea of what Google is, what it does, and how to think about it. So it's my assertion that Google is an advertising engine, it's a surveillance engine, it's an ideology engine. It's not an answer engine. What I mean by that is that Google's core function (and you'll see me come back to a point similar to this often) is not to provide you answers. Google's core function is to surveil you, extract your data, and sell you stuff. The way they do that is by providing you answers. You may have heard this before: you are not actually Google's customer, you are their product. Because their core function sits underneath, because it's not what we think it is, there are some very important things we need to know about how it works and why it works that way when we use it, and certainly when we have students use it.

And so one of the ways I talk about this in my scholarship, and with my students, is with a term I use called digital redlining. But in order to talk about that, first I want to give a brief history of what redlining is and what it has meant in this country, and then we can jump forward and talk about what it means digitally, or how those practices are reasserted.

Redlining is the practice of denying or limiting financial services to certain neighborhoods based on racial or ethnic composition, without regard to the residents' qualifications or creditworthiness. The term refers to the practice of using a red line on a map to delineate the areas where financial institutions would not invest.

I am from Detroit. This is a Home Owners' Loan Corporation map of Detroit in 1940. Unfortunately you can't see it in the kind of granular detail I'd like you to, but the red portions are marked as hazardous, and the black dots on the map identify the density of the Black population. The green areas, which are the suburbs, or what came to be the suburbs, are the areas where loans were allotted. Think about how, historically in America (pre-crash, anyway), for the last seventy years the way generational wealth was built was through homeownership. A lot of people don't know this, but the way generational wealth was built was through homeownership. So this was federally mandated policy about who could get loans and where you could live, and it has some pretty long-lasting effects.
So again, I'm going to elaborate on the Detroit thing. I include this slide all the time because sometimes people are still unfamiliar; I don't know how many people in here are familiar with Eminem, I would assume most of you. For the non-initiated: how many people have heard of 8 Mile? I grew up actually not very far from 8 Mile. This is what it looks like now, or part of what it looks like now. But 8 Mile for a long time was understood as the boundary between Detroit and the suburbs, the line that says: here are Black people, and here is everyone else. And so here's an example of this.

This wall actually still stands in Detroit, along 8 Mile. It's called the Birwood Wall, a six-foot-high wall, about half a mile long, that runs near 8 Mile, and it's also been called Detroit's Berlin Wall, or Detroit's Wailing Wall. It's not that this wall is going to prevent someone from crossing from one side to the other; it's not much of an obstacle, and I'm going to show you a scale picture later so you can see just how ineffective it would be at keeping people out. But a developer put this up to say: here is the line past which no Black people are allowed. And again, this wall is still there.

I included an HOLC map of Dallas as well; it might be clearer to people familiar with that area. And again, one of the things about growing up in Detroit is that even though many of these policies are sixty, seventy, eighty years old, their effects are still very, very visible. There are parts of Detroit where you can drive down the street and see it. I live in an area called Grosse Pointe, and in one part the two sides of the street are actually different cities. You can drive down the street in Grosse Pointe and there are multi-million-dollar homes and pristine roads and lampposts with flower planters, and literally the other side of the road is potholes and dilapidated buildings and empty storefronts. That division is so clear, even today, in a lot of areas of the city. I don't know Dallas, so I can't really say whether that is true for Dallas as well, but I know there are a lot of areas in America for which those things are still true, where we can still see the lasting effects of redlining.

Now, another thing to think about when we think about redlining is what are called racially restrictive covenants. Those were deeds, or legal agreements, that said who could live where. Depending on how old your house is, you might still be able to see that deed; there are people who live in houses that, technically, legally, they're not supposed to be living in. Doing my research for this talk, I looked up some of the information on Oklahoma, and there's a pretty landmark Oklahoma Supreme Court case from 1942 that voided an African American's purchase of property that was restricted by a racial covenant and charged him for all court costs and attorneys' fees, including those incurred by the white seller. So essentially, a white man sold his property to an African American, the housing association sued, it went to the state Supreme Court, the buyer lost, and he had to pay the court costs, give the property back, and he didn't get his money back.

So everybody here came for a tech talk, right? What does this have to do with technology? I think it has a lot to do with it.

Because I think there are a lot of ways that those practices are reasserted, reaffirmed, made real again, by digital practices. The term I use for that is digital redlining: enforcing class boundaries and discriminating against specific groups through technology policy, practice, pedagogy, or investment decisions.

So I'm going to give you a couple of examples of that. Here's one I think is really important, and one of the most egregious, so I'm going to spend a little bit of time on it. I think everybody here is probably familiar with Facebook, and Facebook has a thing they call "ethnic affinity." Now, I don't use Facebook; I've been Facebook-free for over a year. Facebook doesn't let you identify your ethnicity; there is no box for that. However, on the back end, Facebook very much defines, for themselves and for the people advertising to you, who you are, or who they think you are. So Facebook doesn't let me say I'm Black, but Facebook has a dossier on me that probably says I'm Black. They call that ethnic affinity.

One of the interesting things about Facebook ("interesting") comes through targeted advertising. And again, it's important to remember that Facebook's core function is to track you and sell you stuff; anything else it does is kind of beside the point. Through targeted advertising, Facebook lets people specify who sees an ad. Let's say I want to sell hair-care products: it lets me say I want Black people to see this ad, or whatever ethnic group you want to imagine, or all other kinds of categories. Which is fine if I'm selling hair-care products. But let's say I have an apartment to rent. Facebook lets me say: I don't want Black people to see this ad. So we can start to see what that would mean. It's a clear violation of the Fair Housing Act. Facebook got caught doing it, and they did the PR tour and said they would stop doing it. ProPublica, which is a nonprofit journalism outfit, is the one who uncovered the story.
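
To make the mechanism concrete, here is a purely hypothetical sketch of what an exclusion-targeted ad buy can look like. The field names and the checking function are invented for illustration; this is not Facebook's actual advertising API. The shape is the point: one exclusions setting on a housing ad is enough to recreate redlining, and the check that would catch it has to be deliberately built.

```python
# Hypothetical illustration of exclusion-based ad targeting. The field names
# and helper below are invented for this example; they are NOT a real API.
PROTECTED_AD_CATEGORIES = {"housing", "employment", "credit"}

ad_request = {
    "creative": "Spacious 2BR apartment for rent",
    "category": "housing",
    "targeting": {
        "geo": {"city": "Detroit", "radius_miles": 25},
        "age_range": [25, 55],
        # One line of configuration decides who never sees the listing at all:
        "exclusions": {"ethnic_affinity": ["African American (US)"]},
    },
}

def violates_fair_housing(ad):
    """The audit a platform could run but, per ProPublica's reporting, did not."""
    protected = ad["category"] in PROTECTED_AD_CATEGORIES
    excludes_a_group = bool(ad["targeting"].get("exclusions"))
    return protected and excludes_a_group

print(violates_fair_housing(ad_request))  # True: this ad buy should be refused
```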

And most recently, I think this was only about a month ago, maybe, they found that Facebook is actually still doing this, and not only in categories like that. For instance, another way to think about this is that Facebook, and many other platforms, have very much been gamed by white supremacists. Facebook, up until very recently, would let people target individuals who identified as "Jew haters." So this is not specific to Blackness; there are all kinds of nefarious ways this platform is set up, just to sell people ads.

There's an author I really like named Tressie McMillan Cottom, and she talks a little bit about this. She had her Facebook account suspended for not using her real name, and she's got an essay called "Digital Redlining After Trump." She says that being "other" on Facebook increasingly means being relegated to unfavorable information schemes that shape the quality of your life. You know, I joke about not being on Facebook, but I have the ability to not be on Facebook: I don't have family abroad, it's not tied to my job, so I can choose not to use it. There are many people for whom that's not an option. So when we think about the ways that Facebook targets people, or limits how information comes to people, it limits people's opportunities. Not only could you do that if you had, say, an apartment to rent; you could do that if you were looking to hire someone.

And part of the problem is that it's invisible. Any time before Facebook, if someone had an apartment to rent or a job to fill and they discriminated against protected classes, there were some pretty obvious ways to suss that out. You send in a Black couple, and the person selling the house or renting the apartment says, sorry, it's been rented; ten minutes later you send in a white couple and he rents it to them, and it's pretty obviously discrimination. But with something like Facebook, people don't even know what they're not seeing. There's no way for someone to know they're not being served an ad because they are a particular ethnicity, for instance. Its invisibility is part of what makes it so pernicious.

There are a couple of other examples I want to use, and then I'll talk about how this applies to teaching. Do people know what a stingray is, or a cell-site simulator? Okay, so some of the work I do is about police surveillance. A cell-site simulator, or stingray, is basically military technology that has been used in war but is now used in domestic settings. Almost everybody in here probably has a device that's constantly pinging, or connecting with, a cell phone tower. Well, a cell-site simulator is a portable device that acts like that cell phone tower. It forces your device to connect to it instead of to AT&T or T-Mobile or whatever, and it sucks up all the data from that device; some of them can even record conversations, but mostly it sucks up the metadata and things like that. It's used, say, during protests.
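
A toy simulation, with entirely invented numbers, of why that collection is inherently indiscriminate: phones simply attach to whatever looks like the strongest tower, so a device that out-broadcasts the legitimate towers logs identifiers from everyone in range, whether or not anyone is actually under investigation.

```python
# Toy model with made-up data: phones attach to whichever "tower" looks
# strongest, so a cell-site simulator that out-broadcasts the real tower
# logs every device nearby. It has no way to collect only people of interest.
import random

random.seed(0)

phones = [f"IMSI-{n:04d}" for n in range(200)]    # everyone near the protest
suspects = set(random.sample(phones, 3))          # who investigators wanted

def received_strength(base):
    return base + random.uniform(-0.1, 0.1)       # per-phone signal noise

collected = []
for phone in phones:
    real = received_strength(0.55)                # legitimate tower
    fake = received_strength(0.90)                # simulator out-broadcasts it
    if fake > real:                               # phone picks the stronger signal
        collected.append(phone)

print(f"{len(collected)} devices logged; {len(suspects)} were of interest")
```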

So say there's a Black Lives Matter protest. There might be a van somewhere with a stingray (that's the popular name for one) soaking up everyone's data at the protest. One of the interesting things about this is that there's no way for it to tell who the suspects are; it just sucks up data from everyone who connects to it. Here's a map of stingray surveillance in Baltimore. A couple of things: the darker areas on the map are places with a higher concentration of Black folks, of African Americans, and the pink dots are instances of stingray use, of cell-site simulation.

I'll give you one other example. Amazon has what's called same-day delivery, and there's an algorithm Amazon has developed to decide who gets same-day delivery and who doesn't. This is a map of the greater Boston area. The dark blue areas are the places that get same-day delivery according to Amazon; that middle part, Roxbury, is the area that does not get same-day delivery from Amazon. By the direction of this talk, you can probably tell I'm getting ready to say that Roxbury is where a lot of Black people live in Boston.

So it's important to know, and people are invested in this question of intentionality: I doubt there's a coder, or a group of coders, at Amazon thinking, we're going to deny Black people same-day delivery. That's not exactly how it works. There may be (right, James Damore), but that's not exactly how it works. What has happened is that there's no one at Amazon saying, we're not going to do this; there's no one at Amazon saying, we have to make sure that we don't do this. And one of the important things to understand, when we think about tech and intentionality, is that if you are not at least attempting to design bias out, then by nature you are designing it in. Because there's no one saying, hey, let's think about this, or because the people who create this technology are so often very similar in their demographics, there's no one who looks at that and says, wow, we can't do this, we should figure out a better way to do it.
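
A minimal sketch of how that happens with no one intending it. The data below is invented, but the structure is the familiar one: an eligibility rule keyed to an apparently neutral variable, such as historical order density by ZIP code, inherits the segregation already baked into that variable, and the audit that would surface the pattern is exactly the step nobody was assigned to write.

```python
# Invented data; the structure is the point. Nobody wrote "exclude Black
# neighborhoods": the rule only looks at historical order density by ZIP.
# But that density tracks decades of segregation and disinvestment, so the
# outcome recreates the redlining map unless someone deliberately checks.
zip_codes = {
    # zip: (orders_per_1000_residents, majority_black_neighborhood)
    "02118": (42, False),
    "02120": (11, True),   # Roxbury-like area: low historical order density
    "02119": (9,  True),
    "02116": (55, False),
}

def same_day_eligible(orders_per_1000, threshold=20):
    """The 'neutral' rule: serve ZIPs where past demand makes delivery cheap."""
    return orders_per_1000 >= threshold

def audit(zips):
    """The usually-missing step: compare outcomes across groups."""
    served = {z: same_day_eligible(orders) for z, (orders, _) in zips.items()}
    black = [served[z] for z, (_, majority_black) in zips.items() if majority_black]
    other = [served[z] for z, (_, majority_black) in zips.items() if not majority_black]
    return sum(black) / len(black), sum(other) / len(other)

black_rate, other_rate = audit(zip_codes)
print(f"eligible: {black_rate:.0%} of majority-Black ZIPs vs {other_rate:.0%} of other ZIPs")
```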

So what does this mean for students? I took kind of a roundabout way, but what does this mean for students? I teach at a community college about thirty miles outside of Detroit, and what got me to start thinking about so much of this stuff is that our campus was pretty heavily filtering the internet. The example I use a lot is what used to be called revenge porn and is now called non-consensual intimate images: basically, consenting adults make a recording of some intimate act, and then one of those parties decides to post it publicly; that became known as revenge porn for a while. I had my students doing work on that, and they would go to the computers and look up revenge porn, and they would say, Professor Gilliard, nobody's written anything on this, there's no scholarship on it. And I knew that wasn't true, because I'd read it; I could easily have pointed them to scholars who had written about it.

But then what I found out is that the filters on campus were preventing them from getting information. An example: someone was going to look up an interview in Playboy. Now, if you want (and I don't really want to), we could have some discussions about whether or not students on campus should be able to access Playboy, but they were actually looking for an article, and they couldn't get to it. This is the screen our IT folks throw up every time they block something. It says the site has been identified in a national security database as malicious or untrustworthy, or it's not in conformance with the college's acceptable use of information technology.

So here's what happens. A lot of people actually don't know how the web works, and a lot of the time we're asking students to research information in areas where they are not experts. So if they run up against a wall, a lot of the time they think, well, there's nothing there. Even faculty, when they saw this page, would just think, oh, there are viruses on this site or something like that, I shouldn't be here. But this was having some real, unfortunate effects, effects on academic freedom, ways that my students couldn't do the work we were trying to do in class, because the web was filtered for them.
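
A minimal sketch, with an invented blocklist, of how that kind of filtering misfires: a category or keyword match can't distinguish a law-review article about non-consensual intimate images from the material it's meant to block, so the scholarship ends up behind the same warning page.

```python
# Invented blocklist and URLs; real filtering appliances are more elaborate,
# but category/keyword matching is still how many of them decide.
BLOCKED_TERMS = {"porn", "playboy", "adult"}

def filter_decision(url, title):
    text = (url + " " + title).lower()
    if any(term in text for term in BLOCKED_TERMS):
        return "BLOCKED: not in conformance with acceptable-use policy"
    return "ALLOWED"

requests = [
    ("https://lawreview.example.edu/revenge-porn-and-the-law",
     "Criminalizing Revenge Porn"),              # the scholarship students needed
    ("https://magazine.example.com/archive/playboy-interview",
     "An archived Playboy interview"),           # a primary source, also blocked
    ("https://example.edu/campus-parking", "Parking permits"),
]

for url, title in requests:
    print(filter_decision(url, title), "-", title)
```
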
Another way to think about this is journal access. Again, a lot of people don't know this, but journal access depends on how much money your institution has. So I spend a lot of time with my students talking about ways to circumvent what I think is a pretty inherently unjust system. I'll give you a specific example: my wife teaches at the University of Michigan, and when she got the job there I was super excited, for a lot of reasons, one of which is that I was going to get better journal access. And so I spend a lot of time teaching my students ways to circumvent this process, because I teach at a community college, and there's not a lot of money, and so the kinds of information my students have access to are very different from what some of their colleagues can get who go to the University of Michigan or Michigan State University or Central or anything like that.

So again, there are these ways that technological decisions determine who gets what information, who has the rights to information, what information people can afford. And again, I want to emphasize that these are not natural or neutral; these are decisions that are made. I mean, the entire structure of journal access is such that, in a lot of cases, if you live in a particular state (and we were talking about this earlier), you're actually paying for an article twice. And I include these stats because here's why this matters.

Ten percent of Americans own a smartphone but do not have broadband at home; they're what's called smartphone-dependent. This is important to think about, because almost everybody has some kind of internet-connected device, but what they use it for, how they use it, and how important it is to their life is very different depending on who that person is. If I drop something right now and break my phone, it's a minor annoyance; I go get a new one. If that happens to students... I mean, how many students have you seen with phones with horribly cracked screens that they're still trying to use? This is not an uncommon thing, probably even at a university like this. But also, that phone, to many of them, is a lifeline. It's how they check their work schedule, it's how they keep in contact with friends and family, it's how they do their homework sometimes. And they might share it with family members.

So there's this assumption that everybody has the internet, which leads to the second thing: twenty-three percent of Americans do not have broadband access at home. When we make assumptions about who has internet and what kind of access they have, and then we develop pedagogy or assignments or syllabi or any kind of practices based on those assumptions, we're creating a really unfortunate system; it's a kind of redlining. This is a really important thing to think about, because there's a prevailing notion that everyone's got the internet, and I am here to tell you that even on a campus like this, that is not true. Many people have it when they're here, but there are often lots of other instances where they don't have it, or they don't have the kinds of access that we take for granted.

People often ask me, isn't this just the digital divide, or how would you differentiate this from the digital divide? How I encourage people to think about it is this: when people talk about the digital divide, they often talk about it in terms of a natural disaster. We've got to fix the digital divide, we've got to close the divide. By framing it in terms of digital redlining, what I hope to do is get people to think about the decisions we make that reinforce those policies, that reinforce the divide: the things that we do, the ways that we think about privacy or access or information, that reinforce it. So the catchy way I'd put it is that digital redlining is a verb.

So I keep coming back to this; this is the Birwood Wall again. As you can see, the size of it isn't really keeping anyone out; it's the symbolic nature of it. And so, I'm just going to read this: the technologies we use and the tech decisions we make, surveillance, tracking, predictive analytics, mean different things for different people. You often hear the "I have nothing to hide" argument, and there are lots of ways in which that's problematic. But I think we need to think about access to information, about surveillance, about privacy; those mean different things to different people.
It's important to think about who our students are, what kinds of access they have, why we make the decisions that we make, and to sort of operate from there.

A lot of times people ask me what can be done, or how to address this. I have a couple of answers, but I'm going to take a roundabout way to get to them. I think it's important to frame discussions of technology in two ways. One of the people who has been really important to my way of thinking about it is Shoshana Zuboff, and she talks about what's called surveillance capitalism. She has three laws: everything that can be automated will be automated; everything that can be informated will be informated; and every digital application that can be used for surveillance and control will be used for surveillance and control. What that means is that the current way the web works is based on the idea that we should surveil people, take their data, turn it into money, and figure out how to nudge them into doing specific things.

She says surveillance capitalism is the monetization of free behavioral data acquired through surveillance and sold on to entities with an interest in your future behavior.

The other way to think about this that I think is really important is thinking about things as platforms. An example I like to use: do people know about the Internet of Things? The Internet of Things basically means a physical device that's connected to the internet that typically hasn't been before: your refrigerator, your toothbrush, the thermostat, a toilet, a trash can, a vibrator. These are all products that people make that are connected to the internet.

Nick Srnicek talks about what are called platforms. Platforms are things like Google and Facebook and Amazon and Instagram (and, by the way, a learning management system can also be understood as a platform): digital infrastructures that enable two or more groups to interact. A platform provides the basic infrastructure to mediate between different groups, and while platforms often present themselves as empty spaces for others to interact on, they in fact embody a politics. What does that mean? Well, if people in here are Twitter users, one of the things that happened is that Twitter went from a star to a heart, and people got really upset, because to "heart" something means, symbolically, something very different from starring it. Or with Facebook, before they introduced the emoji reactions, your only choice was to "like"; that's what you could do. In an LMS, students are bound by a system that was designed with certain intentions, one that dictates how people can teach and how people can learn. So it embodies an idea about what those things are, but it poses itself as natural. To go back to the Google autocomplete example from the beginning: Google tells us this is just the algorithm, this is just the tech, this is natural, this is normal, this is neutral, they sometimes say. But it's important to recognize that they are not neutral; they're the result of very specific ideologies and choices.

In ed tech, one of the ways we can think about this is when people say we want Netflix for education, or we want Uber for education; when people tell us, and I hear this all the time, that with enough data we can solve whatever the problems of education are: you surveil people, suck up all their data, and you can solve the problems.

I'm here to challenge that. But one part of challenging it is thinking again about what it means for a platform to exist: in order for those things to exist, they necessarily create a way of existing that wants to be seen as natural but is very much a decision process.

So I started with a game, and I'm not quite at the end, but I want to play another game. I have some scenarios; some are true and some are false, and they're based on platforms. Oh, there's one thing I forgot. A lot of times people say, well, I just won't use Facebook, or I just won't use Google, and one of the things to remember about platforms is that they are extractive. Is there anybody in here who has never had a Facebook account? Right on, okay. I hate to put you on the spot, it's okay, but does Facebook have a file on you? Yes, they do. So you cannot actually opt out of Facebook or Google. You cannot. Facebook has an extensive set of information on everybody in this room, and so does Google. When I say these platforms are extractive, what I mean is that given our laws, and given some of the design choices and things like that (you don't even own the rights to your own face), we actually don't have a choice about the extent to which we participate in some of these systems. Facebook buys reams of data about people from data brokers and things like that. If you've ever received and opened an email from anybody who uses Gmail, you're part of Gmail's ecosystem. Just by walking around, license plate readers are following your car and facial recognition is looking at you; probably a lot of you have your Bluetooth turned on, so the college knows where you are. These things are constantly sucking up information from us, whether we offer it or not.

So I have a couple of examples, and I want you to tell me if you think they're true or false. Amazon remotely deleted George Orwell's books from all Kindles. That one is true; yes, it was a copyright dispute. Amazon, without permission from users, remotely deleted all of Orwell's works. So you bought 1984, and Amazon had a copyright dispute, and they digitally yanked it from everybody.

Uber used their data to calculate which users were having one-night stands. This is true. You're good, you're good. This is true: based on where you went, what time you went there, whether it was a Friday or Saturday, whether it's a place you had ever been before, how early you left in the morning, they used that to determine who was having one-night stands.

Ancestry.com has bought dozens of graveyards in order to extract and monetize the DNA of corpses. What do you say, Mark? That one's false.

A high-tech fashion company sells luxury items that are intentionally single-use, for instance a Louis Vuitton bag that ink capsules ruin after GPS says it's been carried one time. Anybody? It's false. It is false, yeah, but some people were wondering.
A college president advocated using predictive analytics to determine which students might fail.

I didn't even get to finish, right? This is true. This is true. Yes, you know about that one, right? Okay. Yeah. They fired him. But I mean, he's basically guilty of saying out loud what a lot of people were thinking.

And so this is my roundabout way of getting to the "what now," the "what do we do." I think the commonality in these examples is that they're missing what I think are some essential elements. They don't account for agency. They don't account for privacy. They don't account for equity. They don't account for fairness. They don't account for consent. And every day, people in here make decisions (to bring it back to students) about their students. We all have different roles and different jobs: some people are invested in retention, some people are invested in keeping their own job, some people are invested in trying to get students to just learn some material. To the extent that we use technology to help do whatever that job is, I think the "what we do," how we address digital redlining, how we address issues of equity and fairness, is that we have to foreground those things. You know, Adam asked me last night what I would tell a whole bunch of privileged people, and I don't think he was talking about you folks, but I said that the first thing I would say is that we have to foreground the notion of consent. The model we use for so much of this stuff is that by existing, we get to take people's data; we get to make decisions about people just by the nature of us being the stewards of it, or having access to it. By foregrounding ideas about privacy and agency and consent and fairness in every decision we make about tech, I think we can, at the least, start to change the way these things work.
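
At the most mundane, code-level layer, foregrounding consent can be as unglamorous as the sketch below, with invented records and field names rather than any particular LMS or analytics product: the default is that a student's data is simply unavailable to a predictive model unless they have affirmatively opted in.

```python
# Illustrative only: invented records and field names, not a real LMS or
# analytics product. The design choice being shown is "no consent, no data."
students = [
    {"id": "s001", "consented_to_analytics": True,  "lms_logins": 4,  "grade": 71},
    {"id": "s002", "consented_to_analytics": False, "lms_logins": 1,  "grade": 58},
    {"id": "s003", "consented_to_analytics": True,  "lms_logins": 12, "grade": 88},
]

def analytics_cohort(records):
    """Only students who opted in are visible to any predictive model."""
    included = [r for r in records if r["consented_to_analytics"]]
    opted_out = [r["id"] for r in records if not r["consented_to_analytics"]]
    return included, opted_out

cohort, opted_out = analytics_cohort(students)
print(f"{len(cohort)} students in the model; {len(opted_out)} opted out: {opted_out}")
```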
