An Unconventional Look at the Future of Technology with Baratunde Thurston (Google I/O'19)



It's been a really big, momentous, and mostly beautiful gathering. I saw one of the creators of machine learning and neural networks this morning declare with 100% certainty that machines will gain consciousness, and none of you reacted, and that made me more alarmed right there. What's going on? I saw a space pioneer, Dr. Mae Jemison, remind us all that we live under one sky, under one roof, and remind us to connect with each other. I saw a fruit used last night to make music, and it made it hard for me to eat that apple during lunch today, because every time I took a bite I heard it screaming "apple, apple." Like, fruit shouldn't talk, and if it does talk, it shouldn't talk in a self-aware kind of way. It's just creepy. There's been a lot of amazing news and announcements out of I/O this year: we have Android Q updates, Flutter for the web, something called a Nest Hub Max, and of course the biggest technical innovation, dark theme, which seemed to really excite the super geeks in the room. Got you to glance away from your screens, so I should say "dark theme." You're so easy. You're so easy. Now my presentation will run at 30% higher energy efficiency and push fewer pixels. So I was paying attention, I really was. I learned a bunch of things. I learned that NBU is Google's way of saying "next billion users." That's how this company thinks about growth: a humblebrag. The Pixel 3a is out, and we got privacy-ish, which I'm going to come back to later. So the first conclusion I came to, after many of the sessions and overhearing people in lines, was that machine learning is going to take all the human jobs. I saw very specific ones where machine learning algorithms outperform, or nearly outperform, humans in things like translation and transcription, in driving, in picking things up, and most interestingly of all, in developing machine learning algorithms. I want
to show you a slide that I saw yesterday. It looks kind of banal, but this is a terrifying image: the red curve is the performance of a machine learning algorithm built by a machine learning algorithm; the black line is what puny humans are capable of. And here's why that concerns me. I have assumed a level of self-interest and self-preservation on the part of the people designing and developing this new world we live in: take away certain jobs and occupations, cool, but not my job. And yet the people with the superpower are like, "No, it'd be cool to create a superpower capable of creating superpowers greater than my superpowers." Stop. You don't have to do it. Like, literally no one's making you do it. Do something else. Just the thought. Just the thought! That curve should be a warning line: we've gone too far. But at the same time I had another, opposite conclusion, which is that machine learning will bring us closer together. I spent a lot of time in the Experiments dome, one of several domes here on planet I/O, and I witnessed and participated in some really beautiful uses of technology. I saw a conversation between two people, one of whom was hearing-impaired, yet that conversation itself was not impaired at all. I took part in a dance lesson from one of the greatest dancers we as a species have ever produced, Bill T. Jones, thanks to the PoseNet algorithm, and I'm as good as he is now, as evidenced by Exhibit A here. I would literally empty a wallet or max out a credit card if I could get personal dance lessons from Beyoncé; that would be really successful. I met this developer, Ezra, from Egypt, and we made random, beautiful art together using something called Raw Chavo. Now, I'm assuming this isn't some kind of weapon system; it's just used to make cute art. But
we made this, and as I stared into it, I recognized it as a metaphor for the arc of human existence: that we peak sometime around Candy Crush and are now in the climate-catastrophe decline, unless we intervene. I also recognized that one of the greatest innovations of I/O '19 was to get hundreds, and I mean hundreds, of developers to go outside. Just go outside. And then be forced to exercise, because everything is 1.3 miles apart from everything else. I've walked 30 miles in the past two days. I have calluses now. This

was also a test of the situations and circumstances in which people will stare at their screens. It turns out: everywhere. Everywhere, including the digital detox zone. People were like, "I'm here to 'tox. I don't know what you're here for. I'm a 'toxer." Or just sitting on the street with nothing else to do. Thank you, Yama, for the photo. And maybe I was one of the few to notice the sort of police-overwatch situation. As I took advantage of digital imaging technology and zoomed in, I realized there's not a human in this car. It's already here. Skynet starts with an empty police car overlooking a bunch of super-powerful technologies creating the future. It's too late to run. Enjoy captivity. The greatest invention, though, also in dark theme, are these porta-potties, which can be given up for the Ultra Lav. Come on now, this is amazing. Actually bigger and better than my New York City apartment. So that's most of the reflection on I/O. Again, my name, my full name, is Baratunde Rafiq Thurston, and I am here by way of this woman, Arnita Lorraine Thurston, who raised me and my older sister Belinda on her own; who was a multitude of people: a survivor of sexual assault, a paralegal, a computer programmer, an activist, an environmentalist. And it was her working as a systems analyst for the federal government in the early 1980s that brought the first computer into our house, a computer that helped introduce me to the early internet. Clap if you remember this internet. Yeah. Text, baby. No GIFs, no cats, no ads. It was a simpler and beautiful time. And I witnessed, and was a result of, the power of technology to upgrade my life. My mother's financial life got upgraded, which meant my education and quality of food got better, and my sense of creativity, of literally what was possible, was powered by technology from an early age. This is an image of an article I wrote in high school in Washington, DC, in 1993. Headline:
"Upper School Joins Internet," because we got a full-time, always-on T1 connection, which changed everything. Among the observations I had in this article, this one stood out to me. I wrote: students have used the computer room (because we segregated them back then) to write papers, solve math problems, conduct science experiments, and connect to local libraries and universities. All legal, make-your-mama-proud-type activities. What I did not anticipate was Fortnite, cyberattacks on our election system, selfies, or dark theme, of course. The path that I have walked has been enabled by what technology has brought to us. I registered my domain name in 1998. Most of the jobs I've ever had were directly influenced by or involved working with technology, including working for America's finest news source, The Onion (yes, for the good kind of fake news), where I was director of digital for years. Right after that, I helped start a business which merged technology

with humor and tried to bring a level of humanity to some of these cool tools. Our signature action was a series of hackathons that we called Comedy Hack Day. Yes, yes! We got one person who knew about it. I'm very excited about the human connection; it was beautiful. We should do more of that. I like you. So we would bring developers and designers and comedians together to imagine, and then build, working prototypes of jokes. Someone once made a digital assistant, voice-enabled, but only in the body of a Furby that had to be attached to your shoulder, so you'd walk around with that. There was a team that made an app, which was for a long time in the app stores, called EquiTable, that would allow you to split your bill with friends after a night out. But it wouldn't split the bill equally; it would split the bill equitably, taking into account the pay gap based on gender and race in the United States. So different people would pay different amounts. "Reparations, one meal at a time" is what they called it. And then I helped, in the first year of The Daily Show under Trevor Noah, to reimagine how that now-global institution deploys technology, for more than publishing video feeds online, to get way more interactive with its community and more creative with the possibilities for jokes. Now, technology can be very personal for all of us, and it has been for me, through a little company up the road called 23andMe. As I mentioned, I have an older sister, Belinda. She's nine years ahead of me in life in more ways than just age; we also have different fathers. And after our mother passed away, we had even more questions that a living human couldn't answer. So we went to the great database of 23andMe, and I got incredible bragging rights. Thanks to the comparison capabilities within 23andMe, I got to find out that I am less Neanderthal than my older sister. Clap
if you are a younger sibling. If you're a younger sibling, we're the good ones. I'm just kidding, everyone's good. But it is really powerful to be able to say to your older sibling, "No, you're the Neanderthal, because science." Like, yeah, back up. But big sis had a genetic clapback I did not foresee. She said, "That's cool, baby bro, but you're also way whiter than me." Not that there's anything wrong with being white, room full of white people. But, like, I got a lot invested in the blackness thing, and this is way late, and how do I not know this, and what does that even mean? And it's extra awkward because I wrote a book called How to Be Black. Like, that's not... it kind of undermines my brand, you know? It's not good. I didn't write a book on How to Be 81.6%

Black. That was a totally different book: not a bestseller, very different authors. My life is mostly great, mostly incredible, because it's been filled with great and incredible people, and that includes, at the end of last year, my then-girlfriend proposing to me. I had the good sense to say yes, so we are getting married. I'm engaged; it's very exciting. And I'm going to take this opportunity to share with you all, and the world: we're starting a family. It's a very exciting time to do that. It's an Amazon family. What we did is, we merged our Amazon accounts, so she can use my Prime but with her credit card. It's a great innovation. It's also just really nice to have our union recognized by Chairman Bezos, you know, head of one of the largest non-governmental military operations in the world. It's really cool. He's seen us. Truly seen. There was a time when to create a child (which we are not actually doing) involved a lot of physical work, but you all know disruption: machine learning and Amazon make it much easier to just add a child to your family. They've got a slick UX, and then you have a bunch of parameters you can fill out to optimize your child for your particular situation. Now, I'm not a fan of the binary gender choice, but that's a simple update on the backend. So what I did was I created a little girl, named after my favorite roller derby character, named Beyond Slay, and I just skipped those early years where you don't sleep, so she's almost 11 now. She's a cute little unicorn. But once you've got a child, and this is the beauty, Amazon gives you options: you can add another child, or, if you're not happy, you can edit your existing child. I mean, clap if you ever wanted to edit a child, right? Like, that's amazing. We need that IRL, you know. So
you choose to edit the child, and when you click through you're given even more dramatic options: do you save the child, or do you remove the child? Which I'm pretty sure is a human rights violation, but it's cloud-based, it's Amazon, who knows; it's beyond the jurisdiction of any government. That brings me to Waze, one of my favorite tools for always letting me know where I am while denying me the knowledge of how to get anywhere. It's an amazing combination, where I know less the more I know. And one thing I took a look at somewhat recently is just the level of chaos and mayhem that happens inside of Waze. I'm not sure how many of you pay attention to what happens inside of Waze, but this is an average route through a busy Los Angeles interchange. When you zoom in, you see violence. Like, there is a bloodthirsty, sword-wielding nut chasing royalty. There's someone laughing in the background, protected, though; probably why they're laughing. And the police aren't doing anything about it. Oddly reminiscent of what happens in the real world, so maybe Waze is actually real. When I got here to I/O, I wanted to see what I could learn, see what some of the sessions were. I loaded the app onto my phone. I found out that the app is created by some unverified developer who wants access to view and edit all the events on my calendar, and once I just clicked blindly through those permissions, I started searching. I searched for "machine" and found so many options and things to learn about. I searched for "ethics" and found one, which is better than zero. Progress. So y'all should definitely find the stream on that one and, like, embody everything in that session. And then I searched for "avoid apocalypse" and didn't find anything in the sessions. That
brings me to a moment in my life three years ago. I was invited to South by Southwest Interactive. I'd been going for almost a decade at that point, and in this particular year they brought me in to be inducted into the Hall of Fame, joining people like danah boyd and Ze Frank and

Kara Swisher. In my brief acceptance speech, I made some remarks which I think are still relevant to the world we're living in today. I want to share a brief moment of that with you, and you can read along. I said to them, and I say to you still now: the algorithms are coming, and we know they aren't pure or objective. Like journalists, they're embedded with the values of their makers; they reflect the society around them. But if innovation is all about making the world a better place, and the algorithms and code that claim to do so derive from this very imperfect world, sick with racism and sexism and crippling poverty, then isn't it possible that they might make the world a worse place? Could we end up with virtual reality racism, or machine-learned sexism? And today, you know: could poverty be policed by drones, and an internet of crap? Possibly. Maybe the answer is yes. So today we live in some version of that world. The world that gives us headlines about social media disrupting our democracy; about yet another data leak from yet another organization taking advantage of access to information about us; about our legislature selling us out to ISPs, who are selling us out to advertisers, who are selling us out to brands who just want our money; about self-driving cars that literally can't see black people (that's terrible); about machine learning applied to résumés, which conveniently sorts out all the women, because that's also what history has done, so we've just scaled that into our present and future; about an automated response system that was goaded into mentioning the N-word publicly on Twitter; and about the cloud-based apology company known as Facebook, apologizing yet again for something it deservedly should apologize for. I have been working over the past year to try to integrate my own thinking around technology. I was a big booster in the '90s and early 2000s.
I have seen the harms as well, and last year I wrote a bit of a manifesto. I went on a journey to try to understand how all my data existed among the major platforms, among app developers, among the very websites and web browsers that I visit and use. What came out of that was a set of principles, which I then open-sourced in a Google Doc that others have contributed to, to help guide us more conscientiously into the future. So I'm going to walk you through several of those principles and hope you take them in the spirit of generosity and embed them into the code and into the values as you go out and build this world that we all want to live in. The first is about transparency, and what I call trust scores. Now look, we all get a score from a system. We get credit scores from the financial system that determine if we can get a job, if we can get a home, if we can get a car. I think it's time to flip that scoring system around, to create something like a trust score which rates the organizations that we are in relationship with, based on how they handle us, how they handle our information, based on what's inside of the technology. We have a good metaphor from the world of food. When I want to know what's in my food, I don't drag a chemistry set to the grocery store and inspect every item point by point. I read the nutrition label. I know the content, the calories, the ratings. This is possible. And with that knowledge we can jump to another world and look at the building trades, where we get broad LEED ratings for sustainability and energy efficiency. I shouldn't have to guess about what's inside the product. I certainly shouldn't have to read a 33,000-word legalese terms of service to figure out what's really happening inside. All of that should be as usable as the UX and systems that got me onto the platforms, into the apps, into the services in the first place. Now
the second principle is about defaults: changing those defaults from open to closed, from opt-out to opt-in. Defaults matter. Most of us don't change the defaults. In life, we mostly accept the world that we walk into: we accept the family beliefs, the religious beliefs, and the settings in the programs that we're using, as far as what rights they claim to have over our data. I think we need to totally switch to a minimal-data default, to something more akin to data conservation rather than data extraction and exploitation. There's a great series of guidelines and a model from Mozilla, which they call their lean data practices. I encourage you all to look at those, apply them, and ask: do I really need this information about the customer, the user, the person (if you choose to call them a person every now and then)? And maybe we should treat data like other things we want to limit: sugar, Netflix,

Trump tweets, carbon, fossil fuels, and see how far we can go with as little of that information about the user as possible. That brings me to the third point, about data ownership and data portability. I think of this in a couple of different layers: the data that I generate, and the data that is derived from my actions; the data of us and the data about us; the content and the metadata. So when I take a photo and upload it, OK, that's mine, that's user-generated content. But when I walk from here to here and create a digital trail, a little history, that is also data of me, and I think we need to start living in a world where that is mine, where that is a part of me, where I have a level of sovereignty and self-determination. As I do with this body, so should I with what this body represents in the virtual world, because things are going both ways in business; they should go both ways in rights as well. And when we start to think about ownership of data in a more expansive way, it makes something like this a bit more interesting: these CAPTCHA codes, these little quizzes and tests to prove you're not a robot, which pretty soon we're all going to fail, because the robots are going to design the test and they're going to pass it. But I've identified so many hills and stairs and cars that I'm essentially a co-founder of Waymo and every other self-driving technology out there. We should all be considered co-owners and partners in the products and services that are being built. Machine learning depends on data. Artificial intelligence without data is artificial stupidity. So every user is also a contributor. This is much more of a cooperative economic model than a top-down capitalistic extraction-and-exploitation model. Let's have that framework in mind. I know it's complicated, but so is everything else we've ever done, and we use technology for the express purpose of dealing
With the complexity, of trying to abstract it and find, ways to still approach. It, number. Four I think we should start shifting our terminology. Around, permission. Not, just, privacy. Privacy. Has this vague yet hard edge binary. Field it's private, or it's public, privacy. Or not permission. Has, layers when. I think about permission, I think about the first UNIX systems I ever used back at that high school and the schmod command, CHM. Oh do changing, the access, permissions on directories, on files for, users groups and others.

I think we need to apply that model to information about us as well. I'm cool entering into a relationship with a developer around this use, for this app, for this amount of time. That doesn't mean that seven years from now, when the business that created that app has been acquired by a company I want nothing to do with, they just automatically get me, right? I married you, not your cousin. It doesn't translate that way. So giving much more fine-toothed controls around what apps and technologies have permission to do, with whom, for how long, in what context, including location, will bring much more self-determination, control, and respect to the underlying relationship. And I can't talk about respect without talking about inclusion. Systemic inclusion. We all live in this world of systemic exclusion. We're sitting atop centuries of all kinds of exclusion, and so where we are is a result, in part, of where our ancestors were, of where the people who came before us were, of where the system that was designed by people put us. We know too much now to say it's all equal for everybody; we literally have the data to prove that's not true. So we should be using that information to create inclusive systems across many dimensions. And I want to give some kudos to this organization, and many others, for the steps they've taken toward accessibility, which has gotten so much better than it has been. It can still get better, still. Project Include is one such effort that's focused on the technology industry. They've got literal workbooks that I think you should be using. It's not about charity for me. Doing the right thing is a good thing, but it's also good for business, it's good for human rights, it's good for designing not just technology but the world that we're all going to be living in. We're not operating in a vertical anymore. This stuff is infecting every area of life, and so if people from
every area of life are not participating in the creativity and the construction of the rules and the systems, then they're subjects, not citizens, and that's not the type of world that I think any of us want to be living in. Number six: imagine harder. I want everyone to think actively about the worst thing that can happen with your technology. Don't just think about the hockey stick and the valuation and the user growth and the delightful personas and testimonial stories come to life when something goes well. Think about the tragedy and the horror if something goes wrong. And then use this tool, Ethical OS, developed in partnership with Omidyar and a bunch of other organizations, to literally come up with a framework to think about worst-case scenarios and how to manage against them. One of my favorite examples from their workbook asks: could your product or business do anything your users are unaware of? If so, why are you not sharing that with them? Would it pose a risk to your business if it showed up in tomorrow's news? If you'd rather it not be written about publicly, maybe you're ashamed of it, maybe someone in the organization is ashamed of it, and that is not a good way to operate in, again, any relationship.

Number seven: we've got to break open the black box. So much of what is happening with technology is beyond us, almost literally beyond comprehension. So we've got to start building in ways to understand what seems to be not understandable. There's an analogy in food inspectors in that industry, in auditors in the financial industry, in peer review in academia. We rarely just let people skate by. So please start building in even more ways to inspect, interrogate, and measure the impact of the tools we're building. When someone commits a driving-under-the-influence offense, a DUI or DWI, we take away the car, at least for a certain amount of time. I would love a world where someone who abuses the data of millions of users is then not allowed to continue to have access to the data of millions of users. That's the most logical approach, the one we take in every other area of life, and technology is just another area of life. The New York Times launched an experiment last week that did this for advertising. I can't be here and not talk about advertising; we're here because of ad money. Eighty-four percent of Google's money is ad money. So let's talk a bit more about how those ads even get made. What the Times did was pretty brilliant: they bought an ad campaign, and the content of the ads was revealing how those ads were derived, as in what the ad buyer knew about the user when they booked the ad in the system. Here's one example: "This ad thinks you're trying to lose weight but still love bakeries," as derived from browsing history, probably, and credit card history. If more of us understood that level of opaqueness, not just as the creators of the tech but as the users of it, we'd want some more brakes, some more visibility, some more controls to prevent that. "Prevent" is probably a bit too much; we can trust, but we need mechanisms to verify. We also need to upgrade and enforce the rules around
all of this. Again, I return to the idea that once you identify abuse, you stop the underlying abusive behavior; you don't just say "please don't do that again, and I trust that you will mind your own business." So I'm glad to see the industry calling for regulation. We need independence as that comes into play. Finally, this is the ninth and final (for now) principle, and it's actually the most inspiring one to me. I think of this as encouraging folks to think beyond just consumption. A lot of the models that were designed, the business models and the technical models to support them, were about getting people to spend time with, to log on for longer, to spend money with, to suck life out of, to consume. And I say: let me do what you do. I remember the first time I used the ad-buying capabilities of one of the platforms where I was merely a user, and I saw my friends in a much more empowering and interesting way. I could slice and dice them; I could understand them in a way that I couldn't as a friend. I could only see them that way if I chose to see them as an object, as the target of an advertising campaign, both enabled by this platform. Let's close that distance and allow creativity on the part of the users, the contributors, in this new world that I am imagining. Here's a couple of examples of beautiful things that have been done with technology that allow us to see ourselves a bit more honestly, and a bit better. There's a group called the Equal Justice Initiative, which does great work around unearthing and revealing some of the dark history of the United States with respect to racial terror lynchings. They partnered up with Google, Google.org, to build Lynching in America as a virtual experience to accompany their museum in Montgomery, Alabama. There's
the Center for Policing Equity, which is bringing machine learning and big data analysis to one of the most painful and intractable problems in this country, which is discriminatory use of police force, where we know that police use force far

more disproportionately against black people than against white people. Science and math are helping us understand why and, with police, are reducing that delta. In New York City, where I lived for 12 years, JustFix NYC is putting power in the hands of tenants to argue against their slumlords, especially tenants in low-income and public housing, so they can do things like get repairs in their apartments, find out who owns the building, respond to an eviction notice. This tool is using that power with more of a sense of Spider-Man rules, right? You can think about the ability of all of this to create power, to create wealth, and then ask: for whom? Who needs it more? Who needs club music banging right now? Clearly we do; we just got it. I timed that perfectly, to have like a club beat under my super-serious moral talk at the end of the section. But yeah: make sure that these tools are not merely accruing wealth and power to the already wealthy and powerful. That's not a fun world. That's a fun world for, like, six people, literally six people, with the rest of us suffering. So I'm going to leave you with some resources, some things you can follow up with. All these slides are online at baratunde.com/googleio; they're in a Google slide show, and you can download them and manipulate them to your own degree. But I want to talk through: ethicalos.org; Doteveryone (doteveryone.org.uk); the Data & Society Research Institute out of New York City, where I'm an advisor; the AI Now Institute, which asks deep questions and does independent study of some of these systems; Project Include, which I already mentioned; and then a couple of books. Winners Take All by Anand Giridharadas: read that. The Age of Surveillance Capitalism: read that first. And an inspirational twist on all of this, that goes back much farther than either of those two, is a book called Decolonizing
Wealth, and it's based on the ideas of indigenous wisdom brought to bear on a system that we've built which is, again, very extractive, very exploitative, but could be much more contributory, circular, and balanced. For inspiration, The Verge has a beautiful podcast series called Better Worlds, which I think of as an inverted Black Mirror: it's thinking of positive futures that we can build with this technology, and actually giving you characters and narrative to be able to inhabit. The biggest book of all: Drawdown, the most comprehensive plan ever proposed to reverse climate change. I think if you're not working on the climate crisis, do that. It's, like, the biggest problem, the one that connects all the other problems, and there's so much being done right now to improve energy efficiency and use machines to find these spots. We can and should be doing a lot more. To me these are questions not of tech; they're questions of life and reality and of power. And as we're building these things, we're also building the world we're going to live in, where we coexist. So ask: what's the world I want to live in? What's the world you want to live in? What's the world we want to live in? Let's imagine harder, and imagine better, and build that world. Thank you very much. I'm Baratunde.

2019-05-17 05:50


Comments:

wow. best presentation at i/o by a mile (1.60934km).

OUTSTANDING talk, my man. This stuff has to change for us.

Inclusive system = more data ;-)

This video has way too little views.
