January 16: Intro + Opening Session


Show Video

Good evening, ladies and gentlemen, and welcome to the inaugural class of The Ethics of Technological Disruption. I'm Hilary Cohen, a pre-doctoral fellow here at Stanford's Center for Ethics in Society, and on behalf of our full team we are thrilled to be welcoming you here tonight and to be kicking off what we expect will be a series of meaningful and, hopefully, educational conversations. Our topic tonight, and over the next several months, will be the profound ethical, social, and political questions that emerge as a result of technological change — much of it funded by firms just down the street on Sand Hill Road, generated by companies here in the Bay Area, and seeded by the research and education at this very university. We believe the stakes are high. The premise of this course is that there are few things more important to consider when it comes to the long-run well-being of humanity. The Israeli historian Yuval Noah Harari describes it this way: nobody would doubt that the new technologies will enhance the collective power of humankind, but the question we should be asking ourselves is what's happening at the individual level. We have enough evidence from history that you can have a very big step forward in terms of collective power coupled with a step backwards in terms of individual happiness and individual suffering. So we need to ask ourselves about the new technologies emerging at present not only how they are going to impact the collective power of humankind, but also how they are going to impact the daily lives of individuals. In terms of history, he provocatively writes that the events in the Middle East — ISIS and all of that — are just a speed bump on history's highway: it's not very important; Silicon Valley is much more important, it's the world of the 21st century. And to be sure, the speed of technological advancement and of disruption can be dizzying. Many of you in the audience may have seen the latest internet meme circulating over the last week or so, the 2009-versus-2019 challenge, or the hashtag ten-year challenge. If you haven't, it's fairly simple: people post, on the one hand, a picture of themselves from ten years ago, and right next to it a more recent photo of themselves, and it's meant to show just how much progress, or change, one can experience in a decade of life. So here you have a couple of examples, for those of you who haven't seen it. Now, if we were to do a similar exercise with some of the largest tech companies, or with events related to technology's impact on society, it might look something like this. In 2009 you might see Mark Zuckerberg proudly proclaiming the mission of Facebook: to give people — sorry, I think we're on automatic — to give people the power to share and make the world more open and connected, a statement that would later change, but one that reflects the sincere aspirations of idealistic technologists.

You might see the beginnings of some online organizing that would usher in the Arab Spring the following year — the democratic uprisings across the Middle East that demonstrated the power of technology to foster positive social and political change. You might see President Obama issuing his first weekly address directly to citizens across the country using YouTube, in which he described the opportunity to restore science to its rightful place and wield technology's wonders. And you might see that the most powerful American companies represented a mix of different industries, from Exxon to Citigroup to Microsoft. So let's contrast this with some of the pictures you might see today: a Facebook that has been roiled by a set of scandals, including election interference, privacy violations, and failures to moderate content appropriately around the world. You might see that some of the very same technologies used to give young advocates of democracy a voice are now also being used by authoritarian governments to wield more control, evidenced by China's citizen scoring system. You might see that the President of the United States is tweeting about how tech companies are trying to silence or suppress the voices of conservatives — of half of our country — on their sites. And you might see that America's five largest and most powerful companies are exclusively technology companies: Apple and Amazon, Alphabet, Facebook, and Microsoft, which are professional homes for many of you. Now, of course, this isn't the full story of what's happened in technology over the last ten years. There have been extraordinary developments — new products, services, and companies that solve real problems while fueling the economic growth we've seen around the world. But it's impossible to deny that over the last 12 to 18 months we've also witnessed growing pushback, or skepticism, about the way the technology driving this progress has influenced the lives of people and communities. There's been a shift in the general population's attitude and in the amount of political scrutiny aimed at the creators of technology and those who have reaped the benefits of its creation — a transition described in part by the early hopes of liberation technology giving way, at times, to fears of technological dystopia. So perhaps this was inevitable. After all, as a society we're now forced to come to terms with genuinely new concerns: questions about data and privacy, about the impact that automation will have on the future of work and on politics, about the roles that both private companies and governments should play in shaping the lives of ordinary citizens and in either upholding or possibly diminishing democratic institutions. We'll touch on many of these. Our task here is not to fall down simply on one side of any single debate. It's not to add to either the roster of boosterish endorsements of technology's benefits or the scathing critiques of a specific company or industry. Our aim, rather, is to moderate a series of conversations that explore the dilemmas that emerge when technological aspirations bump up against the realities of politics and inequality and a plurality of visions for what a good society looks like. It's to begin to wrestle in earnest with the massive benefits of technological progress right alongside the significant costs that sometimes accompany it.
And we've assembled a group of experts from industry, from government, and from civil society to represent the full range of perspectives that these topics deserve. So in preparation for this exploration, we asked you a couple of questions: about the roles that you play, if any, in the tech sector, and about the issues you're most interested in. There's a set of questions about the state of affairs today. The first: how well-equipped do you think U.S. policymakers are to govern a society that will be transformed by technology in the years ahead? On a scale of one to ten, the average answer among you, as you see here, was a three — and not a single person gave our policymakers above a six. Now, on the one hand this seems comical, right? It reflects what many of us have seen as ill-informed handling of tech issues by politicians, perhaps most memorably with Senator Hatch asking Mark Zuckerberg how, despite its free services, Facebook possibly makes money. On the other hand, if you're right, this should actually alarm us: that the people charged with governing new technologies have so little understanding of how they actually work. In the words of our colleague Shannon Vallor, a philosopher of technology at Santa Clara University:

While we shouldn't expect our elected representatives to all share a deep knowledge of technical or scientific matters, they do have a duty to be acquainted with the basic operations of the most powerful forces and institutions that are shaping our nation, especially when they are tasked with regulating those forces. So we also asked you about your faith in technology companies to confront new challenges, and — possibly reflecting the bias of our audience, or of our geography — you had a slightly higher expectation, averaging about a five out of ten, with fairly equal measures of skepticism and optimism on either side. But most importantly, you told us about the issues that drew your interest in this course, about those that give you pause and those that give you hope. We intend to cover a large portion of these topics over the next few months, and I want to tell you a little bit about the team we have on hand to help us navigate them. So in addition to myself, we'll be joined in every session by three Stanford faculty members. First is computer scientist Mehran Sahami, who is partially responsible for the enormous growth and success of the computer science department here; he teaches the single largest class at the university, Intro to Computer Science, and spent a decade in industry before joining the faculty. Then we have political philosopher Rob Reich, who has proudly confessed that he, like any good philosopher, often delivers more questions than answers. He's the director of two centers here on campus — we'll leave the photo up so you can see us all — the Center for Ethics in Society and the Center on Philanthropy and Civil Society. And finally, we have political scientist Jeremy Weinstein, who, among his other roles, is the director of the global studies division here and, having served in both terms of the Obama administration, brings practical government experience to the team. In parallel to this class, the four of us have been working for over a year to develop a related undergraduate course, which we kicked off last week. We have roughly 300 Stanford students enrolled — future technologists, inventors, policymakers, and activists — who over the next ten weeks will be diving into the code, the public policy, and the ethical frameworks that underpin technological change. Our course flow in here will mirror the four core units we will be covering in that course, so to remind you, starting next week: first, algorithmic decision-making and accountability; then privacy, data collection, and civil liberties, with an emphasis in here on facial recognition; then the ethics of autonomous systems, with an emphasis on military use of AI; and then the power of private platforms.

Some of the undergraduates are actually here with us tonight, and with their help we are thrilled to be thinking differently about how you train and equip the next generation of technologists and leaders to think about their roles and responsibilities to society. Just one note on format before I introduce the guests who will be joining us. Ordinarily we would love, and expect, to have microphones on hand for you to ask questions — after all, that's a central part of the sort of engagement a university intends to foster. But because we are recording these sessions and need written consent forms for audio and video recording, we'll instead be collecting questions from you digitally in advance, which many of you submitted tonight, and we'll let you know how you can do it for the sessions moving forward. But without further ado, in order to help us understand the full landscape of these challenges, we're kicking off tonight with two distinguished guests. Reid Hoffman co-founded LinkedIn and is currently a partner at Greylock Partners. He's on the boards of many technology companies, including Microsoft and Airbnb, as well as several nonprofit organizations. He hosts a podcast called Masters of Scale and recently released a new book, Blitzscaling, based on his Stanford course of the same name. We're also joined by Nicole Wong, who previously served as deputy chief technology officer of the United States in the Obama administration, focused on internet privacy and innovation policy. Prior to her time in government, Nicole was Google's vice president and deputy general counsel and Twitter's legal director for products. Please join me in welcoming everyone to the stage.

I'm going to get us started with a general question about the framing of the moment we're in that Hilary described at the start. The issue is to take stock of the early optimism — almost utopianism — that attended the tech industry at its founding, chiefly here in Silicon Valley, and the moment we find ourselves in now, in which there are great fears of the tech industry producing a kind of dystopia.

So for our guests, I just want to read representative examples of some of the expressions that have been offered. The following was said by President Ronald Reagan in 1989, just a few months before the fall of the Berlin Wall — quoting from him: "The Goliath of totalitarianism will be brought down by the David of the microchip." I believe, he said, that more than armies, more than diplomacy, and more than the best intentions of democratic nations, the communications revolution emanating from Silicon Valley will be the greatest force for the advancement of human freedom the world has ever seen. A few years later, George W. Bush asked the world to imagine if the internet took hold in China — just imagine how freedom would spread. Rupert Murdoch said, quote, "Advances in the technology of telecommunications have proved an unambiguous threat to totalitarian regimes everywhere." All right, now fast-forward to 2018. I'm going to read you the opening paragraph of an article I just came across that captures the sour mood of the day. Quote: "Mobile technologies distract adults and depress kids. Twitter is mostly a bad platform full of Nazis and assholes. Facebook and YouTube blend cheerful upworthiness, pathological self-revelation, and malignant deceit, delivered to hyper-personalized, addictive filter bubbles optimized to capture our attention for sale. The open web is a vast soul-sapping wasteland of sexual depravity and medical malpractice. And the Internet of Things is nothing less than an emerging panopticon. And behind all of this good work lies the disruptive innovation of a small band of predatory, plutocrat-enriching monopolies inspired by a toxic mix of swashbuckling ambition and moral preening." Oh man. Yeah — end of quote. Nicole, Reid: what happened in these thirty years between those two quotes?

That's a great setup. We're going to try to have you leave happy tonight — you've set the bar really high. So, I actually wasn't sure what you were going to pick. I wrote down a quote that you'll probably remember — John Perry Barlow's "A Declaration of the Independence of Cyberspace," right? I could have started there. So here's the part I latched onto: "You do not know our culture, our ethics, or the unwritten codes that already provide our society more order than could be obtained by any of your impositions." The optimism of the belief that cyberspace was going to be this borderless place where we would all rule ourselves — because we knew exactly how this was going to play out. You look at that now and — [inaudible]. And even in the laws that we eventually put in place that followed — so, for example, the Digital Millennium Copyright Act, or Section 230, which is highly debated now and which regulates the content liability for platforms — in the conversations in Congress around the development of those laws there was a real sense of: we have to protect the internet. It is a place for growth, it is a place for innovation, it will be a place for individuals to express themselves freely. So the laws were actually generated with the notion that this is a fragile ecosystem that we should protect. And we have now come to this place where all we want to do is shut things down — or at least that seems to me to be the mood in D.C. right now.

Just apropos of that point: I was part of a committee here at Stanford maybe five or six years ago — every 25 or 30 years they put one together to review the quality of the undergraduate education — and when this happened five or six years ago, one thing that occurred to us was, well, how has technology changed the experience of students at the university? We polled the faculty about what they would wish for with respect to technology at Stanford, and the main thing the faculty expressed a desire for was an internet kill switch in every single classroom.

Look, I think part of the question is — I describe myself as a techno-optimist, not a techno-utopian, which means that just because you can build the technology doesn't necessarily mean it ends up in the right place; but actually, in fact, technology is more often the answer than the problem. And the way that you look at it is to say: we definitely have a bunch of different kinds of birthing problems, because exactly as Nicole said, in the initial early days of the internet — John Perry Barlow and everyone else — we said, well, we used to have this very centrally controlled media channel, controlled by large companies; how do we give freedom to individual voices, how do we allow people to express themselves?

I mean, one of the things I was really delighted about in the blogging arena during the Iraq war was being able to read blogs from Iraqi civilians and so forth — the humanizing and connecting, and figuring out the right way to do that. So there's a lot of good there, now and in the past. And then the question is: well, how do you reshape it? And the reflex to say "kill switch" is a reflex I don't think is right. I think the question is to say, okay — and by the way, we have this challenge, of course, which is that as you're making social policy you have a lot of different conflicting interest groups with this interest or that. One of the challenges Facebook has right now is that you have forces, generally speaking on the left, saying "how do you allow all this fake news and these filter bubbles," and you have forces on the right — just like the tweet from President Trump — saying "you're silencing conservative voices," and suddenly you're getting fire from both sides, and resolving that is challenging. And so I tend to think that you always have to refactor technologies as you go. And I think some of the questions — which we'll be addressing some tonight, and ongoing — are: given that hard problem, what are the right questions to ask, and what are the puzzle pieces of things we might be considering? Because, again, you go to your opening slide: confidence in government setting the policy, an average score of three — I was curious what would have happened if you'd included zero in your survey — and confidence in companies at five to six, which, relatively speaking, is much higher, but still has challenges and problems.

I have a model in my head for why we got here, and I'm actually interested in whether you saw this in your businesses as well. If you think back to the development of the internet in the early days, twenty years ago, it was largely here, right? It was largely US-centric; the companies building the products were here. It was largely being used by the US, Europe, Japan — and Australia and Canada to some extent — but that was pretty much the set. So that's a set of countries where our values are largely aligned, our judicial systems are largely aligned, and we are mostly in the same place in terms of our treatment of freedom of expression and privacy. And then, in my experience at least, around 2005, 2006, 2007, when internet penetration in some of the developing world — Brazil, India, Russia, China — really started to take off, you end up having this audience and these communities coming onto the platforms that had previously been dominated by, I'm going to call them Western democracies as a shorthand, that have really different values, that have really different systems of adjudicating fairness and individual liberty. And I think that caused a real conflict, both at a policy level and at a technical level, of: how do I deal with this? Because in my experience with engineers building for these platforms, I'd say, "so where are you going to launch this product," and they'd be like, "FIGS CJK," which stood for French, Italian, German, Spanish; Chinese, Japanese, Korean. And I'm like: those are languages, not countries. So we have to talk about the countries, not just the distribution.
But if I can push on that a little bit, Nicole, as a sort of frame for thinking about when the challenges began. Let's take the Western advanced democracies — the United States, European countries — maybe the initial engines, beginning here and spreading into Europe. There are pretty dramatically different perspectives about how to regulate technology companies even among that set of countries that presumably, as you described, share a set of common values. I don't know if folks have been to Europe recently, but since the adoption of the GDPR, a key legal framework in Europe, what you'll find is that when you turn on your phone and try to use Google Maps, or you go to some website, you are giving all sorts of permissions for every single step you take and are being told all sorts of things about how your data is going to be used — things that are in the background of everything in the United States.

So even among these countries with shared values that facilitated this growth, there are just dramatically different perspectives about the ownership of data and about the ways in which technology companies should be governed. How do you think about the origins of those differences — what drives those really different perspectives? And from both of your different roles, at companies and in government, how do you think about what the implications are for the United States, as we grapple with some of these challenges, of how Europe has begun to move?

So — you're right that the differences, even between the countries at the very early start, were there. And to me, the reason it was a little easier to deal with is: if you were going to get a government demand in Europe, it was probably going to be a law enforcement official who had credentials, who was going to present you with legal process, and there was a judicial court that was going to deal with that. You go into India or Brazil, that's actually not how it's going to go down — it's going to be cops at the door of your office waiting to take the servers away, right? So it was just a different moment if you were at a company trying to manage the legal issues around that. But the differences in country law have always been there. The privacy dimension in general — yes, Europe, because of its history and the use of data against its citizens, particularly during World War II, has a lot more sensibilities around how data gets collected and used and governed than the United States, where we have largely thought of it as a consumer protection issue as opposed to a human rights issue, which is how it's framed in Europe. And we have largely let it be regulated at the Federal Trade Commission level — with a commercial, economic, and consumer idea behind it — again, as opposed to a human rights level, as it would be in Europe. Europe also has things we've always dealt with: in Germany and France it's illegal to traffic in Nazi memorabilia; here it's a First Amendment issue. And so you've had to deal with those types of differences. I've found the challenges to be — you can sort of do it at the margins, like, okay, we're not going to have hate speech on the platforms, or whatever.

But once you start getting into countries where there's a radically different sense of whether it's permissible to have LGBTQ content on your site — it's clearly illegal in these countries — what is the resolution going to be? And when you build a platform that's global, you have to make choices: either the content comes down, or you're IP-blocking it, or you're taking some other step in order to be able to be in market.

And one of the challenges when you look at the differential European approach — ask for permission more than ask for forgiveness — is that it's among the reasons why, even though there's a ton of talented technologists, great technical universities, a whole stack of things, very little of these world-changing technologies comes out of Europe at global scale. I think they have all the talent and everything else — you have Skype, you've got Supercell, you've got a couple of others — but it's a relatively short list. Consider where browsers are built, who created the networks, think about search engines, think about all these sorts of things. And part of the relevant question is: what's the way we get to developing them? Because if you say, well, we should shift more towards Europe, that may put us in the position Europe is in relative to the rest of the world, and then the rest of the world will develop those technology platforms — maybe Asia, other kinds of places — so you have to balance that out. Now, as part of being a techno-optimist, I actually think these things can be refactored. So I tend to be in the camp of: look, there are some critically bad things, like medical stuff and other kinds of things, where you say, no, no, get it right first. But there are other things where you look at it and say, okay, that was a problem, now let's change it. And so when I go over to Europe and talk about this, I tend to advocate for more of a refactoring — an ask-for-forgiveness rather than ask-for-permission approach.

If I can ask: where do you draw that line? How do you think about what harms you're willing to accept in order to have that flexibility?

So, in this book that I published last year, in fall quarter — Blitzscaling — I described three kinds of risks that a blitzscaling company should track and pay attention to. Part of the book was: here's how technology companies in the future are going to be shaped by a set of techniques that we learned here in Silicon Valley, also practiced in China, to help other areas of the world; but another part was: how do I help evolve the practice of how we're doing it? One is when you have serious health implications for individuals — so an example I wrote an essay on as a failure case was Theranos. You say, okay, you're making bad blood tests — you shouldn't be blitzscaling that, you should be tracking that.

Another one is if you have a moderate impact across a large number of people — that's something you should consider too; or if you're going to break a system, like a payment system or a logistics system. But within that basis, generally speaking, you want to actually try it, see it, and refactor it. So, for example, if you say, "well, these scooters are cluttering up our streets, we should have had them ask for permission first" — look, people do have to be careful about what it means for possible accidents and fatalities and so forth, but the cluttering of your streets with scooters is actually something you can refactor. So that's the framework I sketched in the book. I was addressing business people and the entrepreneurs doing this, but it's also a framework with which you can think about it as a general society as well.

I want to pick up, if we can, on one of the threads Nicole introduced, which is this idea of complying with various government requests when the range of values, or the legitimacy, of that government might be different in different contexts. Right now there's a lot of debate on this topic: you see Netflix taking down the comedy special that had to do with Saudi Arabia and Mohammed bin Salman — they chose to take it down; you have people protesting, or at least asking questions about, whether Google can or should re-enter China given that government's record on the way it might use the technology. How much room is there for an American company to assert its American values when it is trying to operate at a global scale, in global contexts, in areas that have these different values?

Yeah. So I think part of it is: stop thinking of yourself as an American company — you're going to build a global platform, so think of yourself as a global platform and adjust accordingly. And I say that as someone whose training is as a First Amendment lawyer. I wanted to be a lawyer so I could pound the table about the First Amendment. And being at Google, I had to readjust. So when we acquired YouTube — what had largely been an internet you could cabin to a market, because there was a language barrier — once you get into video and images, it's visual, so it's global automatically. An audience that was intended for Turkey is now being seen in Greece and elsewhere, and the sensibilities are really different. One of the issues I had early on with YouTube: we were global, and someone posted a number of videos about the King of Thailand, and they were offensive — pictures of him with a foot over his head, which is extremely offensive, with his face made into a monkey, things that were clearly intended to defame the king. And that is illegal in Thailand; they have what's called a lèse-majesté law that makes it illegal to criticize the king. So if you're an American with a First Amendment background, you're like: what are you talking about? The whole purpose of free expression is to be able to criticize your leader. And so the question came, because we were contacted by the Ministry of Information in Thailand, and they said: take down these videos or we will block you.
I actually then traveled to Thailand to try to understand what was going on — to meet with their government officials, to meet with our own U.S. Embassy to get some background, to meet with people on the ground in Thailand and understand this law. The most salient thing I was told actually came from one of the diplomats from the United States, who told me: this is a country which over the last 27 years has had 21 coups. The king, who was approaching his 80s at the time, was the only source of stability in their country for a generation. So it means a lot to have insulted him, and that is felt not just at the highest levels but all the way through the country.

And so for me, as a First Amendment lawyer, I had to back away from my sense of "you should be able to revolt against your government, to have the ability to criticize it," and start to think from their perspective: what is appropriate speech that creates civil discourse, what is the thing that is essential to that culture — and be respectful of that, because I am a global platform. At the time, 2007, that felt like a really big leap for me to make; and now, knowing where we are in our current political atmosphere, I'm like — okay, so it's not just isolated to certain countries.

Is that the same basis for you, Reid?

So — I agree with everything Nicole said. I would say one thing, and it's not so much about promoting a specific set of American values, but I do think that one of the things companies should do is say: look, this is what we stand for, this is who we are, this is what we're fighting for. In the LinkedIn context, it was: we're fighting for individuals' ability to take control of their own economic destiny. We don't as much address freedom-of-speech issues, because we said, look, that's not necessarily tied to it; but what is tied to it is whether you can represent what your skills are, what your ability to work is. So if you have a country that says, "oh, these people aren't allowed to work," or whatever — we would still put their profiles up, we'd still facilitate that. So I think it's good to say: these are the things we stand for, we're going to be clear about them, and then to navigate that. And as part of that, for example, I think we kind of surprised the Russian government: they were like, okay, we want all this data, because we want to be able to have control — and no, we protect our members, so we're not going to do that, and you can do what you need to do, because we're not going to hand over individual data in bulk without some sense of protection here. We'd rather not operate, if you force us not to operate.

But herein lies a fascinating point, which is that it's not an example of asserting American values — it's about a company having its own values. And actually, when we think about the identities of these companies: maybe they happen to be hosted here, but they have a self-conception that isn't tethered to the geography in which they happen to be rooted.

Well, to push on that point a little bit — the notion of a company having its own identity: you could think of the company as having an identity, but a company is also made up of a bunch of technologists who have their own views about the work they're doing, which they can either communicate to the management of that company or actually encode, as a set of values, into what they build at that company. And so when you think about the different leverage points that way — what an individual technologist can do, what the company wants to do more broadly in terms of fighting for its values, what happens when journalists want to shine a light on a particular issue — where are the leverage points? How do you see, running a company, or as a regulator, navigating all these things to try to come to some sort of place that you think is leading to a better outcome, when there are all these competing leverage points?

Well, I know that Reid has done this, because he and I have worked together over the years.
So one of the things — as a company, and as the leadership of the company — is that you find your North Star, right? Which is: what are the essential parts of us as a company, the things that really mean a lot to us and that we would be prepared to defend? And some of the work I do now is talking to young companies about — forcing them into what are pretty hard conversations — like, what would you defend if it came to it? Is it the open internet? Is it the privacy of your users? Is it free expression, the ability of people around the world to speak — whatever that is. But knowing that helps you create your strategies going forward.

And then from that, once you have the North Star, there are very tactical, operational decisions: which markets do we enter, with what products do we enter them, how do we build those products? And then it's really the tail end — the things I was talking about, like taking down certain videos — where you're at the very edge of it, because you made the hard choices up here.

And part of what you do is build the company culture around it. So I think it's really important to say: here's where we would rather fail, here's where we would rather shut down, this is who we are — and if that's not what you want to do, this is the wrong company for you to be working at; you should go work at a different company. So I think it's really good for these companies to say: this is what our mission is, this is what we believe in.

Good — let's take another test of that. One of the things that's been in the news about some of the companies that you either work at or are on the board of — Google and YouTube in your case, Nicole, or Microsoft in your case, Reid — is the question of whether the technological developments or products these companies are advancing should be sold to, or put to use for, military purposes, especially on behalf of the US government. These are, after all, US companies. And I think most people here in the room know about the Project Maven issue at Google. So I take your statement just now, Reid, as: well, you should know what you're getting into, and maybe you shouldn't be working at the company. So the employees who rebelled against Project Maven at Google — your answer to them is: quit, go find a different company to work at?

Well, Google has to say — and it's not my place, because I'm not at Google — but do I say "this is who we are, this is who we're not"? Right. Now, if it were me in that position, I'd say: look, we are an American company, with responsibilities to the American state and to the citizenry and society, and some of that is making society safer. So we do actually have an obligation to interface with the DoD in some ways, and there are things we can do that are Pareto improvements — the world's better off and the US is better off — and we're going to go do that; and if you're not comfortable with that, maybe you should work somewhere else. That would be my answer.

Well, how about Microsoft — Microsoft has facial recognition technology. Could it sell it to non-democratic governments? Should it?

Well, this is a place where, as a board member, I should defer to the company rather than speak for it. But I do think companies have a responsibility to think about this, and Microsoft does — Brad Smith and the team there are very high-principle folks — saying, look: how do we think this is going to be used?

Do we think it's a good use or not? And if not, then you don't do it, right? So, for example, if a country X said, "well, we want to develop the following kind of bio-weapon, and you happen to be in our country and we want you to do that" — the answer is no; that's a bad thing, it's like a disaster. Now, if they say, "we think a facial recognition system that works in airports for terrorism will be part of how we operate; we're going to have a well-organized legal and judicial system, we're going to organize it through that, and it will actually make society safer" — then, as long as you're... you know, I'm kind of a big fan of well-organized democracies — we can have a discussion about whether we're currently in one or not — but then that's a different question.

Okay. Can I follow up on one of the things Nicole said, which is this notion of a North Star? You're talking to startups, and I think that idea resonated with both of you: you should know what lines you draw. For a lot of us who aren't in technology companies but who may be users of the products generated by technology companies, or citizens affected by the impact of technology companies — is there some obligation for companies to communicate what that North Star is? Because part of — you know, the example Hilary gave with Facebook's initial mission statement, and you can look at the evolution of Facebook's mission statement over time; we could take Uber as another example — we are discovering what the North Star is over time, and it's quite at odds with what had perhaps been communicated about what the North Star is. So how do we think about that? Is it internal guidance for the company? Is it something that, for the purposes of investors thinking about investing and buying shares, ought to be communicated much more clearly and publicly? Of course, part of what's hard is that it's an evolving thing, as you think about entering new markets or confronting new frontiers — but help me understand it from that perspective.

Yeah. I mean, every company is a little bit different, right? But I do think the exercise for your executives — and actually, frankly, to make sure that your executives are aligned — is to get them on the same page in terms of that North Star, and that's a really hard conversation, in my experience, to get everybody in one place. And then it becomes, usually in my experience, part of the things you communicate outwardly: you're either communicating it by virtue of the way your products work, or because you have a mission statement, or a filing with the SEC that you have to do. So typically that would be the case. Now, you do have companies that say one thing and do a different thing — which is a separate problem, I think, from determining what your North Star is going to be.

And so, yeah — look, I don't think this is something you can easily put into a policy or a law, but I think it's much better, and much more principled, for companies to say: this is who we are, this is what we're doing, this is what we're shooting for. Sometimes, of course, you have secret plans you don't reveal to your competitors.
But part of it — for example, in the very early days of LinkedIn, the kind of cultural ethos we established for the whole company was: if what you were doing was printed on the front page of the New York Times, could you go to your friends and say, yeah, that looks like a reasonable thing to be doing? That kind of thing is a way of doing it and of building that culture from very early on. And, you know, I'm a critic of — well, I think it's okay to advocate for different regulation by building a product that embodies it, where you say: actually, I think the way our society should change — take Uber — is that we shouldn't be locked into taxi monopolies; we're better off with logistics as a service, it'll make society better. That, I'm strongly supportive of. The "we're now going to hack the app so that it looks different to government officials as they come after it" — that they shouldn't do, and they should be hit with a stick for it. So it's a bit of both on these things. The point is: if they say one thing and do another — bad.

But also, you should be advocating for a different regulatory stance, not trying to fake it.

I'm going to push one more time here, because, like Jeremy, I'm not an insider at a tech company, and I can listen to and hear the wonderful statements about organizational culture and having a North Star and leading with your values — sounds great — and then the reality, whether four years ago, ten years ago, or now... What occurs to me, however — we're sitting here in the Graduate School of Business — is that the North Star of every company is to make money. It's to maximize the profit they can make, and that's what the incentive structure is, in addition to the values that are put forward. So maybe the current example to put on the table is Juul — this vaping product created in the design school here at Stanford, with what seemed like a great value proposition: here was a way to get people off the carcinogenic properties of tobacco while keeping your nicotine addiction and then managing it. And instead, it turns out that over time they marketed it to a bunch of teenagers with pineapple and candy-cane flavors, and now there's an epidemic of kids vaping. So the way I explain that is not that the company never had a North Star, or lost its North Star; the way I explain it is that they wanted to crush their competitors into the ground and make a lot of money, and they found a way to do it.

So — I'm by no means defending the idea that all corporations are saints with halos. And I do think that one of the problems in every area of society is that you have structural incentives that lead to some kinds of corruption. But definitely, overall, the collection of business and profitability incentives has led to a lot of productivity, a lot of products and services, jobs, greater prosperity, medicine, all the rest. So, generally speaking — I'm not against capitalism; I think you'd kind of have to be a fool to be. Now, do you manage capitalism? Yep, right — let's deal with child labor, let's do the other things; those are important. So it isn't a panacea just to have a mission statement. And there are a lot of businesses I don't invest in, even though when I look at them I say, okay, that could be a good, profitable business with an interesting equity position, because I think it has the wrong impact. Now, I'm in a privileged place — I have a lot of other investments with which to make money, so I can make that decision, and it's harder in other circumstances. But the other thing to say, for the audience, is that there's a set of judgments about: yes, you're maximizing profitability, but you're also building brand longevity long term. So there's a set of market incentives around maintaining trust, and doing things by which people say, okay, we think you're good custodians, we like you, and so forth — and so there are a lot of corporate social responsibility programs and other kinds of things going on. So when people say "they're just trying to get every last penny" — well, there are some organizations doing that, and part of the function of media and other things is to call it out and make sure that markets respond to that in the right way.
And sometimes government and policy as well. But on the other hand, it doesn't mean that when people say things like "look, what I'm really trying to do is make the world more open and connected," they aren't genuinely trying to do that — I think they are.

And these things are really hard, right? I think we are not served well by trying to dumb down the ethical questions. The North Star is a point of reference — as a North Star is actually supposed to be — but in the actual application of it the conversation is nuanced, and the factors you're trading off are difficult. The Google Maven question — there were really important arguments on both sides of that. I think that's also the case with a decision to go into China. It is an ethical question because it's hard. And so, to me, the obligation is to have the conversation.

And maybe in some of these companies, part of the employee revolts in some cases is that that conversation is not being had well internally, is not being had transparently, is not being communicated well — and so you end up with a reaction to it. There are sometimes when executives have to keep things under wraps for a certain period of time, but at some point you have to be able to explain it.

Let me follow up on that, because you make a really good point. I'll just throw out an idea, to posit it and see what your reaction is. We have these slides showing that in ten years there have been these changes in technology, and there's a North Star of the companies having some positive mission. The idea I would posit is that the North Star of these companies did not change — the public perception of their North Star changed. And so at what point does a company actually say: what we believed we were doing is not what we're actually doing, and maybe we need to change our North Star and communicate that? Does that happen?

So, I think it almost certainly happens. One of the ways I look at what was happening with the tech industry — part of what I put in Blitzscaling — is that it's a transition from being a pirate to a navy, right? You kind of start out swashbuckling — there's an element of disruption — and then you eventually get to a more structured and regulated system. And ultimately, part of what's happening is that these companies are moving with stunning pace towards being part of social infrastructure, being charted into that kind of infrastructure, and that now adds society as a customer too — as a stakeholder, at the very least. And so, look, the thing I was trying to encourage, with some of the stuff I put in the book and in the podcast, Masters of Scale, and the blog posts, was to say: as you get to massive scale, you start thinking about, okay, what is my intentionality around social responsibility? And take something less charged than the questions of filter bubbles and so forth: you do have this issue that, frequently, a lot of online spaces are not really safe for women. Okay — what do you do about that? That's not okay; that has to be fixed. And once you're social infrastructure and everybody's there, you've got the whole range, so you need to figure out how to make it a clean and well-lit space in all the areas that are important to go to as a citizen. And I think that's now beginning to mean: we have responsibility not just to our individual customers — who are showing us by their choices and behaviors — but also to society.

So Reid's book is called Blitzscaling, and I want to put Nicole on the spot a bit. You know, she drove down from Berkeley today, and she has been recorded in a podcast making the case for a "slow food" movement for technology, as an alternative to the blitzscaling approach — that is, the artisanal gardener from Berkeley, right? And I want you to talk about where that argument is coming from, how you think it's different from the "get out there, break things, and iterate" sort of mentality that we have with technology — disrupt and then adapt. What does it mean to go slow, and when should we go slow versus go fast?
Yeah — thank you for the question. So the conversation I was having, around "shouldn't we maybe have a slow food movement for the internet," was actually with Kara Swisher, and the context of it was Russian disinformation and the poisoning of a lot of social platforms with content that's just coming at us too fast. So let me back up a little bit and sort of regurgitate that conversation.

What I was trying to talk about was: what are the values that we're embedding in these social platforms that we build — or any platforms that we build? I start with the model of when I was at Google. I started at Google in 2004; the only thing we had was search. It was kind of an amazing thing — remember when Google was just a search box on a page? And the pillars of search, the design elements of search, were comprehensiveness, relevance, and speed. Comprehensiveness: we wanted all the content we could find on the internet so we could fill the library. Relevance: we would understand the question you were asking so we could deliver the relevant result that would answer it. And speed: because we knew we would keep you going to more places, getting better responses, if we could just speed up the way we delivered the results.

Around 2005, 2006, 2007, we had acquired YouTube, and the social graph — the social networks — were on the rise. Behavioral advertising — the targeting of advertising based on who you are, where you look on the web, what you seem to be interested in — started to grow. And so the pillars of how we build these platforms, the metrics of our success, changed from comprehensiveness, relevance, and speed to engagement: can I keep you here, can I make you stay and watch more things or click through more pages? Personalization: not delivering you the relevant thing, but delivering you more of the things you like. And speed — we're still on speed. And to me, those elements of engagement and personalization are the rocket fuel of all of those outrageous tweets and Facebook posts and YouTube videos that just tunnel through your day and make you lose time as you sit on the internet — and, some sociologists argue, have radicalized certain people into even more polarized positions. So when I was talking about "should we slow this down" — what if we decide those are not the pillars I want to live with? The type of platform I want to sit on is one that values content which is authentic. Not that I need to know exactly who the author is, but at least I know it came out of 4chan originally, or Breitbart, as opposed to the New York Times — just, what is its provenance? So: authenticity; accuracy, by some metric; and the context of how to understand a piece of content. If I value that, how do we design for it? Can I take the ads optimization team that is doing so much for Facebook and Twitter and YouTube and port them over to that particular task for a little while — because then what would we end up with?

So that was the concept: can we get away from the models we're running right now — which are so viral, built on virality and this high-speed motion — and create a place where you could have what my friend Jeff Rosen at the National Constitution Center calls a Madisonian moment: a reasoned public discourse?

Good. But so you were there at that founding moment, right — and I don't know if you would characterize the shift from relevance, comprehensiveness, and speed to engagement, personalization, and speed as a change in the North Star of Google or an evolution in a slightly new direction — but, if you're willing, tell us a little bit about what kinds of debates went into that direction. Was there any anticipation of the consequences of moving in this direction for the quality, the authenticity — the things you're saying we ought to have approached more slowly and thought about? Were these things actually weighed in some serious way, the consequences anticipated, and some judgment made by somebody that this was the right direction to go, and for what reasons?

So, my recollection of it — and it's been a number of years at this point — is that it was too early to tell. We didn't know. Personalization felt like the right thing; it felt like a companion to relevance in a bunch of ways. Which is like: hey, if I know that you typed in "bass," but I also know that you really enjoy fishing and not guitar playing, I'm going to deliver you search results that are more personalized to you — and therefore more relevant. And engagement seemed to feel like: hey, if they're engaged, if we're measuring how much time they're spending with us, that feels like they're giving us some love; that feels like we must be doing the right thing, because they're still here. And I think, in the original conception of shifting those design pillars — which are not North Stars, I think, but design pillars — it felt like it was still aligned with the mission. And it was not until we got to scale that the detriment of it arose. I also think, honestly, I still believe in a free and open internet — I'm an optimist as well. What I feel naive about is that we were going to get exploited for believing in a free and open internet — that we would be weaponized by allowing our platforms to be broadly available — and we didn't prepare those defenses at all.

Because, look, when you get to societal infrastructure — the same reason we have laws, we have police forces, we have a bunch of things — as you get to infrastructure, you have to realize it's a different game you're playing, and that's actually part of what I think everybody's been learning in the last five-ish years. And there's a subtle thing here that's really important — one of the reasons it's great you guys are teaching this class, and part of the reason I agreed to be here. The subtlety is that a lot of this stuff happens where, one, as Nicole says, you can't predict it in advance; that's part of the reason I tend to be a fan of experiment, measure, refactor, as opposed to trying to plan it all out at the beginning, because you just don't know where it's going to go as you get to this infrastructure. But the second thing is the subtleties around what happens when you're a product manager managing to a dashboard, to a number.
and to a number. Part of what we need within the profession of product managers, program managers, and engineers is to say: okay, look, we are maximizing numbers, because you maximize engagement, you maximize revenue; that's the natural function. In fact, I kind of wish more government systems had dashboards they were running efficiency targets on. But we also have to realize that those numbers can mislead you in some bad ways. You say, oh, how do I get more clicks? Well, maybe what I'm actually doing is getting you more outraged, and when you think of society as a customer, we don't want more and more people to be outraged, so we should figure out how to get more clicks without using that, by tuning which numbers you're actually looking at. And it isn't necessarily a plot by the company; usually it's unintentional. It's a product manager just going, oh, my job is to increase monthly engagement by five percent, that's what I've been tasked to do. And you go, okay, yes, that's fine to do, but do it with some thought to what the broader context is as you move into infrastructure and become part of society. And that's part of it too, as you get there.
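As a hedged illustration of the dashboard point above, and not a description of any real company's metric: a product manager's objective can be reframed so that raw clicks are discounted when they appear to be outrage-driven. The outrage classifier and the exact penalty below are hypothetical placeholders.

```python
def adjusted_engagement(clicks: int, outrage_fraction: float, penalty: float = 2.0) -> float:
    """Hypothetical dashboard metric: clicks, discounted by the share of
    clicks a (hypothetical) classifier labels as outrage-driven.

    clicks            -- raw click count for the reporting period
    outrage_fraction  -- 0..1, estimated fraction of outrage-driven clicks
    penalty           -- how strongly outrage-driven clicks are discounted
    """
    outrage_clicks = clicks * outrage_fraction
    healthy_clicks = clicks - outrage_clicks
    return healthy_clicks + outrage_clicks / penalty

# A team tasked to "grow monthly engagement by five percent" would then be
# measured against this adjusted number rather than raw clicks.
print(adjusted_engagement(clicks=10_000, outrage_fraction=0.3))  # 8500.0
```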

Let's push on this just a little bit, because maybe you can give a concrete example. I like the idea that as these companies have grown and become global, they constitute social infrastructure, and at that scale society becomes, as it were, a customer too; there's an obligation there. All right, so the product manager, or the developer working under the product manager, has a dashboard that's giving all kinds of indicators on, say, marginal improvement in time on platform or whatever it turns out to be. What does the dashboard look like at the product-manager level for social infrastructure?

Well, I think part of what you look at is this, and by the way, part of what these platforms need is more legitimizing guidance from society. Because remember when some Facebook engineers ran experiments on whether they could make you happier or more depressed, people were outraged: you're manipulating us. But part of what you're saying is, well, we'd like to have less hatred, we'd like to have less of this, we'd like you to go back to those experiments you're doing and tune them in this way. I need some guidance from society that says: yes, look, you can use modern AI techniques to do sentiment analysis, and if what you're surfacing is some version of aggression speech and the like, we'd like you to tune down how much of it there is; we'd like you to have on your dashboard the sharing and the volume, the percentage of it, and we'd like you to tune that down, make it a little less shared, so that maybe to reshare that item requires a thousand likes versus five hundred likes. You tune it in that way: while you're still trying to get engagement up, you're now trying to make it maybe slower, though I'm not sure slow is the word I'd use; you're trying to make it more civil. But they need guidance for doing that, because precisely one of the reasons they tend to retreat to saying, hey, look, we're just running the simple algorithm, is that they don't want to be accused of trying to manipulate. And so what we're saying as a society is that we actually do want you to manipulate in certain ways that we think are better for society. Now, that returns us to: okay, who is well organized to say that, who has the competencies for doing that?

And to your operational question: I understand your class is about two-thirds CS, and for the other third I don't know who you are. I am an American Studies major, an English minor, a poetry fellow; I have every liberal arts card stacked against me right now. But maybe we need more of us at the table. When you design your business, when you design your product, you need more ethicists, anthropologists, people who are filtering ideas to you about not just whether there is a good product-market fit but whether there is a good societal fit.
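One concrete reading of the reshare-threshold idea raised above (a thousand likes versus five hundred before easy resharing), sketched with hypothetical numbers and a hypothetical sentiment model rather than anything described in the session: content scored as aggressive must clear a higher like count before one-click resharing is enabled.

```python
def reshare_threshold(aggression_score: float,
                      base_threshold: int = 500,
                      raised_threshold: int = 1000,
                      cutoff: float = 0.7) -> int:
    """Return how many likes an item needs before one-click resharing is enabled.

    aggression_score -- 0..1 output of a (hypothetical) sentiment/aggression model
    cutoff           -- score above which the stricter threshold applies
    """
    return raised_threshold if aggression_score >= cutoff else base_threshold

def can_reshare(likes: int, aggression_score: float) -> bool:
    return likes >= reshare_threshold(aggression_score)

# Example: a post scored 0.9 for aggression needs 1,000 likes before resharing;
# a post scored 0.2 only needs 500.
assert not can_reshare(likes=800, aggression_score=0.9)
assert can_reshare(likes=800, aggression_score=0.2)
```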

Well, but more specifically, to do that you also need, and this is part of the reason I was advocating values earlier, because then people can decide to join the platform or not: here's what our values are, so there's freedom, a market of choice. But if you're now going to refactor after scale, yes, you do need some kind of permission or encouragement from society to do so. It doesn't have to be a hard line at that scale, but you say, okay, we'd like this, because, by the way, once you start implementing algorithms like th
