2023 Albright Institute: Laura Manley on Technology

It is my honor and pleasure to introduce Ms. Laura Manley today. Ms. Manley is the executive director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School, where she is continuing her longtime work of supporting democracy by leveraging technology and data for the greater good. Previously, Laura served as the founding director of the Technology and Public Purpose project at the Harvard Kennedy School's Belfer Center for Science and International Affairs, where she launched several initiatives to make technology more inclusive and equitable. She has testified before Congress twice on improving science and technology expertise and capacity in government. Earlier in her career, Laura co-founded the Center for Open Data Enterprise, where she worked with eight national governments on their digital economy policies. She has previously served as a senior consultant for the World Bank Group and the United Nations Department of Economic and Social Affairs. Laura lives with her husband and two young children in Arlington, and in her free time she enjoys cheering for the Red Sox, cycling, and figuring out how to keep plants alive. Let's give a warm welcome to Ms. Laura Manley.

Hello everyone. Well, I'm just so pleased to be here with you. I was really excited when I got the email from Stacy, so thank you for having me, and thank you all for hosting me here. When I read about this program and all of your profiles, I was just thrilled, so I'd really love to hear from you as well as share my expertise with you. I'd like this to be a conversation. I'm going to spend a little bit of time going over my slides and talking through some of what I've seen in my career, but this is not just a one-way conversation: if you have a question, or something comes to mind, or an addition you'd like to share, feel free to raise your hand and we'll talk it through. And then I'll leave most of the time
in today's discussion for us to talk together. Okay.

All right. So, as was mentioned in the introduction, I'm the executive director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School. But how did I wind up there? A lot of times when I've given talks, especially recently, people ask: how do I get to the point where I can lead a research center, especially one focusing on these topics? Did you know it in advance? And the answer is no, for sure not. My background actually started in mental health. I worked for the National Alliance on Mental Illness, and I did that for almost a decade, and it was actually my dream. When I was in middle school, they asked me what I wanted to be when I grew up, and I said I wanted to be the executive director of a mental health organization; maybe one day I would do that. I was committed to it all through elementary school, middle school, high school, and college, and I was lucky enough to do it by the time I turned 23. I really cared deeply, and I still do care deeply, about mental health. But I was 23, and I thought, there's got to be more out there.

Around that time, when I was executive director, I also started seeing a shift in the mental health landscape: people were starting to think about how technology could be used to reduce the stigma around mental health, and to make care more accessible to people in different parts of the world, or in places where they didn't have access to a therapist. So that's when I started thinking about technology, and I really started looking at how we think about complex issues of data sharing and privacy, very confidential data, especially on digital platforms. From there I went back to grad school, and I focused on tech and some thorny issues around data governance, privacy, and technology, especially for mental health. At the time, many government agencies were starting to think about data policies and emerging tech policies,
so then I went fully down the tech route. The important piece was that I was still able to focus on tech and how it worked for different communities, especially through the lens of data. That's when I went and started the Center for Open Data Enterprise, and since then I've continued working on tech, advocating for tech policies that put the human first.

So where am I now? I'm at the Shorenstein Center, and the focus of the Shorenstein Center is really the intersection of media, politics, and public policy. Why does my background relate to that? Well, for all of you, for everyone around the world, technology has fundamentally changed what we think about media and politics, so I was hired to bring that technology expertise in. We have four key questions at the Shorenstein Center that we work on. First, how is technology impacting media? Second, what is government's role in regulating technology, and emerging tech in particular? Third, what is tech companies' responsibility in regulating tech and thinking about society? And lastly, how does tech, in our new media ecosystem, affect democracy?

All right, so in today's talk I want to cover four things: the conditions and norms that shape today's technology; the benefits and risks; the impacts of those benefits and risks; and then the road ahead, what we do from here. I like to start with a story anytime I talk about tech development. Let's go back to the 1880s, when the Census Bureau was putting together its ten-year census. At the time, they estimated it would take about eight years to complete all of the data collection and get the results of the census. But because the population was growing so rapidly, they realized that by the time they collected all of the data from the 1880 census, it would only be good for two years, and by the time they did the 1890 census, it wouldn't be good at all; it would be outdated. So a young engineer named Herman Hollerith
created a tabulating machine, right here. This tabulating machine used punch cards; it was actually inspired by conductors on trains and railroads who would use punch cards to track people getting on and off the trains. Because of this innovation, the work of nearly eight years was cut down to three months, and this became the standard across the industry for how we started to think about big data. Now, Herman went on to found a very well-known company. Bonus points for anyone who can guess: what is this company that exists today? Any guesses? IBM. IBM, okay.

I start with a story because oftentimes when I come to talk about tech, people assume I'm immediately going to talk about robots or AI or the Internet of Things or something like that. But technology is not new; technology has been around for all of civilization, and according to the Oxford dictionary, it is the application of scientific knowledge for practical purposes. So there's nothing necessarily magical about technology, and there's nothing new about it. Now, throughout each period of history there have been certain facilitating conditions and norms that helped technological progress. In the 1880s, there were lots of mechanical tools becoming popular that made it possible to think about tabulating machines; there was the growing use of electromechanical relays, which Nikola Tesla eventually pioneered; and the term "business intelligence" was first coined as people started to think about using massive amounts of data, at the time just information on paper, to make decisions.

So if supporting conditions and norms facilitate technological development, what are the facilitating conditions and norms of today? Let's talk about the conditions first. First, and I'm sure many of you have heard about this, the exponential increase in digital data. But let me be a little more concrete: every single day, there are approximately 2.5 quintillion bytes of data created, every single day.
What is that equivalent to? It's the equivalent of ten billion Blu-ray discs filled with information; stacked on top of each other, they would reach the height of four Eiffel Towers, every single day. And it's estimated that approximately 95 percent of the world's data was created in just the past two years, and that trajectory is only going up.

Second, the ubiquity of mobile interfaces and increasing processing capabilities. According to a recent study, 92 percent of people in the whole world have a mobile phone. You might think, wait a minute, babies don't have mobile phones, but in many countries people have multiple mobile phones; so if we're just looking at the numbers, 92 percent of the population has a mobile phone, and 85 percent of the population has a smartphone with more sophisticated technology than was used to put the first man on the Moon. And then, of course, there is the growing power of artificial intelligence: it's estimated that approximately 90 percent of U.S. medium-to-large businesses are critically dependent on some form of AI for their operations. Does anyone use ChatGPT yet? All right, you know what I'm talking about.

I'm actually more interested in the norms. The conditions are quite interesting, but I think the norms are of equal consequence: these norms dictate behaviors that give power to the technological conditions and inform how tech is developed. Three of the interrelated norms I want to talk about today are our accelerationist culture, our understanding of specific users, and how we apply our values and priorities to the tech we develop. Okay, let's go through each one. Has anyone heard the phrase "everything that is not forbidden is allowed"? This comes from English law as a constitutional principle, and it's a deep and underlying part of the American innovation culture. The converse principle, "everything that is not allowed is forbidden," is associated with European law, and it's
very different how we see tech regulated, because of this fundamental difference in how our legal systems are set up. So at our core, the concepts of experimentation, innovation, and pushing the boundaries really define this country, our culture, and the tech ecosystem. What does that look like in practice, and what happens when you add an element of speed? "Move fast and break things": Facebook's original motto was "move fast and break things," then "move fast and build things," then "move fast with stable infra," and then just "move fast," and then they essentially decided to give up on the motto altogether. We've heard "fail fast, fail early." So innovation, speed, and scaling are fundamental to the way we think about tech and business success in this country. In fact, the founder of Y Combinator has said that rapid growth is the most important thing to think about, that it is essential for startups.

There's a consequence to that. What does it actually mean when the most important thing is speed? Here's an example: when you look at the venture capital community, particularly in finance, what's put on a pedestal is unicorn status. How fast can a company get to a billion-dollar valuation, regardless of how it got there? This is an example of a recent tech company, Bird, which reached unicorn status, a billion-dollar valuation, faster than ever. Here's another example: has anyone seen the Uber TV show, based on the founder and what the early days of the culture were like? This is actually a slide from one of Uber's early pitch decks about what makes someone competent to work at Uber: "super-pumpedness," which I can't really elaborate on while I have a microphone on. When we're moving so fast that we don't consider the unintended consequences of the product we're building, some pretty bad things can happen. The founder of LinkedIn, who is also a partner at Greylock, recently launched a book called Blitzscaling, which talks about the lightning-fast path to building and scaling
a multi-million-dollar company, and there are dozens, hundreds I'm sure, of books glorifying speed alone. But some things can't be reduced and simplified, like fairness or equality. When you set bad values at the beginning of a company that blitzscales, that DNA pervades the company, and especially when it's a tech company: technology just puts a rocket behind whatever direction you're going and makes everything faster. That's why a lot of the tech companies we've seen have had issues with ethics and with thorny problems; those issues exist regardless of how fast you're going, but the speed makes them worse.

Okay, another norm: understanding users. One of the great things that has come out of the tech space is human-centered design, and I'm actually a very big advocate of it: being able to understand very specific users, getting the specifics of what you're trying to accomplish, and matching that with users' needs. I think that's really great. What are the downsides, though? One of the companies we looked at for a case study a couple of years ago was Lyrebird, which focuses on voice-cloning technology. They say that if you give them 30 seconds of your voice, they can clone it with almost 100 percent accuracy in a couple of minutes. When you look at their website and ask who this is for, they say it is specifically for developers, to advance the field of speech synthesis. That sounds great, right? In an interview, though, when someone asked what a person trying to do nefarious things might use their tech for, the answer was essentially: that's not our primary user; we're focused on our very specific user; we've done a lot of research on our user, and that's why we've built it. In the tech community, this is when we say that a founder doesn't like to admit their baby is ugly. It's really hard for many people, especially founders, especially entrepreneurs
who have put in the research, to really understand how their technology or product could be used in nefarious ways, because that's not their focus; it's not their intention. Think about Facebook: Facebook was originally started at Harvard as a rating site, "Hot or Not," right? Who could have imagined that Facebook would turn out the way it did? I'm sure Mark Zuckerberg was not thinking, okay, I'm going to take over the world and create the new public square for the entire planet. When Henry Ford was developing the automobile, I'm sure he didn't think about climate change. However, it's important, as a technology develops and a company develops, to keep engaging with users and making changes as the world changes and as your product changes. This is one of the things we didn't see with Lyrebird, and that we haven't seen enough of with Facebook and many of the other big tech companies.

The last norm I want to discuss is values: the inherent integration of a developer's or founder's values and priorities with what they're making. Inherently, when we decide to develop and build something, we have to make choices. For example, as I said before, about 90 percent of medium and large companies in the United States are using artificial intelligence. What is the foundation of artificial intelligence? It's data, and it's an algorithm. How do you create an algorithm? You make choices; you create instructions. Oftentimes people will say things like, "well, the algorithm was biased," but we were the ones who gave the initial instructions, we trained it, and we decided what data to use to train those algorithms. And because artificial intelligence is so deeply ingrained in our products and services, in our companies and our nonprofits and how we do business, it has significant consequences, because it's making decisions like who gets granted parole, who
gets a loan, who gets a job interview, who is considered qualified.

A recent example, from a couple of years ago, of deciding on certain values involved Apple. Apple has always said it values individual privacy over everything else. So what did Apple do when the San Bernardino shooter had an iPhone and the FBI reached out and said, we need access to this information for purposes of national security? What do you do? Do you pick national security, or do you pick personal privacy? In this instance, Apple said: we are putting personal privacy above national security; we're not creating back doors for these phones. We can argue all day about whether that was the right call, but these are the questions we need to be thinking about, discussing, and being very careful about when we're looking at trade-offs.

Another example of a company that really leads with values, and it's a choice, is Mozilla. If you look at Mozilla Firefox and go into the settings, you'll see much more plain-language information about the settings, about how your data is maintained, how you're tracked, et cetera. That is a conscious choice Mozilla makes, and it's one of the best examples I hold up in the private sector. They've said that transparency, accountability, privacy, and safety and security are absolute priorities for them. Of course there are sometimes trade-offs when those values conflict, but they are very, very proactive in thinking about values when they put out their product.

So I don't have definitive answers on how to work through these thorny issues, on what's the right answer and what's the wrong answer. But if you take away one thing, especially if you're going into tech, the most important thing is thinking about unintended consequences: not just looking at the benefits of what you're trying to develop, but also thinking about the potential risks, and what happens if someone uses it who isn't supposed to use it.
How can this product, this tool, this service potentially be abused?

Now I want to turn to the media, which is where I am now. What does all of this technological progress mean for the media? Over the past ten years, we have seen traditional news outlets, local news, decimated, just going out of business across the country. Why? Because they can't keep up with instantaneous information, with the speed; they can't keep up with content tailored directly to you, where tech companies can show you exactly the content that aligns with your values and priorities, very specifically. There are so many downsides to that. You know, I used to be annoyed when my parents would sit there and have quiet time reading the newspaper every Sunday morning, but I appreciate it now in a lot of ways, because they were actually getting and reading content they would never have encountered if they only looked at their phones; it exposes you to all different kinds of things. What else? Everyone in this room, I think, is well aware of the rampant mis- and disinformation problem, especially in light of recent elections and the global pandemic. On top of that, we have the question of who is making these decisions. We talked about values and priorities: there is a tremendous amount of power held by the people leading these institutions, whether it's Mark Zuckerberg or Elon Musk, people who are not elected officials, and they get to decide what values and priorities are put in place for how the world communicates. And where does this lead? According to a recent Gallup poll, the U.S. has the lowest levels of trust it has ever seen in institutions, and in democracy at large.

Now here's the positive part, the more uplifting thing. Part of the reason I joined the Shorenstein Center was because I care so deeply about these issues, and this is one of the focuses that we
have. Our mission at the Shorenstein Center is to improve the information ecosystem, to the end of strengthening democracy. This is something we all need to work on, not just the Shorenstein Center, not just academia, not just government and business; everyone plays a role. We do this by looking at the information life cycle: we think about how information is created, good information, accurate information, but also how bad information, misleading campaigns, or intentional disinformation is created; how it's distributed across platforms or shared through nefarious channels; how it's then consumed and how people make decisions based on it; and how people potentially re-share it.

I'll just go through a few of the faculty members we have and the research areas we focus on. Professor Julia Minson focuses on receptivity to opposing views: how can people be open to those they inherently disagree with on whatever policy topic? It was really interesting to get her take before Thanksgiving. Professor Sharad Goel focuses on how information, and sometimes bias, is used in decision-making, specifically in policing. Professor Rogers focuses on how to communicate with busy people; I love that one. Professor Donovan focuses very specifically on mis- and disinformation campaigns and how that information is created and distributed across the web. And Professor Sweeney focuses on how to improve the information ecosystem overall and make access to information more of a priority for citizens.

So across all of these areas, I'm very optimistic, despite the bleak picture I shared with you about our tech ecosystem and our democracy. I'm very excited that people are starting to realize, and be more aware of, the technology and the power they have in their hands, and to make more intentional choices. So thank you very much.

One thing you want people to be able to do is think about the unintended consequences of a technological innovation. How do you see those,
because part of it is that they're very difficult to see? What's going on right now that you see limiting our capacity to see them? What could we be doing better to be able to see that this type of bad thing can happen?

Yeah, absolutely. I'm sort of contradicting myself here, but it's bringing in different types of users: bringing in people with different life experiences and different priorities, getting their perspective, and hearing from them how this would impact them in ways I would never have thought about. This is why inclusive practices, and having more representation in leadership positions, are so important: you get diverse perspectives, in ways you might not have considered, on how technology is developed. I will say, historically, if you look at people who have done forecasting or futures planning, we're really bad at it; humans are terrible at knowing what's going to happen in the future. Really, really bad. So the best thing we can do is bring as many diverse voices, with diverse experiences, to the table, to help us do our best in thinking about unintended negative consequences.

Thank you. So, I work in applied physics, but I'm really interested in ethical tech development, and I've worked with people who do a lot of really interesting work in machine learning at the Information Sciences Institute at USC. One thing I noticed is that when you're having conversations between scientists and engineers, it's fine, because everyone's on the same playing field in terms of basic education. But when you look at something like the hearings with Mark Zuckerberg, or communication with different wings of government, or even my own sponsors, many of whom were part of the DOE or the DOD, that really begins to break down. So what do you think is the link between scientists and engineers, with this really highly specialized education, and really being able to communicate, like, what does
this tech do, and how to explain that to people without that background?

I am so glad you asked that question; I have slides for it, and I swear she's not a plant. This is actually the topic I testified to Congress on, twice. If we are going to have better policies that actually make sense, and policymakers who aren't asking about the "tubes" of the internet or having "Tim Apple" moments, then we need better expertise in government to explain these things to the policymakers. And it is abysmal, abysmal, right now. When you look at our Congress, how many members, how many staffers, how many chiefs of staff have any kind of technical expertise? Hardly any. I'm happy to send lots of information on this, because there are deep, systematic reasons this is the case. One is the fact that the average salary for a staffer in the United States Congress is thirty-five thousand dollars. Another is that almost all of the recruiting for Congress happens among people studying political science, almost exclusively, and it's very, very challenging to progress into a policymaking position from the ground up if you don't have technical expertise. The other thing is that people don't speak the same language across different sectors, and it's particularly acute between policymakers and scientists, or policymakers and technologists. One of the things we did at the Technology and Public Purpose project was try to capture in a graphic why this is a problem and why regulation is late, and I actually have it here. Okay, so bear with me for a second. Here on the x-axis we have technology maturity. Underneath it I'm adding the scientific community, and for people in the science fields this will be familiar: you have basic science, applied science, product development, and then you have launch and
release to the public. Okay, then we add in the entrepreneur and investor community: how does that overlap with how a technology is developed in the scientific community? In the entrepreneur and investor community we have ideation, then proof of concept or MVP, then we launch, then we grow, then we scale, and then we cash out, right? Has anyone seen this, the innovation diffusion curve? The innovation diffusion curve describes how a new innovation is released to the public and adopted. For those of you who were standing outside the Apple Store waiting when the first iPhone came out, you were innovators or early adopters. Most of society doesn't come in until the early majority or the late majority, and the people still hanging on to a rotary-dial phone, you're the laggards. So when we think about how technology is introduced to the public, it usually doesn't start until we're past product development and on to ongoing product management, at launch.

Why is there such an issue with government regulation, though? All right, so on the y-axis let's put government involvement, or oversight, and now let's think of a couple of examples. This first one here: Facebook existed under the radar for a very long time, and it still doesn't have a lot of government regulation. Uber, the same thing. 23andMe: there was more regulation to begin with, because that area of science has had more regulation for a longer period of time. So you can see it started higher; then, when things really started to change, all of a sudden there wasn't policy in place and there was almost no government oversight again, but now it's going up again. So because we have the convergence of the investor community, entrepreneurs, scientists and technologists, and also government, all of our timetables for how we regulate are on
different time scales. You have that as a problem, and you have the problem that we're all speaking different languages, and it results in quite a big mess. This is one of the things in particular that we spoke to Congress about several times: these timelines need to come more into alignment. We can't only be creating regulation once there's a huge blow-up or a huge problem, and we need more advisors in Congress, from the ground up, who can inform members on this kind of thing.

Thank you so much for coming to speak with us. I had a question specifically on the intersection between values and artificial intelligence. I'm not technologically involved whatsoever; I consider myself technologically illiterate, to be honest. But I did do a little bit of research on AI being used to calculate recidivism rates of incarcerated people, and one of the big issues is that the algorithms that go into these AIs use historical data, and historically it is Black and brown people who are incarcerated at higher rates. Then you have these AIs take on that same role to calculate recidivism rates, the rates at which people are predicted to return to prison, and they just recreate that same historical trend. So how can you eliminate these historical biases when they are conditions of the value systems we've had for such a long time, and still create artificial intelligence with that same data? Is there any data that doesn't have these biases? Is it possible to get rid of them in any way? That is my question.

That's a very technical question for a technically illiterate person, so it's a very good question. One of the things I'm proud of at our center is that Professor Goel has actually developed an approach called Blind Charging, which basically takes the historical data set, strips out any kind of demographic identifiers, and then uses that to help prosecutors when they're
determining whether or not someone would actually be charged. A couple of years ago, Amazon got in big trouble because they were using historical data for hiring, and historically most of the people in senior positions were white men, so of course the system mostly recommended that white men advance. So yes, it's a huge problem, and there aren't any obvious fixes. There are some approaches, like anonymizing the data and de-identifying it, stripping out the demographic information, which we've seen be successful in some instances. There's also synthetic data: a data set you create that has the fairest, most equitable characteristics possible, which you can then generate, essentially duplicate, at scale. But this is one of the fundamental challenges with so much of our technology: if we're trying to create a better future and our past is problematic, especially from a data standpoint, how do you do that, when everything is fueled by data?

Thank you, this has been so interesting. I was wondering what you think about algorithmically promoted content. I'm thinking about recommended videos on YouTube or, more recently, the For You page on TikTok, because social media nowadays looks less like going to the search bar and looking for your values and preferences, and more like an algorithm guessing what your preferences and values are and just promoting content to you. What's your opinion on that kind of evolution?

Yeah. So in the early 2000s and 2010s, people would say they were going to the internet to search for something; now you go to the internet to get served something. And a lot of times you don't have a choice in what you're served, or you're not conscious of the things you're being served. I don't know the exact statistic, and I can find it and send it afterward, but we know for a fact that content that elicits a strong emotional reaction, particularly rage, is at least ten times
more likely to get further clicks and engagement. So if tech companies know this, and clicks and engagement equal more money, what are they incentivized to do? They're incentivized to give you content that is going to make you angry, that's divisive, that's going to be polarizing, and this is what we see. There have been a number of studies on platforms like YouTube, Instagram, you name it, where if you keep clicking on the recommended videos, after a number of clicks you're going to wind up at QAnon. So yeah, I think the only thing we can do is be mindful of the fact that we're not searching, we're getting served content, and of what it means to actually click on something: you're feeding the algorithm. You could do your own experiment (don't actually do this), but if you were to keep searching for a couple of things and clicking, you're going to keep getting served that content, and it's going to get more extreme.

Hi, thank you so much. This is kind of my whole thing, so I'm very excited about it. My question is actually a little bit about QAnon and radicalization online, and what tech companies' responsibilities are, or what you think their responsibilities should be. Sites like Reddit go through purges; they got rid of the Braincels page that was advocating hatred and violence against women, and QAnon pages are typically shut down, kind of half and half depending on what site they're on. I'd just like your opinion on that, and on how those policies are going to have to evolve as these pockets keep popping up of people posting very violent, very hateful rhetoric and then going out and actually acting on it, with the acts being attributed to these pockets.

Yeah, I don't have an answer; no one has an answer right now. One of the things that's been talked about a lot, especially over the past five years, is a policy called Section 230.
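The serve-and-click feedback loop described above can be sketched in a few lines of Python. To be clear, this is a toy illustration, not any real platform's ranking code: the catalog, the "extremeness" scores, the recommendation window, and the taste-update weights are all invented for demonstration.

```python
# Toy sketch of the engagement feedback loop described in the talk.
# Nothing here is a real platform's algorithm: the catalog, the
# "extremeness" scores, the 0.2 recommendation window, and the
# taste-update weights are all assumptions made for illustration.

# Hypothetical catalog: 100 items with an "extremeness" score in [0, 1].
CATALOG = [{"id": i, "extremeness": i / 99} for i in range(100)]

def recommend(catalog, taste, k=5, window=0.2):
    """Serve the highest-engagement items near the user's current taste.

    Assumed engagement model: within reach of the user's taste,
    more extreme content gets more clicks, so rank by extremeness.
    """
    nearby = [it for it in catalog if abs(it["extremeness"] - taste) <= window]
    return sorted(nearby, key=lambda it: it["extremeness"], reverse=True)[:k]

def simulate_feed(rounds=20):
    """Each click nudges the model's estimate of the user's taste upward."""
    taste = 0.0   # user starts neutral
    history = []
    for _ in range(rounds):
        top = recommend(CATALOG, taste)
        clicked = top[0]  # user clicks the most-promoted item
        # "You're feeding the algorithm": clicks update the taste estimate.
        taste = 0.9 * taste + 0.1 * clicked["extremeness"]
        history.append(clicked["extremeness"])
    return history

if __name__ == "__main__":
    h = simulate_feed()
    print(f"first click: {h[0]:.2f}  last click: {h[-1]:.2f}")
```

Because each served batch sits at the extreme edge of the user's current window, and each click moves the window up, the feed ratchets toward ever more extreme content even though no single step looks dramatic; real systems are vastly more complex, but the incentive structure is the one described above.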
Section 230 was developed in the '90s as a legal carve-out so that publishers, primarily newspapers and the media, would not be held liable for content they published as long as they did their best to ensure it was accurate, et cetera. Are tech companies newspapers or publishers? Tech companies up to the present day have been protected by Section 230, which gives them broad legal immunity from liability for anything they host on their platforms. There have been lots of questions about overturning Section 230, and then concerns about that chilling speech and not allowing for the democratization of information, so it's not a straightforward thing. But this is actually one of the things the Shorenstein Center and the Belfer Center at the Kennedy School have been researching; we've developed an index of over 150 different proposals dealing with Section 230. What we could actually do, we don't know, but this is an active debate in Congress right now, because who's responsible? It's unclear, it's really unclear. Yeah, good question.

Hi, thank you so much. I have a sub-question to the question of what the role of government is in regulating social media and technology in general. You mentioned how in the U.S. it's

the case that everything not forbidden is allowed, and in Europe everything not allowed is forbidden. How do you think this difference in government outlook affects that question, and why do these differences exist, basically, in terms of what can be done in the future?

You'd have to ask the founders why it is, but I can tell you a very concrete example of how it manifests in different policy: GDPR, the landmark data-privacy legislation passed in Europe, which prioritizes data privacy, making sure you're given appropriate notification if data is being kept, that you're asked for consent, and that after a certain period of time the data is deleted. That's fundamentally different from the United States, where it's "did you read this 60-page user agreement?" that no one actually reads, and we sort of trick you, and you click accept. Fundamentally different. The closest thing in the United States is the California Consumer Privacy Act, the CCPA. That's probably the closest thing, and I feel like maybe they had to do something because so many of these companies are based in California, but that is the most clear-cut example of how we see these different ways of thinking about policy and legislation in action.

Hi. I know we already discussed the issue of people in Congress and lawmakers knowing about technology, but I study computer science here, and I'm also acutely aware of how programmers and developers are very unaware of the political-science side of it. So I'm curious what you think about potential ways to close that gap on both ends. I also just think it's really important, because it feels like government regulation will always be slower than the innovations, so if developers know about this, that seems more effective than developing something really fast and then reacting to the legislation later.

I'm so glad you brought that up, because I'm usually the person
saying, "poor government, they're trying really hard; tech people, you're going to need to be careful." There are lots of organizations beginning to bridge the divide both ways. On the Congress side, making Congress smarter, there are things like TechCongress: if you are a technical person and you want to get expertise and experience in government, look up TechCongress. It's an opportunity to be a fellow placed in a congressional office or on one of the committees, using your expertise for real-time policy. On the other side, a good friend of mine, Kathy Pham, and several of her colleagues founded an agency called the U.S. Digital Service; there's another one called 18F, which is related, and one for people right out of college called Coding It Forward. All of these organizations help technical people, or people from Silicon Valley, come into government and get that experience. Do we see more movement that way, or the other way? Unfortunately, this is why I spend more time talking about technical people coming into government: the path of least resistance is "I'm a technical person and I'm going to go to a tech company, because I'm going to get paid like $200,000 and get my food delivered to my door." But there are lots of new programs starting that help bridge the divide. There are also other programs to help inform policymakers on some of these issues, such as a series of roundtables where policymakers can have discussions and convenings with technical folks to understand things at a more granular, plain-language level. But I think the other point is that the technical folks need to have government experience, in addition to policy people knowing about tech.

A lot of people discuss the divide between free speech and hate speech, and how you regulate hate
speech without cutting down on free speech. The Economist mentioned that free speech isn't just talking, it's about listening and being heard, and I was wondering whether you've thought about other ways of characterizing that conversation, and how they could be applied to regulating what gets posted on social media.

There are very clear legal definitions around what constitutes hate speech. The problem is less about whether there's an actual legal precedent around these issues than about how it's then enforced at scale across all of these different platforms. So you could have things on these platforms that are legally considered hate speech, but the corporate governance structures and policies in place at these institutions either won't catch them, or it's happening at such scale that they don't know what to do about it. We're literally seeing this on Facebook. So many things that border on what could legally be hate speech, or that incite violence, have happened and then been taken down, as on Twitter prior to Elon Musk; but now he's reinstated all of those accounts, and we're seeing a lot of hate speech, or things that are sort of between the lines, flourishing on those platforms. Government doesn't really have any power to intervene, because this lies completely within the corporate governance structure. So I don't have a good answer on exactly what we need to do about this, beyond the fact that it's a huge, unaddressed issue when we think about corporate governance and corporate policies.

Kind of like Deborah, I'll own up to being tech-illiterate. I think most people don't know how any of this works, and we're kind of hardwired to believe we have to use it. I don't think there's any way to opt out of tech, and it seems like more and more
government interests seem to overlap with how tech companies see data as well: you have the government spying on people, surveillance, crowdsourcing, back doors, and all of these things, and it seems like maybe we have to be more pessimistic, because the government might not want to get involved and clamp down. So as a humanities person, it makes me think we have to reinforce our legal structures to try to enforce more privacy protection and things like that. Do you think it's now all on civil society and users, or do you really think the government is, in the long run, going to try to keep up and actually stop the abuse of people's data?

Tell me a little bit more about why you feel the government has the same interests as tech.

I think just in the mass collection of data and in surveillance; that's my perspective on it, between the fact that in physical spaces they're trying to get more eyes on the ground, and they're using social media to track people down and see what they're saying on the internet. And in other countries, not in the U.S., maybe they're also using it to target the opposition; they're using bots to increase their own presence on Facebook, as is happening in my home country. So in my mind I don't really see them as completely separate: the government is trying to keep up with data, and tech is pushing ahead too fast, and in my head there are some gray areas where the government is not really that interested in shutting down the negative outcomes of a lack of privacy.

Yeah, really good question, and it varies so much depending on what country you're talking about. I've worked in countries ranging from Lithuania and some of the northern European countries that are considered among the most proactive in thinking about e-government and government services, to Sierra Leone; I went to Sierra Leone right after Ebola to think about services and how we can understand what's happening on the ground, and they
have very different incentives and very different ways of using technology, so let me caveat everything I'm saying with that. Even so, even the most advanced countries using technology don't hold a candle, don't hold a candle, to what the tech companies have, what they're capable of, and the kinds of information they have. Not even close. And there is almost zero accountability or required transparency from tech companies compared to government. So yes, I can see it in some situations; I've done a lot of work in China, and that's a whole different landscape, where government has very, very different incentives for surveillance. But generally speaking, generally speaking, the real concern is actually the tech companies, with their unchecked, zero-accountability kind of power. And if, as an unintended consequence or side effect, government is able to utilize some of these technologies, that's an issue in some places, but I haven't seen huge pushes, maybe barring China and Russia, to develop the tech itself for mass surveillance. My bigger concern is the tech companies. And then to your later question: is government ever going to get to a place where it just halts tech development? Of course not. Never. That's never going to happen, and this is why all of these different constituencies across the world need to be thinking about their part in this, whether it's civil society or government, but also: what is the responsibility of the companies themselves, and what can we do as users? We can decide to use these things or not; we can decide to share something or not. It might seem like not very much, but the difference since I've been in this space thinking about tech regulation, in only 10 years, is striking. When I was talking about this stuff around 2012, people were like, "you are negative, you are so pessimistic," and now people are more open to thinking about
this and really being more mindful about the tech they're consuming and the platforms they're using. So I do see a shift; everyone has a role in addressing these things. Thank you.

Thank you. I know we talked a lot about AI legislation and how that's obviously very important to our future, but I've also read that sometimes these issues are over-sensationalized or over-represented in Congress. So I'm wondering, in your personal experience, is that true? Or is some other technology that's not as often in the news, like a lot of biotechnology, also brought up in Congress just as much but not reported on? Or is there a lack of representation in Congress as well as in the media?

Yes. Do you have a specific example in mind that you feel was over-sensationalized, and something that wasn't?

I remember that when the AlphaGo results were first published, it was one of the huge, huge deals, and it was everywhere, and I think there followed a big wave of attention focused on AI, as there should have been. But at the same time, I think robotic surgery was also taking off, and that didn't get as much attention.

Yes, absolutely. One of the things the late Secretary Carter said when I worked with him is that the sleeping giant is synthetic biology. We had three areas of work at the Technology and Public Purpose project: digital technologies, primarily AI; the future of work and the labor market, in light of all of these new advances in technology; and the third was biotechnology. Around the time everyone was freaking out about OpenAI and AlphaGo and that kind of thing, there were gene-edited babies born in China. It sort of made the news, but then it really died off, and what Secretary Carter was always concerned about was the synthetic-bio space, because it's so
much more technical in a lot of ways and gets less coverage, so it's easier for it to keep being developed while people are really not aware. The other thing we focused on was not gene editing but geoengineering. David Keith is one of the leading climate scientists in the world, at the Kennedy School, and he's been working with NASA and some folks over at MIT to think about experimenting with our climate through geoengineering. That's a huge deal; there are serious consequences to that, and you probably won't see it on the front page of the news.

Excuse me, sorry. I work in quantum, and that's a big buzzword right now, and there's actually already talk of fear of a "quantum winter," like what happened after the dot-com boom in terms of a huge contraction in development. To me that seems like a misunderstanding between what's possible with the science and what entrepreneurs or the media would like to get out of it, or harvest out of it, or think they're hearing. What do you think the solution is there? Because losing a decade of funding for scientists and engineers is a crisis, and I think also a setback for society. What do you think we can learn from that, or what are ways to prevent it?

When we think about the business model for news, period, whether that's a traditional newspaper or Facebook, it's "let me get the most engagement," and so it's whatever headlines are going to get you the most engagement. For a long time it was killer robots and sentient AI beings, and we knew that was just never the case, but there was an incentive somewhere in the supply chain for that to happen. This is one of the reasons, again, that I'm happy to be coming to the Shorenstein Center. I have no background in media, just to be clear, zero background, and I said that in my interview. But I said that all of these things are converging in really unprecedented ways, and we need people who
have experience in policy, experience in tech, experience in the media, and also in deep areas of science like quantum computing, to come together and think about what's in the best interest of the whole community, instead of "hey, how can I get some extra funding, because this isn't in the New York Times five times a week."

How do you battle the $250K being offered by Google? How do you incentivize people?

Yeah, if I had a good answer for this, the whole world, or at least government, would be in a fundamentally different place. I could ask the same question of someone who knows for sure they want to go to the Peace Corps, or knows for sure they want to work at a nonprofit, but is being dangled an investment-banker salary. It's just really, really difficult. And the plea that folks from USDS or Congress or any part of government, or even nonprofits, make (let me talk about government first) is this: if you want to make an impact in the world, there is no greater institution for you to do that at scale than government. There isn't. So it's a choice, and in many ways it's a sacrifice, but if you want to make a change, that's probably your closest route to doing it.

I had a question kind of relating to your mental-health background. One way that social media sites combat misinformation is by censoring certain words online, certain controversial terms, but sometimes this has the effect of muting balanced and well-informed conversations about those topics, and I was wondering what your thoughts are on that.

That's a huge problem. In fact, at the Belfer Center we had a fellow whose entire project was about how, for the LGBT community, an overwhelming majority of content that is completely unproblematic, that is just informational, gets taken down because it's tagged as pornography.
We've seen the same thing multiple times with information about breastfeeding: the percentage of times that gets taken down compared to other content is significant, because they just don't have enough human content moderators. So it's a huge issue. This is a big problem in the mental-health space, because it's a good thing but also a bad thing that mental health is only now picking up steam in being very accessible on digital platforms. It's great because it is accessible, but there's a ton of mis- and disinformation out there: people claiming to be a psychiatrist or a therapist, telling you not to take your medicine, things you really need some kind of clinical background to be doing. So this is a huge problem; it's going to take a long time for the mental-health field to catch up, and for the tech platforms to catch up and get used to mental-health content on their platforms. This was one of the reasons I was so sad when the takeover of Twitter happened, because all of the people who had any kind of expertise in mental health were fired.

Okay, well, we are at time. Thank you so much.

[Applause]

2023-01-21 11:44

