Privacy and Democracy in Technology



Thanks so much, everyone, for joining us. We're very excited to have this panel today, sponsored by ACS and the new LawTech Center, and we have three stunning panelists. We'll start with Megan Gray, here on the right, a principal at Gray Matters Law & Policy. She's an advocate for entrepreneurial companies at the intersection of free speech, privacy, online content, and competition. She is formerly general counsel and vice president of public policy at DuckDuckGo (she brought some shirts, for those who are interested) and an attorney at the Federal Trade Commission's Bureau of Consumer Protection, Division of Enforcement, where she was lead counsel in the first consumer privacy civil penalty case. To her left is Professor Danielle Citron, the Jefferson Scholars Foundation Schenck Distinguished Professor in Law, the Caddell and Chapman Professor of Law, and the director of the LawTech Center here at UVA. She's the author of Hate Crimes in Cyberspace, and in 2019 Citron was named a MacArthur Fellow based on her work on cyber stalking and intimate privacy. Her upcoming book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, will be published next fall. And joining us on Zoom, we're very excited to have Rachel Levinson-Waldman, deputy director of the Liberty and National Security Program at the Brennan Center for Justice. She leads the Brennan Center's work on social media surveillance by law enforcement bodies and is active on broader issues related to policing and technology, including body cameras, predictive policing, facial recognition, and information collection in the immigration context. Her most recent work includes the groundbreaking release of documents exposing the LAPD's broad social media monitoring practices. Prior to joining the Brennan Center, Levinson-Waldman served as counsel to the American Association of University Professors and as a trial attorney in the Civil Rights Division of the Department of Justice. Can we all give them a warm
round of applause?

Rachel, we're in fact pretending that you are with us; we've scooched our chairs so that it's as if you are sitting next to us.

That is very kind. I wish I were there in person, so this is the second best.

Now, to start, maybe we can go in the order I introduced you, and you can each elaborate a little on your relevant positions and priorities as advocates before we move to a discussion of some contemporary privacy frameworks and changes.

Sure. First off, the DuckDuckGo merch on the table: there are books, stickers, and shirts in odd sizes, because it's the remnants of what I had, but we've sent boxes to Max, so you will have your assorted sizes; just check in with her later. My background: I went to the University of Texas Law School and simultaneously got a master's degree in public policy. After that I moved to Los Angeles, where I worked in Big Law at the dawn of the internet age, and I did content moderation, privacy, anonymous speech, just about everything. That perspective has been incredibly valuable as I've moved through my career into civil rights advocacy, privacy advocacy, government, then in-house, and then my own firm. I'm very fortunate to have seen so many different angles. The part that interests me right now, with privacy and how to deal with the toxic environment and problematic speech we're all aware of, is that I'm concerned we're focusing on the wrong thing. It's like being concerned about privacy and then getting cookie banners: that has done nothing. It has slowed down the cause of fixing the online privacy dilemma by spending so much time negotiating, enacting, and implementing these dumb cookie banners, which have solved nothing. My concern is that the problems we're trying to solve for now seem, in my mind, to be going in a similar
misdirection that's going to slow down what we're actually trying to accomplish.

May I ask a follow-up? What's the misdirection? We have some myths and misunderstandings around 230, and I'm happy to give everyone a little primer on what Section 230 is and what it was meant to accomplish, but it would be interesting to know at the start where, in your view, we're being misdirected.

Sure. I think the misdirection is that we're trying to get rid of speech we don't like: amplification, highlights, toxic Facebook groups. I understand why we don't like them; I don't like them. But if you're going to be practical about it, we're not going to change the First Amendment. So there's only so much we can do, because that is what a democracy is: it's going to have some really ugly people who talk. That's not going to change. I'd rather see more focus on what we're really trying to grapple with, which is that we are inundated with this. There is no other place to go; we cannot escape from it; it drowns out healthier conversations. How do we get to a world where we're not completely submerged in a sea of hate?

I think that's really helpful to start us off and frame our conversation, because, as you might imagine, I'm going to disagree with you a little bit about what we mean by the concept of speech. But just to start us off: Section 230 is a federal law that Congress passed in 1996 as part of what was called the Communications Decency Act. Let's pause for a second on "decency." It was an anti-porn statute; believe it or not, we once thought we could get rid of porn on the internet, a useless proposition. Most of the statute was struck down as unconstitutional, and rightly so, but what remains is a part of the statute titled "Good Samaritan" blocking and filtering of offensive
speech. The idea was this: Ron Wyden and Chris Cox, then two members of Congress, got together because a district court (a trial court) decision really upset and disturbed them. It found that defamatory content hosted by an early internet service provider exposed it to strict liability essentially as a publisher. So Cox and Wyden wanted to encourage responsible content moderation. They wanted the providers of the internet, the platforms they couldn't imagine then but that were coming, to have a legal shield for engaging, as Good Samaritans, in blocking and filtering of offensive speech. They also recognized that if you took things down, you had to act in good faith. So this is the provision we're talking about, at least so many of us are. And Congress woke up to this issue after 12 years of doing nothing; all of a sudden it's the thing everyone wants to talk about, and there are a lot of myths and misperceptions about what Section 230 does, what the problems are, what the pathologies are. So you have folks say the problem with Section 230 is that it enables the filtering of speech, "my speech, and that really pisses me off; it's censorship," even though we're talking about private companies, so it's not censorship in the classic First Amendment sense. On the one hand, then, one complaint about Section 230 is that it enables the censorship, the removal and blocking, of too much speech: too much speech that I like. On the other hand, there are those who criticize Section 230 for enabling the under-filtering of troubling speech and activity. So on the one hand, Section 230 is a gift, because for the past 25-plus years it has given us the Arab Spring. Section 230 is why platforms like Twitter didn't crack down when we saw bravery on the streets, whether in Cairo or throughout the Middle East. At the same
time, Section 230 is why we have revenge porn sites whose raison d'être, whose business model, is the tormenting of individuals: the publication of their nude photos without consent, often with their names attached. What happens is victims go to these site operators; there are over 9,500 such sites, and they're located in the United States for the most part. Why? Because we allow them to operate without any risk of liability. The most notorious revenge porn site, Anon-IB, was shut down by the Dutch police. Where did it set up shop again? Las Vegas, alive and well and tormenting individuals. So Section 230 gives us a lot of great things, but it has also given us not just troubling speech, not just hate speech (which is fully protected by the First Amendment), but online assaults that drive people offline: cyber stalking, that combination of defamation, privacy invasions, and threats of violence that silences people. Our studies at the Cyber Civil Rights Initiative show that the costs of online assaults fall on the shoulders of women and minorities, often gender and sexual minorities, and chase them offline. They make it impossible to live and work in an age in which Google is our CV. So there are speech costs that I think we don't wrestle with enough, and I want to change Section 230; I'm happy to talk about how I think we should change it. But I'm so glad you framed this for us. You're saying there's troubling speech, and maybe the way to get at it isn't changing Section 230. I don't think we're just talking about First Amendment speech, though; we're talking about all activity online, which isn't just free expression and protected speech because it's ones and zeros. We have such a great faculty here; oh my gosh, where are Fred Schauer and Leslie Kendrick when we need them? There's so much speech that we think doesn't come within the
protection of the First Amendment. We know there are crimes that are made up of just ones and zeros. And to be clear, it's not all speech: Section 230 even protects the sale of guns, and defective products, which is absurd. Okay, I'm going to stop there and let Rachel in; you can tell I have strong opinions about this.

But to be clear, it doesn't protect the sale of guns; it protects the companies who are hosting.

Right, but they are getting a cut of the sale of guns, and that's absurd.

Okay, glad we agree on that one. Rachel, we're ready for you.

All right, thank you. Let me, as Megan did, give a little background on who I am and the work I do at the Brennan Center, and then I can bring in my angle on social media, which is a little different. As you heard, I'm deputy director of the Liberty and National Security Program at the Brennan Center. For folks who aren't familiar, the Brennan Center is an organization based in New York and affiliated with NYU Law School; we've been around for about 25 years, give or take. I'm in our DC office. We have a 15-to-20-person DC office, and New York is now probably 125 people. I always describe the Brennan Center as focused on shoring up the systems of democracy, whether that's access to the ballot, campaign finance reform, an independent judiciary, or ending mass incarceration. Our program, the Liberty and National Security Program, hasn't been around for quite as long as the Brennan Center itself, but historically we've been focused on post-9/11 civil liberties issues: privacy, secrecy, domestic intelligence gathering, over-classification, Islamophobia, a broad range. Those are still very much our bread and butter. But over the last seven or eight years, our footprint has expanded to also include issues related to policing, technology, civil liberties, and
civil rights, in large part coming out of a recognition that many of the Fourth Amendment issues, and the privacy and First Amendment issues generally, that we were seeing in the national security space transfer very directly to the policing and technology space. A lot of the technologies themselves originate in the national security sphere and are brought to domestic policing. Often the funding or justification for police technology also comes out of a national security or counterterrorism footing, and then the tools are most often put into service in the war on drugs; the justification is that we need this high-tech technology because we're going to be engaging in some kind of counterterrorism. As part of that shift to opening our aperture to policing and technology issues, maybe three or four years ago I started looking more at the use of social media, starting with how police departments were using it. Not in terms of outward communication or education, although there's a lot of that ("we're looking for this person," or "we're hosting a block party on this date"), but really using social media to gather information. That could be as part of criminal investigations, as part of situational awareness, or as part of general tracking and monitoring. It's used a lot to make determinations, often incorrect, about gang affiliation and gang designation. I started looking into this coming out of work we had been doing generally on surveillance technologies in public spaces, and also on predictive policing programs, and seeing how these came together in a lot of ways to produce this online-but-in-public surveillance of social media. And of course there's another piece of social media surveillance that's somewhat less public: using undercover accounts to connect with
people online. We've also done work on how schools and school districts are using and collecting social media, how they're using third-party social media monitoring tools to surveil students, and, increasingly, how social media is a really big source of information in the immigration and visa vetting space. As with many policies, that kind of collection and use was initiated under the Obama administration but was weaponized and force-multiplied under the Trump administration, with a lot of that coming out of the Muslim ban. So, as I say, mine is a bit of a different gloss: it's less about the content moderation or content regulation aspect, although certainly there are intersections, and more about governmental access to and use of social media. We can get into more detail, because there's a lot to say, but very briefly, a few strands come into this. One is the constitutional aspect: whether there are Fourth or First Amendment arguments to be made about surveillance of social media. There's also the general policy argument, especially as policy evolves and as constitutional doctrine has evolved on privacy protections in public. As probably most folks know, even early in law school, this is an area where the Supreme Court has really shifted and has started to think about what privacy in public looks like, and I think there are arguments that that applies in some ways, and doesn't necessarily apply in other ways, to social media. In the meantime, there just seems to be vast, widespread use of social media by police departments, and I can talk more about our research on that. It's really a burgeoning and growing area.

Great. Piggybacking off that, before we sort
of open up into 230 more generally, I'd love to hear all three of your thoughts, starting with you, Rachel, on what impact, if any, January 6 of last year had on privacy. Are we seeing a chilling effect on the platforms that members of the far right and supremacist groups are willing to use? And what role are government versus private parties playing in restricting, or not restricting, privacy for the groups we saw involved in January 6, versus the groups you studied after 9/11?

Sure, that's a great question. Probably needless to say, the aftermath of January 6 has in so many ways been a major topic of conversation within the Liberty and National Security Program, very much including these questions around use of social media. To set the landscape a little, both around the lead-up to January 6 and what we've seen in the aftermath: as people probably know, it emerged that there had been a lot of planning online for January 6.
It should not have come as a surprise to DHS, to the FBI, to any number of law enforcement agencies, and in fact it doesn't seem that it did. It came out after the fact that there was an FBI memo from the field office in Norfolk, Virginia, and there was certainly a ton of chatter online, including by people who had previously been involved in criminal activity. So this isn't just a matter of "well, people are talking about protests." I would not endorse the idea of monitoring social media to find out about discussions of protests. But there had been a lot of criminal activity at state capitols, and among the people who had been involved in leading it there was a lot of chatter about the lead-up to January 6. As we all know, there was a woeful amount of under-preparedness going into that day. One of the things that came out of that is that DHS especially, which has been very involved in social media monitoring, wanted to either launch a new initiative or at least expand the social media monitoring work it was already doing. This is in addition to the FBI talking about what sorts of authorities it has with respect to monitoring social media. There was congressional testimony fairly soon afterward in which an assistant FBI director suggested that the FBI is limited when it comes to monitoring publicly available social media. That's really not the case: under the FBI's governing guidelines there are actually very few limitations on use of social media, at least if you're talking about what's publicly available. It's certainly possible there are additional practices or norms in place, but there's a piece that an intern of ours
and I published in Just Security earlier this year really digging into what the practical rules around the FBI's access to social media actually are, and the answer is they can do a lot on social media even before any kind of criminal investigation has opened. At DHS, what started to come out in reporting, and then in disclosures from the department itself, is a lot of social media monitoring through an office called Intelligence and Analysis, or I&A. The social media monitoring initiative coming out of I&A is focused on narratives, and DHS's sell behind this narrative focus is: we're not focused on individuals; we're not trying to do broad-based social media monitoring to identify specific people. Although certainly down the line that could be part of it: if they actually identify overt criminal planning, information could be shared with the FBI or with state or local law enforcement. What they say is that they want to look for, basically, narratives that could lead to violence.

Rachel, when you say that, are you talking about pattern-spotting, in the sense of how much the chatter has increased and whether we're seeing certain themes? Is that what you mean by narratives?

That's a good question. I think there is probably a lot more to learn about what exactly is meant by narratives. In conversations we've had with DHS officials, which were not off the record, their description of the narrative monitoring has been that it's not just an increase in volume. You could almost think of volume as metadata, where you watch the hills and valleys and say, "oh, there's a hill, something's going on." It's actually more content-based than that. The idea would be: we're seeing chatter about how a Jewish community or a Latino community
in this city is really organizing around welcoming Afghan refugees, and this group of white nationalists or white supremacists over here is angry about that, and there's increased conversation about these particular efforts and this particular community. So the way it's being couched is really more about understanding what the threats are, not necessarily going after the people issuing them (although obviously that could come into play), but pursuing protection for the communities that are potential targets. Before we get to the next thing, I think it's really important to add a gloss and a perspective with a civil liberties and civil rights lens. I am certainly sympathetic to this in general, given where we are post-January 6 and given what seems like a rise in white nationalism, both in general and, frankly, within law enforcement; one of my colleagues, Mike German, has done research and writing on how embedded white nationalists are within some law enforcement departments and agencies. But there are real questions here.

On Max's question of whether privacy has changed following the January 6 events: I don't see any change, and I'm not sure that's a good thing. I don't see more privacy; I don't see less privacy. Other than a bunch of teeth-gnashing, I haven't seen any actual changes in how we're approaching this. And as for the people who were advocating for the overthrow of the government, the Stop the Steal crowd, I haven't seen them go underground, which would be the smart thing to do. Going underground is very hard to scale, which is why there was an initial foray into other social networks, but that seems to have stopped, and they're all back on Facebook and the like. So I don't know; I haven't seen any.
I'm curious if folks have a different perspective.

I'm going to join you, Rachel, because I think the question is who we are watching. In your work throughout, and in our joint interest in fusion centers, the question is who we are watching, not just that we are watching. Whatever the increase in the aggregate amount of watching, I think it's fairly clear (and Rachel, tell me if you agree) that we continue to overly watch communities of color and Muslim communities; we haven't changed that. Fusion centers have been busy watching and arresting Black Lives Matter protesters. There's that interesting Oregon case, Rachel, which I'm pretty sure the Brennan Center has some great posts about, where we're overly watching First Amendment activities. The Privacy Act says you can't store that information, except of course for law enforcement purposes, which maybe just changes the whole dynamic: you can do it. So I guess that's your question, right, Megan: are we doing more of it, or is it the same? It's a whole lot of pervasive dragnet surveillance, but within the dragnet we tend to watch, and have long watched, with significant civil rights and civil liberties consequences, communities who have long been watched. So can we put that to you, Rachel? I didn't want to cut you off, but I thought it would be interesting.

No, no, and I certainly don't want to monopolize the conversation. I think part of the answer to how privacy is being impacted is that we don't know. I have no idea how broad the scope is of, for instance, who DHS says it's watching. The description of their authorities suggests something very broad. What they have said in conversations is that they're actually focusing kind of narrowly in
terms of the kinds of platforms or the kinds of sites they're looking at. But there's no inspector general report yet, and no internal report from the privacy officer or the civil rights and civil liberties office. The one thing I was going to add is that our concern always is, because this is borne out over and over again in history, to Danielle's point, that regardless of what the justification for the monitoring is, it absolutely will end up falling on activists and on communities of color. Today, and going forward for a long time, the focus is white supremacists, but authorities, both legal authorities and surveillance capabilities, always shift to target Muslim communities and communities of color.

You're not monopolizing a thing, Rachel; we love this.

I'm curious, then, to follow up on Section 230: where do you see the biggest harm? Is it private actors, is it government, is it the intersection? And is it what's being done, or what's not being done, that is the biggest threat to democracy and privacy today?

I wish you were still at DuckDuckGo, just saying. I'm glad you're advising companies, Megan, but you were our privacy protector in the private sphere.

Well, I didn't go to work for Google or Facebook.

That's what I'm saying: one of the few who did not.

On the question about 230: Section 230 addresses civil liability against private companies (federal criminal law is carved out), so it doesn't have any impact on government actions; it doesn't shield, say, a police department, it just protects private companies. Now, in terms of how 230 has or has not impacted democracy, there is a very good case, which Danielle and others have made, that by virtue of having 230, companies are incentivized to allow speech that gets people excited about Stop the Steal, which foments insurrection, which undermines confidence in the election. That is
absolutely true. But that's the nature of free speech: you can have conspiracy theories that gain traction and adherents. It also leads to viral amplification of things like the MeToo movement and BLM, which you wouldn't have had pre-internet, when activism was primarily confined to pockets that were very hard to spread throughout the United States, when the printing presses were controlled by just a few companies. So the question for me is not whether there are problems on the internet; it's what we are going to do about them, and whether there's a way to do something that doesn't entail throwing the baby out with the bathwater. I think that is a much harder thing to do, because if you dislike Stop the Steal but you like MeToo, it is very hard, in terms of regulation, enforcement, and legislation, to distinguish between the two. And if I have to take the Stop the Steal crazies along with the MeToos, me personally, I think that is better for society. There is an argument to be made on the opposite side; I just don't happen to hold that belief.

A quick follow-up: do you see, in either the government or private parties, any distinguishing between those two? Do you think they treat Stop the Steal differently from MeToo, or is that all outside of 230?

Take Twitter, for example. I think you have now seen some of these large companies decide on their own, without any government regulation or enforcement, that they are not going to be free speech bastions; their corporate values are such that they're going to censor Stop the Steal and kick off Donald Trump, and that's going to cost them some money and some market share. That is a choice they have made. You'll see other companies make the opposite choice, and that's what you should see: you should
have companies with different ways they want to operate their business. It comes back to an issue we haven't talked about: competition law, antitrust enforcement, and monopolization. I think it is very short-sighted to talk about privacy and online speech without bringing that into the conversation, because it ultimately is one of the levers that can help solve some of these issues. Not solve, but ameliorate. If you had a privacy law, and if you had a greater number of platforms to choose from, all of that would help even things out and stop this runaway train of toxicity that we have right now.

Do you want to weigh in?

Sure; you know I'm not shy, and I feel like my students know what I think. You're right that if we're going to solve for any of this, we have to be talking about privacy legislation, without question. Amending Section 230 in the way I'll describe is not going to solve the woeful under-regulation of the collection, use, sharing, amplification, and exploitation of our personal data; there's no question about that. Every time I give a talk, I say, yes, Section 230, but we should be talking about privacy legislation, and then staffers look at me like, "no, that's too hard." Section 230 can be hard too. But I don't want to get rid of it. You said we don't want to throw the baby out with the bathwater, and I think that's right. I don't want to get rid of Section 230.
But I'd like to condition the part of the statute that relates to the under-filtering of content, Section 230(c)(1). What I would do is keep the immunity, which of course applies only to providers and users of interactive computer services, to use the language of the statute. So you have this immunity, this legal shield, so long as you're engaging in responsible content moderation practices, including around amplification, vis-à-vis clear instances of illegality: illegality under the laws on the books that causes serious harm. What that's going to do is say, look, you Good Samaritans, you get it; act like Good Samaritans. I've argued for this reasonableness standard to be addressed in the courts. Ben Wittes and I have worked on it, and I first wrote about the proposal in 2008, when it was not popular, I have to say. I presented it at the first Privacy Law Scholars Conference, and my friends, including serious civil libertarians who are also privacy friends, said, "you want to jail communists? I'm not talking to you anymore." And I was like, huh, Michael Froomkin, what are you talking about? He was very mad that we would ever touch Section 230. So it is kind of the sacred cow.

We think of it that way; it has been valorized, and it is very weird, I will say, this sacred cow of 230. This may come off as odd, but I also think we didn't need to have 230. I think there was enough First Amendment precedent and case law that the one case that motivated it, and the other, stupid, decision, probably would have come out the same way. So it is a little odd that we're putting so much emphasis on 230, and I'm not sure we needed it. It did allow some space for new companies to save on litigation costs, and that again gets back to the antitrust point. Whatever changes you make to 230 may solve the problem we have with Facebook and companies like it, but I'm less worried about that; I'm much more worried about what the changes will do to emerging companies.

But that's why reasonableness is such an effective tool. What's reasonable for a platform with five billion users, let's just take Facebook, is different from what's reasonable for a startup. But suppose that startup hosts really destructive content, including child predators it knows about and has notice of. The value of 230 is that you don't have to litigate; it's a motion to dismiss. Too bad, so sad, friends: there's a whole lot of harm on the table that has been externally borne by pretty vulnerable people who've been stalked.

But the outcome is that the emerging companies who don't have the resources to litigate whether something is reasonable will be unable to grow and compete, and they're going to have not very attractive business models and platforms, while the Facebooks of the world will.

But wait: the startup that has investment from big VCs? I'm not going to cry a river, I have to say, over the startup that's not
well-funded because it's not a great business model. Does that make sense? I think we say "startup" as if they have no funding, like it's just two kids in a garage. Right, like they're not all... No, I'm not suggesting that they are all funded by the Kleiner Perkinses of the world, but nonetheless, you can do a whole lot of harm with just two people. And so I think that if you had a standard that required the internalization of some of these costs, creating a reasonable approach to illegality, to what is not protected speech, the kind that shoves people offline whose speech costs we don't account for, then yes, you have to have some litigation costs. That's called business, literally. The idea that there are no litigation costs for companies is a frankly absurd notion, as we sit here at the University of Virginia Law School. In most of any industry there are litigation costs, because when you do harm, and not to valorize Holmes, but I don't mind, when you do harm, you wreak havoc, you have to internalize those costs. What is the illegal speech that you're thinking of? Threats, cyber stalking, violations of intimate privacy, facilitating child predation, just to take some examples. So the illegal speech is the underlying speech that you're worried about, not the hosting or facilitating? I'm worried about the enablers of... But enabling itself is not illegal. Well, it depends; it could be a tort, like the negligent enablement of crime. Kline, who loves that case? I'm sure I have a lot of takers in the audience. No? Landlord-tenant, negligent enablement of crime, people, come on, friends. But there are torts. So you're right to point out that not all speech is liability-inducing, and so in some respects some of this oh-my-god, hair-on-fire Section 230 freakout rests on a presumption that everything is strict liability, that is, that platforms will face strict liability,
and that's totally wrong. Take the first-year classes, right? There is no strict liability here. So I also think it's an overestimation of the cost, though I take your point really well that a lot of what happens, though not everything, is plausibly related to some forms of speech, and by speech I mean legally protected speech. So that's true: we're going to have error costs under a reasonableness standard, and I think those error costs have to be weighed against the lots of speech we already lose under Section 230 as it exists today. So, if I can ask a question, and I should be very frank that I am not a Section 230 expert in literally any way. We've certainly done a fair amount of work on content moderation, but in a lot of ways we've steered clear of Section 230, so this is really just to ask a question, speaking to what both of you were just saying. Dialing back the Section 230 protections could then... I guess let me rephrase. Danielle, you are making the point that we're not in a neutral space; there is already speech that is being lost. So we can't say, well, we're in this fine neutral space where there are Section 230 protections, and thus all that should be out there is out there, and it can battle itself out in the marketplace of ideas. And I think the concept of a marketplace of ideas, and whether that still has any real force or salience now, could probably take up the whole hour itself. But I guess I'm wondering how you account for the fact that it's going to push the other way too: there are some kinds of speech, presumably, that would come back in, and then other kinds that would be pushed out, in part, I assume, because platforms would become much more enthusiastic about their regulation. That seems like
that's always been one of the concerns, that they would draw those boundaries far too widely. And obviously they can now draw boundaries wherever they want; that's sort of one of the beauties of Section 230, that there's actually a lot of moderation they can do. I guess I'm wondering how you would see that playing out, or whether there are ways those concerns are mitigated. Partly, some horrible, abusive speech is lost, great, no big loss, but what about the other speech of value that ends up being dragged along with it? I think those are the nuances and the trade-offs that any of this has to address. And, Danielle, correct me if I'm mischaracterizing you, Danielle believes that bringing back into the public sphere the voices of the marginalized groups who have been silenced, who aren't able to take the place they should have in society and vocalize the speech they should be vocalizing because of the harassment and bad behavior and illegal acts that they suffer, is more valuable than the overcorrection that will inevitably happen for some of the speech that is currently there. And I just make a different value choice. But I think both perspectives have value; I'm not sure that there is a right and a wrong here. From my perspective, I'm just not sure that the voices currently missing from the debate will actually become part of the debate. Even if you kick all the Nazis off the internet, you're going to have poor communities that don't participate in public debates and aren't on Twitter and Facebook for a whole host of reasons, a lot more than just the fact that it's maybe a toxic environment for them. There are a lot of reasons why they're not participating. So I'm not convinced that this is necessarily going to be the outcome, but I do feel pretty confident that
there will in fact be people kicked off the internet whom we don't want kicked off, as a result of overbroad and misplaced Good Samaritan-type systems. That's how I weigh it. Just to intervene a little bit here: when you say the costs are only speech-related, that you're not sure we're going to see marginalized people take advantage of this, that's not all that it is. If you're under assault online, you lose your work opportunities. One cannot work with a Google CV that's full of defamation, impersonations that suggest you're a prostitute, nude photos; you just can't get a job. So, as we at CCRI, the Cyber Civil Rights Initiative, describe ourselves, we are advocating for both civil rights and civil liberties online. All of life's key opportunities are contingent on these technologies that are with us wherever we go. So there are meaningful costs, and as an empirical matter we have at least some information about them; at least as to cyber stalking and invasions of intimate privacy, we do have a sense of the economic, emotional, psychic, and community costs. The broader empirical question, what this ecosystem would look like if we conditioned liability, that is, if you faced liability when you weren't engaging in reasonable content moderation practices, I suppose we'll have to wrestle with, should that ever happen. And one thing I think is worth considering: I've been convinced by Senator Blumenthal's office that maybe the courts aren't the right actor, and that if we gave the FTC enough funding, and I love the idea in this big package that we might give the FTC a whole lot of funding, it will never happen, oh please, can't we just pretend for five minutes, we might see an expert agency set forth evolving, not stuck-in-the-mud, reasonable practices in the face of clear illegality, which allows for flexibility and nimbleness and change. And
what's interesting, Rachel, to what you're saying: it does worry me when you hear someone like Zuckerberg say he welcomes Section 230 change, which honestly is nonsense, because his whole business model is likes, clicks, and shares. He didn't shut down Facebook Live when people were getting murdered right on Facebook Live. Why? Internally it's clear it was making money; there were millions of people watching the death videos. So the idea that he really wants 230 reform is truly a talking point, by my lights. But I do think, as we think about reform, we have to be really careful, as Megan has urged us, to think about the differences, and the shoulders on which reform falls. There's a big difference, and I didn't mean to disparage startups, forgive me, anyone who's interested in representing startups, all very honorable things to do, but there is of course a difference between the behemoths, the big ones, and the rest. I'm not going to cry a river for Facebook for five seconds, or Twitter, or any of these big companies. They are monetizing our eyeballs; they make money off of selling everything we do and think and see and engage with online. That's almost the entire internet, though; they've just been much more successful at it. Well, because they're advertising companies. Twitter, the advertising arm of it is a huge money maker; same with Facebook. Facebook knows what you're doing on your period-tracking app even if you have never, ever had a Facebook profile. Why? Because they're an advertising company. But you see it all over: you see it with Etsy, you see it with Yelp, you see it with Pinterest. It's very hard, at least for me, to come up with even a mini app that isn't giving in to the advertising model right now. DuckDuckGo, right, knows nothing about you. And so I prefer a privacy law that
bans surveillance advertising. But it's hard, and I'm not very confident it will happen. The larger issue is, okay, practically speaking, if we're not going to have a privacy law and we're not going to change the First Amendment, what do you do? I would love to do something about Facebook and Twitter and Google, but I don't want to hurt the lower-tier companies that are the only possibility of competition these big, problematic tech companies are going to face in the next ten years. That's where I always get stopped. And I do actually believe that Facebook would like Section 230 reform, I actually do. They have the resources. I see it as an opportunity for them to stop, or at least stall, the growth of any burgeoning companies, and for them to kiss ass on the Hill. They can, not perfectly, but they can easily implement a reasonable content moderation program, and if that's what it takes to get Congress off their back on the antitrust cases, that is a good trade for them. So, be careful what you wish for, I guess, is where I come out. If there was ever a great poking-the-bear point to pause on... I think unfortunately we are over time, so we have to conclude here. But thank you again to each of our panelists; this was wonderful. Thank you so much. [Applause]

2021-10-22 00:11

