Spotify And Rogan: How To Engage With Misinformation | The Problem With Jon Stewart Podcast


Solid podcast today. We're going to be talking about misinformation, which I'm not sure if it's a problem or not. I haven't heard much about it recently. But first, we're gonna talk about last week's episode, which was a big pig fuck. Y'all, I- ♫ I'm a professional. ♫ Pig fuck is a great term.

Last week we were talking about Spotify and Joe Rogan and some other things, and I- and I had some comments on it. I thought, "Well, geez, I'm always someone who prefers engagement." And generally the commentary back from it, I thought, was very measured. Coming my way, I thought. We essentially just sat back and waited for it to explode all over.

Amongst the, "Fuck you, I'm done with you, Stewart." "I'll never forgive you for x y z." "You're off the rails, old man, go away." I thought there was some interesting stuff if you sifted through it that was constructive. And I think some people made the point that economic pressure is also just another pressure point that you can apply to misinformation. You know, I think one of the greatest issues that we will face is that nexus of misinformation and disinformation and how do you deal with it on a corporate level, on a personal level? So I thought to myself, "Well, geez, what if I get somebody who studied this sort of thing, who is a doctor."

Who knows- who knows something about it? That's right. And so that is Jay Jurden. And so, Jay is a doctor... No, we are delighted to be joined by Dr. Joan Donovan.

Doctor, thank you so much for joining us. Thank you. Please, please tell me what is your role at the Harvard Kennedy School Shorenstein Center? So, yeah, the Shorenstein Center, I'm the research director and I'm also the director of the Technology and Social Change Project. So I've been researching the internet for about a decade. I've been at Harvard about three years now and I look at misinformation, disinformation, media manipulation campaigns, and I look at how the internet is a tool just like any other for bringing about the world you want to live in. Doctor, you are the Rosetta Stone and we appreciate you being here because this is the key to the future.

So did you have a chance to listen at all to the podcast from last week? - I did. - Oh -- okay. I did. What in your opinion did you feel like we got right, and what did we get wrong? And was the reaction to it, in your mind, expected, knowing what you know of how the internet is where nuance goes to die? Yes, and the ivory tower is also where knowledge goes to die. So we are very much- Oh boy oh boy. Nicely done, Professor.

Well. Well, the problem is this, which is that, you know, people who are experts, doctors, have a really hard time communicating and getting their message out there. And so in this moment when I was listening to what you were struggling with, there were a couple of key things that I think some definitions might help with. Not to sound too pedantic here, but when we talk about misinformation in this field, we really are talking about information that is shared where people don't know its veracity or its accuracy.

So Joe Rogan really falls into the misinformation camp of someone who's just asking questions, has some ideas he wants to hear from a range of different people, but the misinformation is never error corrected, right? Like a good editorial magazine or even a newspaper is going to have a way to do corrections that the next day you're going to hear, "Hey, you know, we printed this thing and it was totally wrong." But the internet has really facilitated this flow of information where error correction just never happens. And the way in which the- How does that differ from disinformation? That's it. That's a campaign or an operation where-

Purposeful. You have people who are purposeful. There is intent there, which gets all the lawyers really rankled because they're like, "How can you know what's in a man's heart?" But we know Rudy Giuliani was really out here to try to, you know, upset the outcome of the election. He said- he said as much. And then there's a lot of background information that helps us make sense of it. But disinformation, to put it simply, is either sharing inaccurate information for mostly political ends and sometimes finding- And with purpose.

With real, real purpose and veracity and planning and- but when we talk about, like, digital disinformation, this is different than when you would say something like, you know, the way in which we handled the question around weapons of mass destruction in the 90s where, you know, politicians really had hoaxed some of the journalists into believing a state-sponsored campaign. Right! Digital disinformation- By the way, I just want to point out very quickly. When you started talking about disinformation, Doctor, Jay Jurden disappeared from the program. He's there audio-wise, but all of a sudden his video has gone out, which is a very convenient- They don't want us to talk about this, Jon. It's what they don't want us to know.

That is exactly my point. It's in that conjuring of the "they" that we start to look around for evidence - Oh, look who's back! - of who the "they" is. Dr. Donovan, Dr. Donovan. The reason this is so funny is because I didn't know that I disappeared. So you're telling me -- I might just be, I might just be a bad actor, but not even know that I'm spreading misinformation.

I'm just, I'm just hitting retweet. But that's like the "My uncle said it on Facebook" kind of excuse, right? He's spreading this information. Right, right, right. What changes it for someone like Rogan is it's his brand. It's- controversy is his brand. Controversial conversations- Right. -is what Spotify paid $100 million for.

And Spotify wants to reject the fact that they're a publisher. And in this moment of the digital revolution that we're going through, to call yourself a platform actually means a very specific thing, where Spotify wants all the benefits of being called a platform where there's a lot of user-generated content, which creates a lot of chaos and opportunities for disinformation and misinformation, but they really knew what they were getting. All of these problems were there when Rogan was primarily using YouTube as a means to gather an audience. So here's where, here's where I run into trouble, and here's why I believe- Well, first of all, I know Joe, so I think you always grant more understanding and nuance to people you know, because you know them more three-dimensionally than what that appearance is, so. And we always demonize those who maybe feel alien to us.

So that goes right in there. I'm already guilty of a bias, right? Mm hmm. But the second part of it, where I- where I run into trouble, is the thing you just said.

You talked about how they want the benefit but they don't want the accountability, and you mentioned weapons of mass destruction, and it reminded me, like... the New York Times, right, was a giant purveyor of misinformation and disinformation. I don't know that the Times was purposeful. But misinformation.

And that's as vaunted a media organization as you can find. But there was no accountability for them. And I think where I get nervous is in the run-up to the Iraq War and in the prosecution of the Iraq War, I was... very vocal, and sometimes cursed, about that.

But the mainstream view, the New York Times, was, "They have weapons of mass destruction, they have these tubes that can only be used for nuclear war. Saddam Hussein is this, he's that." Couldn't I- couldn't I have gone down, fallen down this, if Viacom or Comedy Central had wanted to censor me, or had wanted to take me off the plat- Look, I'm not owed a platform. Nobody is. So it's not- it's not a First Amendment issue, it's not any of that. We're really, once you're in bed with a corporation,

the deal is they have to sell enough beer and you get to do what you want. But my point is, this is- these are shifting sands. And I think I get concerned with, well, who gets to decide what that-? I mean, in the Iraq War, I was on the side of what you would think, in the mainstream, is misinformation. I was promoting what they would call misinformation. But it turned out to be right years later, and the establishment media was wrong. And not only were they wrong, in some respects, you could make the case that they enabled a war that killed hundreds and hundreds of thousands of people.

And never paid a price for it and never had accountability. And just having an ombudsman print a retraction, to me, isn't accountability. So it's very easy to attack Rogan, but- and I'm not saying that's not your right or that there aren't things there to talk about. But what I'm saying is let's be careful. Because the sands can shift.

Yeah, people are in a new information ecosystem and they're trying their best. But that's the thing about these platforms over the years: we've- we've asked for no accountability from them. They're not built by librarians who are actual stewards of information. And so it's been- it's taken us a long time, at least the last decade, to get to the moment where we ask more from these companies. We're asking, we want access to the truth.

We want it to come up first. We want more public interest content. And so someone like Rogan really straddles the line because he reaches so many people, and he described it as a juggernaut. "It's out of his control now," but really, it's not. It is in his control. It's well within Spotify's control. So that's kind of the idea then, because what I was saying is I'm generally more concerned about the algorithm than I am about the individual, because if the algorithm can earn your trust, it will place you into places where, you know, you assume that there's a gatekeeper.

It's sort of like the news, when it's in the New York Times, you assume that there is a gatekeeper there that has vetted this. But in reality, our modern media is kind of a... an information laundering system, where...

where the information comes from gets laundered through the aggregation process or the clickbait process or any of that. And so gossip and rumor become truth and fact, become canon, very quickly. Well, this is interesting because if you think about the history of the internet, the early websites that were really popular, Perez Hilton, right? People were here for gossip and rumor. They weren't here for the truth. Right, right, right.

The news was actually really slow to get online. Dr. Donovan? It's part of the infrastructure of the internet itself.

So to be in this moment where we're demanding more truth means that these platforms are becoming institutions, right, like the New York Times, in a way, where people are asking them to be more accountable to the audiences that they claim to serve. And the Neil Young part of this is really interesting because tied up within it, and very little talked about, is a labor dispute about Rogan getting $100 million and musicians getting, you know, a penny for 350 plays. Oh wow. So within Neil Young, there's a twist. So there's also an economic aspect to this that's very different.

Exactly. And a lot of it gets brought up, you know, in the moment where Neil Young just is like a catalyst for a bunch of different grievances that have been happening in the background, particularly about Rogan, but then also about Spotify having to stand on its two feet and say, "This is what we are." So there have been allegations of racism with Rogan, COVID misinfo, of course, that Neil Young talks about. There are also lots of people that believe Rogan is anti-transgender, right, and has his own opinions about trans people in sports.

But where is the platform at equal volume that allows trans people to counterweight what Rogan is saying, right? And so there's something happening in the information ecosystem where we went from platforms that were supposed to serve people -- Don't you think that ended years and years ago? Because I would point to, let's- let's use this as an example. Let's say you're The Simpsons, right? All of The Simpsons. All of The Simpsons. You're Lisa, Bart, Homer. I'm Lisa. I'm Homer. You can be Homer. OK, I knew what was coming.

You're on Fox, right? The same people that pay you pay Hannity. What is your responsibility, you know, when you say, "Where's the platform of equal voice?" That's the fairness doctrine, not for politics but for social issues, right? But we've never had that. And as an artist, what's your responsibility to the tube that you're in and the company that you're on? And I- I struggle with that. Like, I don't know what you do with that. Like, do we now expect The Simpsons to say we will no longer be a part of this company? Or like, how far do we go with that? It's a big question right now, and I think we're moving from culture wars to content wars in the sense that the way in which these fragmented, fringy opinions start to bubble up and coalesce online makes it seem like there are a lot more people with the same ideas. Yeah, like there were a lot of people that thought Lady Gaga should have got an Oscar nom, and that's not the case.

It's not. Not the case at all. And that's- that's innocent. Leave Lady Gaga out of this. I just want to point out very quickly, that's @jayjurden.

I would rather think about it, not from the perspective of being an artist or a comedian, but from the perspective of owning a technology company. Well, what do you do? How do you protect and provide safety for your audiences or your customers, where, by and large, the customers are advertisers and the users are like the rows of cabbage that just get harvested, right? Like me and you are inconsequential individually? Doctor, did you just refer to me as a row of cabbage? I mean you just — You know you're being cultivated. I think just a single cabbage. Single head of cabbage, not a row. Yeah, you're just one single cabbage. Yeah.

Listen, listen, I'm a sociologist and it's an old reference to a sociological theory by Weber, where it's, you know, like -- Oh, sure, but you don't have to explain it to us, Doctor. We all know Weber. We know Weber. You know Weber.

Yeah, yeah. I didn't know this was a freshman course, yeah. But the idea here is pretty simple, which is to say that if you are a technology company, there's been all this confusion about what a platform is. A platform can mean a place from which to speak. A platform can mean a political agenda, and a platform now can mean a computational infrastructure that delivers content.

And so that designation of a platform is something that we're going to argue about, because we're going to say, is it the responsibility of the person who's speaking to be responsible to these audiences, or are we going to say it's the computational infrastructure, the actual technology itself? And what's interesting about the New York Times example is the New York Times would put the burden or the onus of disinformation on the sources and say these are the people who are most responsible. And we've seen very similar things with Facebook where they're saying, "Well, we don't know what's traveling on this crazy superhighway of information. We don't know where it goes. We don't." And then you see something like Stop The Steal happen and you're like, if not for your technology, these groups never would have been able to get aligned and meet and plan. And so you are culpable or responsible or accountable for these actions. And that's really why we have to understand these platforms as doing the organization and the coordination of things that happen from the wires to the weeds, and it's really important that we understand that.

So what is our vetting system for this? Is it crowdsourcing? Like, I mean, Wikipedia to some extent does that. But you know, is the answer the blockchain of information? God, no. No. That would be wrong. I have just failed out of the Doctor's class.

It's slow and it's reactionary in the sense that what we need to do is actually stem the flow and the tide of the damage that an individual or small groups of people can do. And this is where the problem of platforming really comes in, because if you are able to distribute your information to millions and millions of people and you have no responsibility for the aftermath, the true costs are then paid by journalists that have to debunk it. The true costs are paid by academics that have to research it, and then down the line by anybody who might be harmed by believing, you know, that you can treat COVID with some kind of bleach or light therapy. And so what's important is that we think about platforming differently in terms of the scale. And one of the things that I've really advocated for is that we reduce the scale and the speed by which information travels, so as to be able to do what you're suggesting, which is to have some kind of crowdsourced intervention, or not to let information scale to a huge number of people before we can actually have any evidence that it's true or false. And this is like the misinformation that ends up leading us to runs on the, you know, the grocery stores where there's no toilet paper and we're like, "Why do we have a toilet paper shortage?" And it's because people are reading that, you know, there's going to be a military call to arms, What? and everything's getting shut down again.

It's happening again. Get toilet paper, Jon! Get toilet paper. But that's the thing, it's -- it has these real-world effects.

The wires to the weeds is an important — So in terms of platforming, is engagement, in your mind, a fool's errand? Engage- Do you- do you recommend pulling back, or do you recommend engagement? Because I still believe in engagement, like I've talked– I mean, damn, I had Donald Rumsfeld on my show. Like, how do you learn nuance without engagement, and how do you get understanding without nuance? And I guess that's my fear is that we lose that. Can I say something also about that? Please. There's one more thing that also kind of ties into the story and the narrative of that. For any othered person, for any marginalized group, you don't have the privilege or even the space to not engage with people who might not have your best interest at heart.

There was never a time when women, Black people, or queer people were able to live in America and say, "I don't want to engage." That is a new take on interactions and power dynamics, but it is also a very privileged take and I don't throw that word around all the time because it starts to like lose a lot of its meaning. But I'm a queer Black person from Mississippi I — Wait, what? Yes. Yes, yes, yes.

How did you get past our hiring process? Breaking news. Jon, I want to say Twitter does a lot of bad, but it also does some good. Now, the way that people talk about not engaging with someone is they go, "Oh, I just would never even talk to them." And that's amazing. As long as that person isn't your boss, as long as that person doesn't provide you with housing, as long as that person doesn't provide you with an opportunity to make some money, and as long as that person isn't a gatekeeper for you to have access just on day to day things. So I think engagement for people online shouldn't start to — it shouldn't start to mean engagement in general and in total because the story of America is a lot of people having to engage with other people who do not have their best interest at heart.

And you don't have— it's not a kumbaya moment, and you don't even always have to engage with them in a positive way. But I think there's an ability of a lot of people to say, "Hey, fuck you, that's wrong." And that's still engagement. I think Black people saying, "I don't like this." Black people telling

Joe Rogan, "Hey, man, I know you have friends that are Black comics who would fucking get in your ass over this new compilation that just popped up on the internet." That's a level of engagement that you have to discern and you have to be very intentional and specific when you say, "Oh, I would never engage" because women don't have the ability. Minorities don't have that ability. A lot of people for the longest time haven't had that ability.

Boy, that's — Jay, I think that's such a great point. Right. For the majority of the world, you're right, Jay. It's not a — it's not the privilege of like, "I'm taking my ball and going home."

This is my life. And saying, "Fuck you." Saying "Fuck you" or saying "I don't believe that" or even saying, "OK, this is the study you're going to cite.

This is the study I'm going to cite." That's engagement. Yeah. Doctor? Well, you're all right. Everybody gets an A+.

Oh my God! Hit the air horn. I think there's, you know, there's a couple of things going on here where engagement — on the one hand, we're talking about it as this interpersonal relationship, something where you're saying, you know, "Should I talk to my aunt who's, like, gone down this rabbit hole?", right? And the familiarity you were talking about earlier, about why and when you're willing to give someone a pass and what you're going to engage on and how you're going to have your boundaries, that's really important. We just came out of four years of having one of the most divisive presidencies, one of the most politically polarized moments in our history. Not just because, like, Trump is who he is, but because everybody was called to atone.

Everybody was called to say something. Everybody was called to have an opinion. And social media became the way in which they express that. And then Twitter trends, and other kinds of technologies, and these algorithms really worked on harnessing those and that information and then polarizing it so that you would go one way or another. And remember that the internet itself is highly participatory.

Facebook is empty shelves. Amazon is an empty warehouse. You know, like YouTube is a Blockbuster on a Friday night where you can't get any of the new releases.

You know, there are all these places where people fill in the content, right? And so you have to think about it in terms of, like, how do you measure this participation where everybody is being called to have an opinion on things that they might not have had an opinion on and would maybe in public conversation say, "Gee, I don't know which way to come down on that issue. Seems highly contentious." I don't think that's allowed anymore, to not know what side to come down on. It's not. It's not. It's like — but as you think about it, though, when you start to scale up, the aggregation of all of those opinions makes us feel further and further pulled away from each other, which is why we look to influencers to set the kind of terms of the public debate that we're going to have. And so someone like you engaging with someone like Rogan, people are going to argue that he or you are punching up.

I'm reminded, of course, of the Crossfire moment where you were just like, "Can you take this work more seriously?" You know, to Tucker Carlson. But even that was— That was an important moment, though. But it was much, much misinterpreted because everybody said it was about civility.

It had nothing to do with civility. It had everything to do with honesty. I don't care if people yell at each other. I just want them to be honest. Yeah. You could see a trajectory in the hardening of his position over the years as he became more and more wedded to being an opinion maker, more and more wedded to being someone that has these really outside-of-the-mainstream positions on race.

I can see the Newsweek article now. "Jon Stewart Responsible for Tucker Carlson." It's a villain origin story. My God! Well, it kind of is. You know, but it's to say that media is part of the issue that we have to address. Would you have talked to Rumsfeld, Doctor? I have a brain trust of people that would never, ever let me do a public show with Rumsfeld because their jobs are on the line.

Because you don't have a Ouija board? Is that why? Last question, Doctor. Here's the last question. We have a show and we have a platform. So what would you suggest for us as measures to guard against, you know, even accidental harm, but still maintain kind of like my belief in engagement, which, I think unfortunately, I'm going to end up, you know, having forever? Yeah, I think that's OK, and that's, you know, be an advocate for the truth. Mm hmm. What brings us towards clarity is hearing from other people and understanding from other people, but don't get hoaxed.

Going through the vetting process and making sure that the person isn't just trying to turn a dime on colloidal silver or whatever supplement of the day. You know that they're not going to come out here and be like, "I got a great new treatment. It's called horse tranquilizers. You won't feel a thing," right? You know, just do that background research and always try to tell the impact story. And this is something that I tell journalists all the time

which is to platform the people who are harmed by this stuff. Platform the people who don't have voices in the debate, or the people who are struggling with how to understand the world around them and what's going to matter.

Thank you so much, Doctor, for taking the time. Really enjoyed the conversation. Such a fascinating world. Well, always welcome to clarify things. And if you need me at a moment's notice, I'll be here. Thank you very much for all that.

Thank you. And I'm so glad not to have gotten detention. Well, I'll see you in my office at 5:00.
