The Future, This Week 12 Jan 2018: how bots use us to make money online and a recap on algorithms
This is The Future, This Week on Sydney Business Insights. I'm Sandra Peter and I'm Kai Riemer. Every week we get together and look at the news of the week. We discuss technology, the future of business, the weird and the wonderful, and things that change the world. Okay, let's roll. This is The Future, This Week. Sandra and I are on semester break, but we pre-recorded this story before we left. Today we discuss how bots are using us to make money online, and a recap on algorithms. I'm Sandra Peter, I'm the Director of Sydney Business Insights. I'm Kai Riemer, professor at the Business School and leader of the Digital Disruption Research Group. So Sandra, what happened in the future this week? I actually have no idea, I've been on holiday. But there are a couple of stories we wanted to pre-record. These are stories that have appeared over the last few months, and we didn't get a chance to discuss them on the show because we've only got three stories every week, but we thought they were really important to do before the end of the year. So this one is from Medium, and it's titled "Something is wrong on the internet". This story is really depressing, but do read it. The story is written by writer and artist James Bridle, and it comes on the back of another article that appeared in The New York Times, which inspired James to dig a little deeper and really get into this phenomenon, which we will try to unpack for you. It's a disturbing one. Much like James does not unpack every detail in his article, we're not going to unpack every detail in our discussion. But it concerns the way in which content is created, in particular on YouTube, and then presented for advertising dollars. And the particular phenomenon he looked at is content that is created for small children, basically, and presented on YouTube for kids. So we're not talking here about normal kids' videos, which are weird enough as it is, you know, unpacking toys endlessly
looking at surprise Kinder eggs, although that I do find quite soothing. Yeah, nursery rhymes and finger family songs, certain trends and crazes which then lead to hundreds and thousands of videos being produced. That is weird as it is, and the educational value for children of these videos is highly questionable. But the author is actually looking at something much darker that comes on the back of this. So here's one: a number of these videos actually have really strange names, things like "Surprise Play Doh Eggs Peppa Pig Stamper Cars Pocoyo Minecraft Smurfs Kinder Playdoh Sparkle Brilho", and I want to stress, this is the title of one video. Yes, and there's also "Cars Screamin' Banshee Eats Lightning McQueen Disney Pixar", or "Disney Baby Pop Up Pals Easter Eggs SURPRISE". What the author did was then go on and look at these videos and figure out just how odd they are. They're a combination and mash-up of images, sounds, music, sometimes entirely animated, sometimes acted out by people, or people with things. Here's a short clip from one of them. So let's try to describe what is happening in this video. This video is a mash-up of the finger family song, which appears in hundreds of videos. The visuals, however, are characters from Aladdin changing their heads, and then from the side there's this little girl from Despicable Me coming in, crying or laughing when the heads are not matching or matching the body. And it all happens randomly. Yeah, it's a completely random mash-up, and it's quite clear that these videos are computer-generated by bots. No human probably had a hand in creating those videos. And the author says there's not only hundreds and thousands of channels which publish
these videos, but there are thousands of these videos which sometimes have millions of hits, a lot of which are probably, again, by bots, to beef up the numbers and make sure that these videos are being listed in the search results and in the playlists that are presented to children. And let's just be clear: these videos range from the just bizarre, like the example that we mentioned, to videos that are actually quite disturbing, or violent, or just completely inappropriate, things like Peppa Pig drinking bleach instead of naming vegetables, or Peppa Pig eating her father. So there's a phenomenon that mixes a number of different kinds of videos. There are satirical or outright trolling and abusive videos that show Peppa Pig in all kinds of different inappropriate situations. Then there are these computer-generated mash-ups that we've just discussed. But there's also a disturbing number of videos that actually involve humans. There's a phenomenon where parents are using their children to create videos of very questionable value, one of which is mentioned in the article, where children are actually shown in quite abusive situations, being subjected to pain or other uncomfortable situations: a channel by the name of Toy Freaks, which has actually since been delisted by YouTube and deleted on the back of this article. But again, there are hundreds and thousands of these channels, this is not one person, there are hundreds and thousands of these videos. But the even weirder ones are where a group of human actors act out scenes that quite clearly comply with a set of keywords generated by an algorithm to optimize the listing in search results, to maximize advertising dollars. And so these humans are just acting out random situations with toys, finger puppets, crawling around the floor and dressing
up, where what the humans are doing is basically just creating content that matches those keywords. Let's try to unpack this and figure out what exactly is happening here, and why. Because as we said, this is quite disturbing, and we gave the example of the children's content that is being showcased in the article, and that was the topic of the New York Times article as well, and there's also an article in The Washington Post that focused on the same implications. But again, with very little effort you could basically make the same argument for videos that feature white nationalism or violent religious ideologies, or they could be about climate change denial, or about conspiracy theories, or about all sorts of fake news. So we think this is quite important to discuss. So at the heart of this problem is a combination of monetizing content and automation. If we take a look at YouTube: YouTube at some point allowed content producers to monetize their content by having ads placed alongside, or now inside, those videos. And while there is a legitimate business model there that allows people who create content for YouTube to make a living, and advertisers to find an audience, and by itself that is a legitimate business practice, the problem gets out of hand once you add bots, algorithms and automation to the game. And we've talked about this previously, and we called it colonizing practices that basically sit on top of a legitimate practice, which is content production and viewing and the whole experience of YouTube, where the advertising practice then basically takes over and it all becomes about generating advertising dollars, and the content is then produced for that very purpose. And that can be parents subjecting their children to all kinds of crazy and more and more extreme situations to make a living and generate income from their YouTube endeavours. But it's also people
employing bots to automatically create content that is then automatically being keyworded and listed, just to harvest advertising dollars. And as the article makes clear, some of these videos make very little money by themselves; there are videos that only have four thousand, five thousand views, so not huge numbers. But if you think about the number of these channels, in
the aggregate these things actually make real money. And because you can produce these videos at near-zero cost, it doesn't really matter how many cents each video brings in if you can create hundreds of thousands of these videos. It's a so-called long tail phenomenon: it all adds up to a worthwhile endeavour. And we also want to make clear that this is not a phenomenon specific to YouTube. We have seen this in other places, like Amazon for example, where merchandise products such as t-shirts, cups, mugs and smartphone cases are being offered that are quite clearly being generated by just grabbing either text blocks off the internet or pictures from Google, which then leads to bizarre products such as a smartphone case that shows an elderly man in diapers, or t-shirts that have "Keep Calm and Hit Her" on them. Products which, the author says, might not necessarily have ever been bought by anyone, or even created by anyone: no one actually sat down and thought this would be a good idea to create. No, rather, algorithms have generated these images and these t-shirts as offerings, then listed them on eBay or Amazon. So again, this is a phenomenon that has zero cost in its creation but might potentially lead to income, on the back of something that happens without any human intervention. So it's quite tempting to dismiss these sorts of examples as examples of trolling, or of algorithms gone awry, or even of parents not minding their children enough to not watch these videos. But there's actually something much more disturbing happening here. The problem is the systematic nature with which this happens. These are not isolated cases, this is happening a lot, and this content is infesting the lists and channels of legitimate content. So it's presented as if it were legitimate, BBC-produced Peppa Pig videos, or content that at first glance is indistinguishable from legitimate children's
content, for example. So it's very hard for parents to actually pick up on this in the first instance. And you could argue: why are children watching YouTube that much? But given that YouTube is actually running a platform called YouTube Kids, there's a reasonable expectation that this content is actually appropriate for children. So there's the systematic and the automated way in which this is happening. The deeper issue is, however, that the reason these problems are occurring in the first place is the revenue models that platforms like Google and Facebook and YouTube are built on, and the fact that there is very little that these companies are willing to do, and very little that they can actually do, to prevent this from happening again. And the reaction of Google in regards to its YouTube platform is quite telling. They said: as soon as we were notified, we took action, and in this case delisted this one YouTube channel. But a reaction like this, waiting for someone to flag inappropriate content that is then removed, combined with the extent to which this is happening as mentioned in the article, suggests that the platform providers are largely accepting of this practice. They can't do anything about it because it's systemic and baked into their business model, and what they basically say is: we've created this monster and we all have to live with it, and if it creates collateral damage then we'll do something about those cases, but we basically cannot know everything, and things we don't know about we can't do anything about. So they're basically blaming the algorithm and pretending innocence, which is incidentally a story we have talked about previously and which we will recap today. It's worth noting, though, that companies like Google and Facebook are actually monetizing this, this is their business model. So even if they were to accept some responsibility for
what is going on, what we need is full responsibility: they are complicit in building this, because this is their business model. So the problem is in the very platform nature of these systems. First of all, what happens is a decoupling of content from its source. If you go to the ABC or BBC website and you watch Peppa Pig, then you can be relatively certain that what you're watching is legitimate content, it's age-appropriate, it's
what you'd expect, right? What these platforms do, however, is decouple the content from its source. You're no longer watching Peppa Pig on the BBC, you're now watching Peppa Pig on YouTube. And because anyone can use the keywords "Peppa Pig", and there's a lot of pirated content and mashed-up content, what happens is that legitimate content is being mixed in with fake news, with pirated material, with mashed-up material, and that creates the problem in the first place. Add on top of this the algorithmic management, or the automation, of how these things are being presented, in your Facebook stream but also in "play next" lists in YouTube streams. And the same algorithms that allow these things to bubble up would also allow legitimate content creators, who might not have a lot of views or who are trying to enter the platform, to bubble up to the surface, or might even be crowding them out, since many of these videos, as we've seen, are very well optimized for keywords and for maximum exposure. So the decoupling of content from its source, the automated keyword-based presentation of the content, plus the incentive to monetize content with advertising, then creates incentives to game the system and create content for the sole purpose of either spreading news for some ulterior political motive, or, in this instance, just reaping advertising dollars with content that has zero value and zero cost in production, that is purely geared towards gaming the system and making money, which then creates the phenomenon that we've been discussing. So there's no real solution to this, because it seems to reside in the very nature of how these platforms work. And hiring more people, as in the case of Facebook, and we discussed this before, or even enlisting Wikipedia to weed out fake news, are only patches that fix up some of the symptoms, but they don't go to the heart of how these platforms
work. And awareness can only take us so far. We actually live lives where we do outsource some of our decision-making to algorithms that
give us recommended videos or recommended playlists or other things, so as long as we do that, awareness can only take us so far. The last issue that the article brings up, which I thought was worth mentioning, is where the author says: "What concerns me is that this is just one aspect of a kind of infrastructural violence being done to all of us, all of the time, and we're still struggling to find a way to even talk about it, to describe its mechanisms and its actions and its effects." And so the story we've discussed today on The Future, This Week highlights just how difficult it is to unpack what is going on, for people who are just users of these platforms, consumers; how these platforms can be skewed by profit motives; what the mechanisms behind them are; how generally inscrutable many of the services that we use today are. And this is nowhere clearer than in the reactions of the companies themselves, the reactions of Facebook or Google, in actually pinpointing the problem and doing something about it. And while this is all we have time for for this story, we thought we'd rerun two stories that are immediately relevant here: the first one where we discuss how Facebook is planning to enlist Wikipedia to solve its fake news problem, and the second one where we discuss whether platforms can blame the algorithm and pretend innocence. "Facebook taps Wikipedia to fix its fake news problem for them." This story comes from Mashable, and it's really an update of a continuing story that we've covered on a number of occasions. It follows directly from last week's "the algorithm is innocent" story, the story that Google and Facebook are pointing to their algorithms as the culprits in the fake news issues around Facebook posts and the US election, inappropriate ads that appear next to YouTube videos, or Google inappropriately highlighting
links to 4chan in its Top Stories in the aftermath of the Las Vegas shooting. So the idea is that the algorithms frequently come up with things that are, let's say, less than optimal. Facebook tried to solve this problem last week as well; the solution they came up with was to hire another thousand people to actually manually remove some of this content, or tag some of this content. This comes on top of the seven and a half thousand people they had previously hired in various rounds to fix similar issues around inappropriate content, including suicides and racist content and so on and so forth. This week we have a new solution. So Facebook has noticed that there is one organization that has successfully tackled the fake news or fake content problem, and that is Wikipedia, a not-for-profit
community of more than a hundred and thirty thousand moderators, and millions of users who contribute to creating the online encyclopedia, which has a mechanism that works very well in weeding out problems and biases from its content base. So is this a case of "if you can't fix it, outsource it"? And where does the burden now lie? What Facebook has done is say that they're going to attach a small "i" button in the news feed, next to the articles that you see, and you will be able to click on that and it will take you to the Wikipedia description of the publisher. You can now follow this button and read up more on who the source is and what their background is and so on. So in a world where we are all struggling for attention, and we're spending less than two, three seconds on videos and articles, the burden is now on us to go to a Wikipedia page and look up the relevant information. So Facebook in this way is not actually solving the problem, they're absolving themselves by putting the burden on the user, and actually on the Wikipedia community. And it raises two big questions. The first one is: will anyone ever click on those links? Will users understand this? And, as you say, when we're browsing through Facebook, would we actually want to leave the Facebook stream and go someplace else to read up on the publisher? Is this even realistic? But also, what would happen if this catches on, and the publisher information on Wikipedia becomes the source for weeding out the fake from the real? And that's not what it's for. Wikipedia is not perfect, it's not really aimed at curbing fake news; it also relies on volunteers, real-life people who have some time to spend on this, but not all the time in the world. So what happens if we redirect the burden to this not-for-profit community? So I could just create an article on Wikipedia for a fake publisher and make it sound really, really good, and then propagate
my news on the Facebook stream, and when you click on it, you have this great information on Wikipedia, and now it is up to the moderators to actually patrol and monitor all the things that might be popping up in their community, to prove that what happens on the Facebook platform is legitimate. So we'd be pushing the burden onto a community that is not actually equipped for this, and whose purpose is not to solve the Facebook fake news problem. Which takes us back to the real problem, which is the business model that companies like Facebook are built on. As long as the business model relies on me clicking on more and more things, the problems inherent in these algorithms are not easily solved or addressed. So is this just a media stunt, where Facebook can say: we have done something, there is this button now? Because
if that business model is to work, Facebook has no incentive to make users click on these links and leave the Facebook stream, where they receive information and ads and where Facebook can actually monetize the user. So is this just something they do to be seen to be fixing the problem, or will this actually work? That's the question here. The next story is from The Outline, and it's called "The algorithm is innocent". The article makes the point that Google and Facebook, who have been in the news a lot recently for all kinds of different instances of presenting inappropriate content, are deflecting responsibility onto the algorithm. They basically say: the computer did it, it was none of our making, the computer malfunctioned, the algorithm presented inaccurate content. So Sandra, what is that about? On Monday for instance, when the worst mass shooting in US history took place, if you were to google Geary Danley, the name that was mistakenly identified as the shooter who killed a lot of people in Las Vegas on Sunday night, Google would present quite a number of threads filled with bizarre conspiracy theories about the political views this man had, stories sourced from the website 4chan, which is basically an unregulated discussion forum known for presenting all kinds of conspiracy theories and not necessarily real news. And the point was that Google presented these links in its Top Stories box, which sits right on top of the Google search page. Google went on to say that "unfortunately we were briefly serving an inaccurate website in our search results for a small number of queries, and we rectified this almost immediately once we learned about this mistake". In an email sent to the author of The Outline article, Google also explained the algorithm's logic: this algorithm had weighed freshness
too heavily over how authoritative the story was, and the algorithm had lowered its standards for its Top Stories because there just weren't enough relevant stories it could find. So the news was too new, essentially, for the algorithm to find other relevant things that it could present, or so the story goes. So it was the algorithm's fault? Absolutely. And really, this wasn't the first time we've blamed the algorithm. Back in April, the article mentions, FaceApp had released a filter that would make people more attractive by giving them lighter skin and rounder eyes, and it was called "an unfortunate side effect" of the algorithm, not intended behaviour. So it was an inbuilt bias, that attractiveness was essentially associated with whiteness. And of course there are the big stories of the past couple of weeks, where Facebook had allowed advertisers to target people who hate Jews, in what was called, again, a faulty algorithm. And we have also discussed on the podcast previously stories around YouTube presenting inappropriate ads on videos. And let's not forget the whole story around Facebook and the US election, where Facebook is frequently being blamed for taking an active role in presenting biased news stories, fake news, to potential voters, which may have played a role in the election outcome. And also that Facebook had said that this idea was crazy, that fake news on Facebook had influenced the outcome of the election, but then came back recently saying that they are looking into foreign
actors and Russian groups and other former Soviet states, as well as other organizations, to try to understand how their tools are being used, or being taken advantage of, to obtain these results. So Facebook, Google and others working with machine learning and algorithmic presentation of content are frequently blaming their algorithms for these problems. They're saying: it wasn't us, it was a faulty algorithm. So let's examine that idea of a faulty algorithm. What would a truly faulty algorithm be? In order to determine this, let's remember what we're talking about. Traditional algorithms are definite sequences of steps that the computer runs through to achieve a result, to bring the software from one state to another. So it's a definite series of steps, which we would call an algorithm, and we can determine when it malfunctions, because we don't end up in the state we intended to be in. But machine learning works differently. Machine learning is probabilistic pattern matching, which in this instance did exactly what it was supposed to do: present certain results that are relevant to the topic on some criteria, semantic nearness or some keywords that it elicits. And so the 4chan thread was relevant because it was talking about the same topic. These algorithms are not designed to exclude faulty information or deliberate misinformation, nor are they built to account for bias. No, and in fact they don't actually understand what they're presenting, they just present stuff
that is relevant to the audience, as measured by: will someone click on it? So relevance is measured after the fact. I'm being presented with a set of links, and when I click on those links, the algorithm will learn from this, so that next time it presents something to Kai that is similar to what I just clicked on. And so over time the algorithm improves what it presents to me, to elicit more and more clicks, so that I like stuff, so that I share stuff. It also, in the case of Facebook, presents me with potential friends, and if it presents the right people, I might create more connections. So really, what the algorithm does is optimize engagement on the platform: links, shares, likes, clicking on ads, and therefore revenue for the company. So first of all, the algorithms are not per se faulty. They are doing what they are designed to do. We just happen not to agree with the results they are presenting, but they are working pretty much as they were built to work. Yes, the problem is not so much that the results are inaccurate,
it's more that they are inappropriate, and the algorithm has no appreciation of what is appropriate or inappropriate, because it doesn't understand our world, it doesn't live in our world, it doesn't know anything about culture, about norms, about what is right or wrong. In other words, as someone said on television, it doesn't give a damn. So the question is: how do we fix this? How does Google go about fixing things? And first of all, can you fix this? So, you can't fix the algorithm. The algorithm does exactly what it's supposed to do: it does pattern matching and it presents results that are relevant. But it's also essentially a black box, and we've discussed this before. You don't actually know how the weighting in the algorithm works and what will come out at the end; the only thing you know is that it will present something that is probably relevant to what you were looking for. So the reason it would be really hard to fix this is that you don't exactly know what type of information you should change, and also the data that your model is trained on is biased to begin with, so how do you go about changing that? And we're not talking about algorithms that were trained with a definite set of training data that you could change to eradicate or minimize bias. These algorithms learn on the fly, they learn constantly from what people are clicking on. So people who are clicking on links associated with a political leaning will then be presented with more of those things that they are potentially clicking on, which also leads to the echo chamber effect, where people are presented with things that just reaffirm their beliefs, and we talked about this previously. So the whole idea is not for those algorithms to be unbiased, it's precisely to exploit bias, to present things that are relevant to people, to have them click on more things.
Facebook's solution to this, and there's a good article in Business Insider looking at this, and as always we will link to all the articles in our show notes so you can explore all the resources we had a look at: Facebook's answer is to throw bodies at the problem. So on Monday, Facebook announced that it would hire another thousand people in the coming months to monitor ads, for instance the Russian ads linked to fake accounts that we saw during the US elections, and that these thousand people will remove the ads that don't meet its guidelines. If this sounds a little bit familiar, it's because Facebook has done this before. If we remember the sad incidents of live-streamed suicides and live-streamed murders that we've seen on Facebook: this is when Facebook said that it would hire about three thousand new people to monitor some of the content, on top of the four and a half thousand people it already had. So we are now at over eight thousand people doing the monitoring. Are these the jobs of the new economy? Sadly, yes. So what we're talking about now is a system where a vast array of algorithms is in charge of discerning who gets to see what on Facebook, what search results are being presented on Google, the kind of ads that are presented alongside YouTube
videos. And because those algorithms are not really very intelligent, they are very good at matching relevant content to search results and to people's known preferences, but they have no appreciation of appropriateness, of things that might be out of line, things that might be illegal, things that might be offensive. So you have to add this human layer of judgment, often paid-by-the-hour, low-paid jobs, in charge of weeding out the most blatant, obvious mistakes that those algorithms are making. And intuitively, this idea of hiring more and more people to throw at the problem seems like a reasonable, common-sense solution. But if you take a closer look, and the Business Insider article also takes a closer look at this, there are quite a few things that we would need to figure out. Things like: who are these people we're hiring? Are they contractors? Where are they from, are they in the same places, do they understand the context they're supposed to regulate? On what basis exactly do they make that judgment? Is there training, are they taught to look for specific kinds of things? Where does reasonable filtering end and inappropriate censorship start? How does this then inform how Facebook's algorithms and machine learning processes work; when do they start flagging things that they weren't flagging up until now? Are any of these organizations then working with government authorities, or with other people, to figure out what the standards are, and how do we develop the standards by which this would happen? So there are a whole bunch of questions that remain unanswered. And yes, this is a step forward, but probably not an ultimate solution to the problem. And the bigger question is: do we have a good understanding of what the problem is? Because eradicating so-called bias, or diversity, in search results is not the ultimate solution to every search we do on the internet. Absolutely
Not so there are a couple of other really good articles, by William, Curtin who also, wrote the outline article, and he. Gives a couple of really good examples for, instance if you do a Google search for Flat Earth it should give you a wide variety of stories that the earth is not flat but also that there are unfortunately. Still a lot of people out there who believe the earth. Yeah and you might want to actually look, up the flat, earther movement, and what, ideas the. People are into however. Same. Author did, I search for the, Great Barrier Reef and. The. Top story is presented, by Google. On the Great Barrier Reef were, some, from the Sydney Morning Herald around, the coral crisis, and from, Wired, magazine talking. About the crisis, of the Great Barrier Reef but, the other story was a Breitbart, news saying. That the coral reef is still not dying, that nothing is happening and, that this is all a great conspiracy so, the. Idea of what is a point, of view versus, what is probably. Complete. Nonsense because it just goes against, all the science that we have on the topic is it irresponsible. For Google to attach. Some, kind, of implicit, credibility. To a story, that is pushing these things around the coral reef which, goes back to the old problem, that the algorithm, does not really. Understand, the intention. That goes with searching, for, a particular topic, and also that it cannot really distinguish, between real. News fake, news between. Scientifically. Sound facts. And just. Opinion. Or propaganda. So. Where does this leave us first. There, is a, huge, problem associated with bias in algorithms. And it, has a number of consequences some, of which we spoke about on Q&A, that have to do with how we hire people or. How we grant, people parole, but there's this whole other range of consequences, of bias in algorithms. Second. Is the language that we use to talk about this we, talk about faulty. Algorithms, doing the wrong thing, so. We anthropomorphize. 
these algorithms, as if they had agency, as if they were actors that would make those decisions, and therefore would make mistakes or apply the wrong judgment. And incidentally, that allows us to absolve ourselves, to just point to the algorithm as the actor who made the mistake. But it is our job, or indeed some of these companies' jobs, to get the thing right. Yes, but here I want to interject: what does it mean for these companies to get things right? What are they trying to do? What are they optimizing on? If we're looking at what Facebook does, essentially they're in the business of connecting everyone, of creating
engagement on the platform. They're not really in the business of providing balanced news. What they are optimizing is clicks, ad revenue, connecting more people, because that leads to more clicks, sharing and ad revenue. The problems of fake news or bias and imbalances are basically a sideshow for them; an unfortunate side effect of the thing that they're trying to do, of creating more connections and engagement. It is something that they have to deal with, but it's not their purpose to actually be a balanced news outlet. And neither is Google. For them it's much the same: it's actually about advertising, and you drive advertising by exploiting people's world views and preferences and, yes, biases. The problems that we're discussing are emergent side effects that they have to deal with, and they do this by layering filters of people and other algorithms that try to weed out the most obvious problems. So are you saying that because it's not these companies' jobs, it absolves them of any responsibility? Absolutely not, that's not at all what I'm saying. What I'm trying to say is that we need to understand what they're trying to do, to then realize how these problems come about, and maybe ask the question of whether they are actually optimizing the right thing. Whether, for those platforms which have become the Internet for some people who spend most of their online time on platforms like Facebook, we need some form of at least awareness in the first instance, or regulations or some standards that will provide incentives for these companies to actually deal with the problem, not as something that happens after the fact, but in a way that actually removes the systemic issues that create the problem in the first place. So at the very least we need to talk about these issues, have a public
conversation about them, and be aware that they are happening. And I'm sure we will have to revisit this issue, because those problems are not going away.

This was The Future, This Week, made possible by the Sydney Business Insights team and members of the Digital Disruption Research Group. Every week right here with us is our sound editor Megan Wedge, who makes us sound good and keeps us honest. Our theme music was composed and played live on a set of garden hoses by Linsey Pollak. You can subscribe to this podcast on iTunes, SoundCloud, Stitcher, or wherever you get your podcasts. You can follow us online on Flipboard, Twitter, or sbi.sydney.edu.au. If you have any news that you want us to discuss, please send them to sbi@sydney.edu.au.