#MLBerlin—Closing Talks

Yeah, so how did we do that? On the one side there's a playground, on the other side there's an audio engine. We used many different sensors, applied them to the playground, and attached them to DAW software, using instruments from Native Instruments, for example Reaktor. For the sensing, on the one side some people of our crew did the hardware hacking, and on the other side the process was to map the values in the right way to the musical elements, because if you have a sensor, what you get out of it is not always right for music.

We can also see this project as a sound interface, where every toy of the playground is transformed into basic elements of sound, just like the modules of a modular synthesizer. So we created a step sequencer using the steps of the stairs, as you can see in the picture, and we used the slide as an envelope switching from high to low. The wheels of the tricycle are used as rotary sliders, and the spring is used as a sound filter. The instruments can stand on their own, or they can be combined to create a live performance. The sound composition is made from layers, one on top of the other, with different textures and rhythms that we can easily understand, creating in this way a super fun experience which is not only for kids. Already when we were testing, we were greeted by children and pedestrians who were looking at us and wondering what's going on.

So what's unique about our modular playground kit is that it's inspired by the common elements of the playground. That means you can attach those sensors in your neighborhood playground and have a sonic layer on top of the existing playground. Also, we are not trying to make you do an artificial movement.
It's just the general movement that you do in the playground anyway that gets translated into the basic components of the sound, and as you play in the playground, you start to understand how each component of the sound combines to make music.

Of course we believe that the project will have impacts on various domains, but we are particularly happy that we're bringing electronic music, which is traditionally associated with dark clubs and that kind of atmosphere, into a bright urban space. We're also very happy that people who have no experience in music making can be part of the creative aspects of music just by having fun in the playground. Also, we are really happy that we make people move. And last but not least, we are really excited to see that all the hipster parents, who usually just sit at the corner of the playground not doing anything, just watching their kids, can be part of the playground and have an inclusive experience together.

So what's next? We would love to bring this to the world, so please come by our booth up there after this presentation, and if you just want to play with the instrument, feel free to come. We also have a website, which you can find on the hackathon site.
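The sensor-to-sound mapping the team describes (noisy sensor readings scaled and smoothed before they drive a musical element, for example the spring driving a sound filter) can be sketched roughly like this. This is a hypothetical illustration, not the team's actual code: the value ranges, the names, and the choice of MIDI CC as the target are all assumptions.

```python
# Hypothetical sketch of the playground's sensor-to-parameter mapping:
# raw sensor readings are noisy and span an arbitrary range, so before
# they can drive a musical element (e.g. a MIDI CC value controlling a
# filter cutoff in the DAW) they are clamped, rescaled, and smoothed.
# All ranges and names here are illustrative assumptions.

def map_sensor(raw, in_lo=0, in_hi=1023, out_lo=0, out_hi=127):
    """Clamp a raw reading to its expected range and rescale it to a MIDI CC value."""
    clamped = max(in_lo, min(in_hi, raw))
    normalized = (clamped - in_lo) / (in_hi - in_lo)
    return round(out_lo + normalized * (out_hi - out_lo))

class Smoother:
    """Exponential smoothing to tame sensor jitter before it reaches the synth."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha   # higher alpha = faster response, more jitter
        self.value = None
    def update(self, reading):
        if self.value is None:
            self.value = reading          # first reading initializes the filter
        else:
            self.value = self.alpha * reading + (1 - self.alpha) * self.value
        return self.value
```

For example, the spring sensor's readings could be passed through a `Smoother` and then through `map_sensor` before being sent to the DAW as the control value for the sound filter.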

Thank you. I think we start with a huge applause for all of the teams. What amazing projects. Great work, everyone.

While we're setting up for the panel: there were a couple of things that I saw in these projects that really resonated with me, and that I think resonate with the Media Lab. One was that we really believe in diversity. We think diversity makes things more interesting, makes things more creative, makes things better, and I think you could see in a lot of the teams that there were very different backgrounds working together. It's not always easy when you bring together people who come from different worlds, but we think that the final results are usually worth the tension that you have to work through when you're bringing these different backgrounds together.

And then secondly, one thing that struck me is that a lot of the projects were not just developing technologies; they were actually developing technologies for other people to develop things. Even with the playground, it would have been fun to just turn the playground into a musical instrument that people can play, but the way I understood it, it's actually a modular set of components that I could use to turn my own playground into a musical instrument, and then the kids could change the instruments, so they're not just playing what someone else has designed. I think that's very interesting about the Media Lab: often we think about technology as this generative force, where we design technologies for other people to design other technologies or other things.

I think we need a second to set up the shots for the panel. I have a slightly odd request for all the participants who are still logged into the workshop Wi-Fi: can you please log out? The reason is we're going to livestream the panel, and we need the bandwidth. One
of the limitations of doing this in a church is that there's no fiber optic cable running into the basement of the church. We like to make our panelists work before they get to speak. Okay, hacking the panel already.

So, as a huge Kraftwerk fan myself, I could not resist using the cover of the Mensch-Maschine album to introduce the theme of the panel, which is Mensch, Maschine und Material: human, machine, material. It's something that the Media Lab is, I think, fairly well known for: technology and innovation. Over the course of the last 32 years we've developed a lot of technologies and spun out a lot of startups. I'm losing my first panelist. But there are lots of places that are doing technology and innovation; I think what's been more unique about the Media Lab is actually that we are interested in the relationship between technology and human systems. So it's not about the technology per se. Originally it was maybe about the interface between humans and technology, and then in a second wave maybe it was networks, the communities the technology enables and what that does for humans. And I think now we're getting into a stage where the material per se is becoming all kinds of things: it's bio, it's genetic engineering.

One of the topics that we think about a lot, and that these three panelists are thinking about and working on, is AI. There's a lot of interest in artificial intelligence as a material that we could be designing new kinds of technical but also human systems with. So I want to spend the next 25 or so minutes exploring some of the issues around this relationship between humans and machines, and the material of AI: what kind of possibilities this material offers us, but also what kind of responsibilities it demands of us. And I think I'd like to keep
this quite informal, more of a conversation. I've encouraged the panelists to interrupt me and to interrupt each other, to have a lively discussion, and I also encourage you: if you want to ask a quick question, you don't have to wait till the end. There are some microphones. I would ask you not to make long statements, but if you want to throw in a little provocation or a little question, feel free to signal us, and then we'll make sure to pull you into the conversation.

Maybe I'll start with Joi. You know, the Media Lab in its role as an institution that trains, or educates, designers and developers of technology...

Yeah, you don't have perfect control and you don't have perfect predictions. I just had a baby, I have a 15-month-old baby now, so I'll use a baby metaphor. When you give birth to a child, you have control over what the child sees and eats and does, but you don't have complete control over what the child becomes or what they actually do. But you're still responsible for the child. So if you create a font, if you create a machine, if you create a smell, you are responsible for the aesthetic, educational, environmental, and societal impact of the thing that you make, but you don't know exactly what's going to happen. The trick, I think, is to be very aware of the contexts of all of the systems that it will hit, and to try to instrument your interaction in an iterative way, so you can watch the child. When you see the child running around and you think it's about to do something, you can try to intervene. But I really do think it's an ongoing relationship; it's like introducing a new life into a complex system.

And coming back to the topic of AI: I know that you've been involved in a class about AI, and it's actually not a class about AI, it's a class about the ethics of AI. So I'm curious: why that framing, where did that come from, and how is it different from what other people are doing?

The class that I'm teaching with Jonathan Zittrain is specifically Harvard Law School, Harvard Kennedy School of Government, and MIT, and it's lawyers, philosophers, engineers, and policy people. But
I think that you shouldn't be allowed to make laws or policy without understanding technology, and you shouldn't be allowed to make technology without understanding ethics, philosophy, law, and policy. So our goal is to get the engineers to understand the other side and vice versa, and our success has been that some of our best engineering students have gone on to start the law degree program. It's harder, I think, to get lawyers to become engineers.

I was going to go to Jörg next, but maybe I'll already change the flow, because Julia is looking at me. I'm looking at you, yeah: "You shouldn't be allowed to make laws..."

I mean, if this is the standard, that I have to understand AI and technology in order to make laws, well, I guess a lot of the people who elected me probably didn't think about that at all. And if this is the goal that we aspire to, that we want to be governed by experts, then we would probably need an entirely different approach to democracy than we have today. Today, at least in the ideal sense, it's a lot more about trust: who are the people we trust to actually seek out the expert information that they need in order to make informed decisions?

Sure. When I said "understand", I don't think it means that you have to be able to code. But on the US Supreme Court, the justice who just retired didn't use computers. He only wrote with a pen, he didn't like electric lights, and he was judging things that had to do with the internet. And we have a lot of that: when I was fighting with Lindsey Graham about the internet, he had never used email. So you have to have at least a sensibility to understand the architecture, otherwise it doesn't work. I don't think that you have to be an expert, but you have to understand it; that's a slightly different thing.

I mean, perhaps we could say that only young people should be allowed to design schools, and only old people should be allowed to design retirement homes, and only people who use the internet should be allowed to govern the internet, and that might actually be a good approach. What I see in my work a lot is that we make quite general rules that apply to the internet as a whole, but that are designed to only work with Facebook and YouTube, because the people who are making the laws only know Facebook and YouTube, and perceive them as being the entirety of the internet. And when I ask them, well, how would this new legislation,
that would require you, for example, to scan every uploaded piece of work for copyright infringement, how would that work for Tinder, or how would that work for GitHub? Then of course I need the person I'm talking to to know what Tinder or GitHub is actually about, and ideally to have used it at some point, and that is very much not the case. But then you could wonder: is this because our politicians are too old? Is it because they don't have enough expertise? Because we're asking our politicians to do more and more with less and less resources, and I think both of those could be part of the problem.

This was great; let me jump back to Jörg, because I know you're thinking about the types of structures and systems that we would need to support an engagement with things like AI at the level of a state. We've talked a little bit about who should be allowed to govern, who the experts are, who we trust, and how we might prepare people to take those responsibilities. What are some of the other pillars,

let's say, of an approach to this?

Well, I think my first point would be that there needs to be an understanding on both sides. AI is politics, and those coding and designing need to understand that they heavily affect society with what they do. Maybe not when you come up with a music engine, but we have this understanding in our society, among the general population, that AI is something really remote or dangerous, like a Terminator or something, and we don't understand yet that it really influences us in our everyday life. Maybe not AI, but algorithms. If you live in New York these days, the question of which high school your kids go to is decided by algorithms. The question of whether the police are patrolling your street or not is decided by algorithms. The sentence a judge gives you, or the decision to release you from prison, is supported by algorithms. And I think we have to have an understanding that everybody who is programming those things has a political responsibility. So we need the training on both sides: make people aware of what they are doing, and not only make lawmakers and politicians aware of what modern technology brings to our society.

This is a perfect example. Right now in the US we are passing laws that require algorithms to be used to support judges, but the companies that are selling them at police expos and conferences are signing procurement letters saying that the data is not disclosed and the algorithms are secret. So when you have a court case, when a defendant is trying to ask why she or he can't get probation, they can't get at that stuff in court. And if the judges, or the lawmakers who passed the law to include the stupid risk assessment thing, which they shouldn't have in the first place, they
would have put in the provision that said the data should be open and the source should be open. That's the example I would give of a politician or a lawmaker knowing enough about code or software to be able to say: oh, this should be in the law about procurement.

Yeah, I think there's also a difference in the situation where the state is actively employing AI, because then I would say the primary responsibility is not with the developer; the primary responsibility is with the state: which developer to choose, which kinds of companies to use, and what kind of rules to give them. But not all of our lives are controlled by the state. Of course, especially if we're dealing with companies that are employing AI at a grand scale, and that are building quasi-public spaces like Facebook, for example, they do have a responsibility even in areas where they are not government-mandated. But in order to be able to judge, as a politician, whether using AI makes sense, you do need a certain understanding of it. And I think what is even more problematic at the moment, in the way that politics approaches AI, is that it is used as a way of basically having less work. We are confronted as politicians with a huge array of extremely complex problems, and increasingly the answer of politicians who are dealing with limited resources and limited money is to say: well, let the big companies fix it for us, and we don't exactly care how you remove hate speech from the internet, for example; we just want you to do it. And if you want to keep the algorithms that do it secret, then we will let you, because you're fixing a problem for us. So I think there's also an issue of who actually has the ultimate responsibility

for these things.

But I think there is a broad field where it's not the direct responsibility of a government or a state, but it affects the chances and equity issues of a society. Whether I get credit from one bank might be just some issue between me and the bank, but if I don't get credit anywhere, I'm reduced in my chances and opportunities. And today it's algorithms, maybe even intelligent algorithms, deciding whether I have a right or access to credit, and with that, to an opportunity. So here I see a very broad field where it's not direct government responsibility, but where you can't just look away and say: oh, just leave that to the finance industry, they'll figure out some way.

But it is directly government responsibility, to the extent that regulators report to the government. The credit scorers in the United States are now selling your credit score data to marketing companies. There is a law in the US that says you can't sell personally identifying information together with credit scores, so they sell it by house: they send predatory advertisements to the house, and they argue the house doesn't have privacy. And what's happening is they're now using credit scores to give jobs and give loans, and they're including Facebook data and LinkedIn data in credit scores, so it's making this horrible loop. And you have a regulator, the government, that is supposed to be watching this, and they're letting it happen.

I'm going to make one other point. If you go back in history to the insurance business, when they were trying to calculate premiums and actuarial tables, there was quite a debate in the United States about what is fair. Should poor people be carried by the rich? Should your premium be decided by your specific category
of risk? Should people with pre-existing conditions be allowed in? There was all this discussion about risk, and then the technical statisticians came in and made it very complex. They came up with a single definition of risk that was too hard for the activists and the people to understand, and fairness became a mathematical, technical decision, and all of the democratic discussion disappeared. The same thing is now trying to happen in AI and machine learning, where you have different kinds of fairness: fairness of discrimination, fairness of outcomes, fairness of need, fairness of accuracy. Each community is going to have a different view of what's fair, but what the community of statisticians and machine

learning people are trying to do is come up with a checkbox that says "this is the definition of fair", just like the insurance industry many years ago made its determination of what's fair. And the government doesn't know enough to intervene. So this is another example where the government could or should have an opinion and doesn't have one, and I think the experts are now fighting over this, and that, I think, is an important battle.

I wonder, actually, if we could go back to you, and maybe you could talk a little bit about your experience with legislation that deals with technology regulation: the frustrating sides of it, maybe, and the limitations, but also, you know, I think there are some very positive aspects in your work that we should highlight.

Okay, I will try not to depress you too much. I actually found it very fitting that you started with Kraftwerk, because Kraftwerk symbolizes one of our huge challenges in technology policy, and that is speed. The European Court of Justice is currently dealing with the case of Kraftwerk vs. Moses Pelham. I don't know if anybody still knows who Moses Pelham is, but he was a German hip-hop DJ in the 1990s, and in the 1990s he used a very short sample of a Kraftwerk track. Now, under traditional copyright law it's very clear that if you take somebody else's melody that they have composed and use it as your own, you need permission for that. But what is not at all clear is what happens if you use more modern technology, which in the 1990s was sampling (I don't think we would necessarily think of it as very modern technology today), and you don't actually take somebody else's melody, you just take two seconds of the sound of metal banging against metal and use that in your own work to make something new out of it. Under US copyright law it would probably be relatively
obvious that this is not an infringement. But the point that I want to make is that in 2018, twenty years later, the courts are still fighting over this, and I think that is a big problem: we have not yet come up with the mechanisms to make democratic decisions, at a reasonable speed, about how to deal with new technologies that by now are not new at all anymore. I think that is a huge problem, because in the meantime, the cultural use of technology is just going to create facts, for better or for worse. Whether we now decide that it's morally wrong to take somebody else's metal banging on pots or not, it will not change what we actually consider fair. The law will invariably be far behind the technological development. But on the positive side, I think you can see that, at least in European politics, in the European Parliament,

there is a certain responsiveness. If a big community, an online community like YouTubers for example, actually speaks up with their own stories and tells politicians, who may not have first-hand experience with creating on the internet, that a certain proposal would threaten our cultural surroundings, that can have an effect. We just had a vote about this in the European Parliament in July, where a majority actually voted down what the supposed experts in the committee had decided, because suddenly an entire generation of 17-year-olds was writing to their representatives and saying: this is completely out of touch with how we use technology today. So I do think that technology is, on the one hand, under-governed, because we are too slow to react to the developments, but at the same time there's a huge opportunity in new ways of communicating with your politicians. I found the proposal about using blockchain to track politicians' promises very interesting in that sense. There are probably flaws to it, because it assumes that every politician can make a decision by themselves and that they're an absolute ruler, but at the same time I think it's grasping this potential of using the internet, using technology, to create these different channels.

So maybe... I'm curious whether you've seen other spaces, or whether you think we need more spaces for this kind of conversation.

We definitely need more spaces for that kind of conversation, because first of all, if the general public isn't aware yet of what's going on, there is no way to discuss it: you don't know it's happening. Maybe an example: if you build a new building in Germany, you can put down the proposal in some hidden room in some public building, and if, as a neighbor, you don't know that something is happening, you
wouldn't look at the proposal and maybe intervene. In Switzerland, if you want to build a building, you have to put up a frame of the building, made out of metal pipes, at the place where it's supposed to be, and everybody walking by sees that something is happening there; people become aware. So the first step is general awareness, getting people involved: there's something happening which affects everyday citizens' lives, and everybody should be involved. The second issue: for a conversation you need more than one partner. There is a discussion in the scientific community, there is a discussion in the political community, there is definitely a lot going on in industry, but there are not enough platforms where those different people meet. So we need awareness and we need platforms for a conversation, in order to get everybody involved, and not only informed. Politics, or industry, often acts as if it's enough to just keep people informed of what's going on; I think here we really need participation and involvement.

Actually, this reminds me of something I saw in one of the presentations, the MyxAI one. There was this gasp in the audience when they saw what this new company, or this new fake company, is doing. So I think we also need art. I think there are certain things here that are actually easier to communicate through art, or through performances,

or through experiences, and maybe not necessarily just by having the facts written down and provided to everyone in the newspaper or a public service announcement. So I wonder, maybe back to you, Joi: at the Media Lab we think about art, design, science, and engineering as this wheel...

Yeah, I think that art is very good at posing these provocative theories that evoke an emotional reaction, but then we should go in and explore. Because when I was looking at this MyxAI app, what I was thinking was: okay, this may be a fake commercial product that we don't want on the market, but at the same time it's also a very real experience that a lot of women already have today. This is exactly how stalkers use social media, and they may not have artificial intelligence to support them doing it, but it's nevertheless a reality for a lot of women. So if we think we wouldn't want that to be a business model, we should also think about how we make sure that the internet actually makes sense and works for everybody who is already using it today.

Yeah. When I was on the Prix Ars Electronica jury for the internet category, I think one of the key things was that artists will use the tools in ways that they're not intended; they are creative. So the app, I think, is on the edge of art for me. Really interesting art is when they break the tools, and that does two things: I think it advances the tool, and it also allows, like you said, Philip, looking at it from a different direction. So there's also a very positive piece. There's critical design, which is to criticize and show the negative, but at the Media Lab we think about things like photography
or computer graphics and games, where a lot of the artists were also involved in the technology, so the form was able to evolve in a very interesting way. But in forms where you have the technology and the artists separated, like television or newspapers, the form kind of got stuck and wasn't able to adapt to the technology or the social system. So bringing art and technology together does those two things: it provides the societal context back to the engineers, but it also takes the engineering and moves it in a creative direction.

But I think it's also the job of art, and of science and education, to demystify the black box of algorithms. Because as long as the general population thinks it's just something we cannot understand, completely closed, completely complex, you

know, it's a billion lines of code, how should I ever understand it? It needs science, education, and art to translate the complexity into something either visual or understandable or simple. So I see here more than just breaking the rules, which is part of the game; it's really this translation issue: turning it into something everybody can understand, so that they understand that it affects them. And MyxAI, or however this wonderful company was called, is exactly that: it confronts me with something I put out digitally in conversations and plays it back to me in a super visual way, and I understand that this is affecting me and the way people can understand me or think about me, because it's out there digitally on the web.

Yeah, I completely agree with the point that it's very important to demystify, because that will actually lead to better policies. But one big challenge that I face there as a lawmaker is that there is an entire industry of lobbyists whose primary job it is to confuse us as politicians about what algorithms actually do. I have sat in seminars designed for politicians where certain academic publishers, for example, tried to explain to me that if we allow scientists to data-mine academic articles, then Trump wins the elections. And I'm not making this up; this was a seminar I participated in. I think this shows that we have to sift through so much misinformation from people whose job it is to basically confuse us and lead us wrong.

So one thing that I found encouraging (I think we've been focusing more on the dystopian sides) is that many of these projects explored ways to demystify technology by allowing more people to use those technologies. I remember a conversation with Eric from LEGO before the workshop. He said, you know, 20 years ago,
the kids started going to these maker spaces and learning about electronics and about sensors and motors, and then the next time they go to the supermarket and the door opens

magically. You know, they look at the door and they're like: oh, I know exactly how this works. It gives this incredible sense of ownership, and power, in a way, over the world, through understanding how to use these materials and these technologies. It's much harder to do with AI and some of the more modern technologies, but some of the projects, I think, were actually pushing in that direction: how do we make them tinkerable? You're not going to get to the highest sophistication necessarily, but that's not needed. Go ahead.

I don't actually think they're harder. Or rather, I think they are a little bit harder right now because we don't have the interfaces yet. The AI one actually comes from work by Stefania at MIT, who is trying to figure out how to teach young children about AI using the bricks and using robots, and the concepts of AI aren't that hard. In the real world, when you cook, you don't actually understand the chemistry of what's going on, but you have a cookbook and you think you know how to cook, and you can actually walk into a restaurant and say: I know how to make that omelet. It gives you the power. So I think that a lot of the emerging technologies are just complex because they haven't yet been put into nice Lego bricks. The experts know how to make the bricks, but once you have the bricks and you understand their function, it isn't that hard to learn. And getting back to your point: is the design meant to express what the thing does, or is it designed to confuse? To me there's a bit of a difference between art and design. I think design is about how to make something more suitable for use in society. Art is actually a little bit different; it doesn't really care about whether
It's useful for you it's art is more of a perception and it's more of a provocation so but, but anyway that's a technicality, I so I think that once the technologies, get better designed, they'll, be easier to understand, intuitively. Just. Perhaps. As an example of this, how. This can be translated kind, of also into a policy field so a few years after 9/11 there was this huge, security. Craze. In. Politics, that we still haven't gotten completely. Over and in, the kind of youth political group, that I was active in at the time we, decided to organize a security, conference which, is what all the political parties do you know it's kind of the, hot, topic of the day and at, the security, conference we thought about okay what is actually most likely to kill me as a 16, year old girl in Germany and then we talked about everything. From suicide. To traffic. Issues to drugs, but. We also did things like for example isolating. DNA from, a banana and like a half-hour, workshop, because. I think it's it's extremely, useful to. Kind. Of well. Use transparency. And understanding, as a means of countering fear, and I think at the moment there's a lot of fear in the AI debate, which may also, be part of the reasons why politicians.

have a tendency to kind of push the topic away to the companies, like Mark Zuckerberg coming to the European Parliament and telling us, hey, I will solve all of your problems, the AI will get rid of fake accounts and it will get rid of illegal behavior on the internet, practically. He has an interest in saying that, because his company is building the AI. So if we are extremely afraid of these problems that he is promising to solve for us, then we are of course very susceptible to these kinds of promises, and we don't have the tools to question whether this is actually the way forward.

Before we continue, I do want to invite all of you to participate. So if there are questions, there is a hand, I think. How does it work? The microphones are here in the middle, so I would ask you to please get up, go to the microphone, and maybe just say your name and who you are.

Hi, my name is Juan. I studied physics here at the TU Berlin and I've been working as a programmer for the last five years, more or less, in this field. And I don't want to sound like a Luddite or something, I think technological progress is very important, but I've become slowly disenchanted in different aspects. And I think the example that Joi just gave, that judges are being supported by algorithms that are not open, that just completely defeats everything that is fundamental to a justice system. So I want to hear your three opinions on this: isn't it necessary that the tech community maybe finally just acknowledges some shortcomings, what you can do and what you can't do? For example, Joi said that there's this initiative working with the Kennedy School of Government and with Harvard and MIT, but that sounds to me extremely elitist. There has been an ongoing movement in public education of just dismantling this whole principle that we should learn how to learn and what to learn, and now people just learn whatever lets them be embedded into an economy. And just this acknowledging that there's no amount of smart contracts or blockchain or AI that will fix the problem. That, for example, there should be massive public outrage that even these algorithms that are practically helping these decisions by judges, that this code is not disclosed, and no technology is going to change this.

Shouldn't there also be a massive public outrage if we see that humans discriminate, in for example court decisions? You know, I just wish we would take a more positive look. I mean, clearly some of the software products proved not to fulfill all the promises we had hoped for, but they both can be less discriminatory and can take away repetitive tasks from judges, giving them more time for the really important tasks. I mean, if a judge has five or ten seconds on average to decide on probation or not, you know, I don't want to be there. I want a judge to have more time for my case, maybe because something algorithmic helps him do very repetitive tasks.

And beforehand we talked about the finance industry and the problems with credit ratings. Yes, there are problems with credit ratings, but on the other hand, in the traditional banking way, something like fifty or sixty million Americans are invisible to traditional credit scoring, and only when you use algorithmic systems that access more data do those people become visible and get a chance for credit. So I think we really have to see both: the opportunities, the chances, with a positive outlook on what could be possible, and obviously the things that go wrong if you don't have the right rules, the right regulations, and people misusing the power of algorithms and AI. I just wish the discussion wouldn't only go in this dystopian way; we tend to take the discussion there sometimes too quickly.

I'd like to add something. Just hold on one second, I don't want to make it a back-and-forth; Yulia wants to get in, maybe she'll broaden the conversation, give us a second.

At the risk of making your point for you: I think there is a huge difference between criticizing AI being used at all, and I don't think that was the point, and AI being used in an intransparent way. I completely believe that there is a place for AI to be used and that it can be a huge benefit. But I think the reason perhaps why you are outraged and a lot of people out there are not is that they don't believe that, even if the data were disclosed or even if the AI were transparent, they would have the faculties to actually interpret it. So I think in the open-source community it kind of works, because it has developed a community where everybody has a basic understanding of programming and can look at the code and understand something. But there is also kind of this broader layer of the community, of open-software users, who may not be programmers themselves but who trust that if the code is open, then hopefully somebody else will look at it for me. And then we had things like, I don't know, Heartbleed, where it turned out that just because something is open doesn't mean that a lot of people had actually looked at it. So I completely agree with you that it is outrageous that we are using these intransparent, closed systems, but I think people will only be outraged about it if they feel empowered to actually use the information that we're asking to be disclosed.

I'll actually be a little bit more extreme: I think there are cases where you shouldn't use technology. So I agree that we should make the system more efficient, but for instance, when Mitch Kapor was talking about Oakland, he said that in the jail they were using an old FileMaker Pro database, an Access database, and an Excel spreadsheet, and they were doing everything by hand and it took two days to process; fix that first. And to me, you know, electronic voting machines are a bad idea. I think they just are a bad idea; you shouldn't have them. And I think it's possible that certain categories of risk assessments just won't be fair, because the underlying data is unfair. When you look at all of the American systems, you have data on poor people but you don't have data on rich people, and there are some systemic biases in social

systems. And depending on where you want to go, whether you're just trying to keep business as usual and punish poor people and move power to the rich people, then you might want that. But I think, to your point, it kind of means that you have to look at the whole system, including the humans, including everything, and then I would deploy algorithms much more strategically, based on what the effects are in the whole system. Right now we're trying to make each subsystem more efficient: better accuracy in policing, better accuracy in risk assessment, better accuracy in parole, but that isn't making the system more fair. And so I'm concerned that we are looking at trying to make the judge's job more efficient, when we should try to make the judge's job more effective, and I think that's slightly different.

But also making it more consistent. I'm only making the point that humans are not terribly consistent, and judges aren't very consistent either, but it's very hard to get at the inconsistency and discriminatory behavior of a single person. Once I have an algorithm, designed and transparent, and there I'm completely with you, then at least I can, in a democratic process, openly discuss fairness, consistency, and the parameters.

But I want to talk at the next layer, because the current criminal justice system is biased against poor people and biased for rich people. So why can't we use data and machines to fix the whole criminal justice system, rather than just trying to eliminate a little bit of bias from the judges, who are already terribly biased because they're publicly elected in the United States? So to me, I want to set the bar higher. I want to say: can we use data and machines to understand a causal system and a theory of change of society, and use the AI not to look at the defendants, the people, but to look at the politicians, to look at the judges, and say: let's look at the history of the judges across this jurisdiction. These judges that tended to be conservative caused crime to increase in their neighborhoods; these judges who tended to let people go for drug crimes decreased crime in their communities; let's try to understand why, rather than saying, can we make the judge's job more efficient. That's kind of my target. It's hard, it requires a lot of political will, but I think right now we're basically just trying to automate existing functions in a democratic system that isn't really very good yet.

And I guess we all agree: if we now digitize the unfairness of the current analog system, we do the worst job of all.

I'm hesitant to allow another question, but if there is another question... yeah, go ahead, the microphone is in the middle there. No, I was joking, please ask more questions.

So I wanted to ask about this: when we talk about technology, we talk about progress and the technology moving forward, and a lot of the technological solutions that we come up with turn out not to be a beneficial change, like the example with the judges, and then we kind of leave it to the legislative system to solve for us. So we make a big mess of technology and then we say, okay, let's have the politicians regulate it. And I wanted to ask if you know about any initiatives that try to use technology to go backward and kind of change the direction, I don't know.

Well, I don't think that, just because there are technological developments that are bad, there isn't progress, or that it would be worth it to just go back. For example, if you look at the car, I think it took, I don't know, thirty years before they had seatbelts. So I think we are still in the kindergarten phase of digital technology

and its regulation. I mean, my party, the Pirate Party, was sometimes accused of just wanting a Wild West on the Internet, which is actually not what we're about at all. So, for example, net neutrality was something that perhaps you didn't need as a law in the early days, because it was kind of built into the technology, but as companies were trying to exploit it and change the architecture, you needed the legislature to step in. So maybe that's an example of what you're talking about: basically, the technology, the code development, goes in a kind of bad direction, and then you use the law to go back to something that worked. So maybe net neutrality is an example of doing that, but I don't know if there's the same kind of effort among technologists themselves.

Electronic voting.

One more quick question? There is another; again, the mic is in the middle.

So you've mentioned design as distinct from art, but I guess one way to talk about design is as a system of methodologies and different ways of thinking and working together. And even when we have these experts, we still need specialists; it's not like everybody can be a generalist. So I guess my question is: do you see a case for creating more of a role for people and systems as a kind of connective tissue, since the other ways that we used to do this seem to be failing at the moment?

I think one of the biggest problems that we have right now is the silos of disciplines. There's a lot of work talking about it, but basically you have very tribal systems, whether you're talking about academia or whether you're talking about business, and these systems interact in a very formal, clunky way. And the federal funding, so government funding, usually goes along these paths, tenure, schools, and if you're in between these spaces, it's very difficult to get funding, it's very difficult to get a job if there's no job description, it's very difficult to get a degree, it's very difficult to get anything. And so what we're doing at the Media Lab is very much trying to explore how you create a legitimate job or a legitimate discipline, a sort of art-and-engineering, or something like that. But actually, what I think we need to do is to be able to explore the

spaces that aren't even just between two disciplines. And I think it starts with education. I think you want project-based learning that isn't constrained by classes and disciplines; like in the hackathon, you learn what you need to learn in order to get things done. I think you can have specialists, but specialization should really be about: you have a passion about something, you do it, and you get to be the best in the world at this peculiar, weird thing, and then the world has a way to find you, because you have a YouTube video or you have a website. It should be extremely diverse specialists, rather than a whole bunch of specialists that have a guild and all know the certified way of, you know, turning bolts. So to me, the internet actually allows us to develop, learn, and connect this broad array of specialists in these completely non-existent fields, and I think that's the opportunity, but the universities and the schools and the job descriptions, the job market, all of those structures get in our way.

Perhaps I can just add one thing to this. I think that the internet, and this having access to all these specialists and experts who, you know, explain to you how to build an airplane on YouTube, makes it easier to become reasonably expert in a number of fields, whereas fifty years ago you would probably have had to study ten times as long to get to the same level of expertise, because you had to basically learn everything from scratch and you had less access to the mistakes that other people have made. So I think there is definitely a space for specialists, but it's probably a problem if you get to a place where they cannot communicate with each other anymore, because they are in a community of only experts of the same kind; that can also be kind of limiting.

But maybe just adding one different perspective: yes, we need the experts, but we also need, in the broad general public, a basic algorithmic understanding. I think we just have to teach young kids a basic understanding of how algorithms work. Not everybody needs to program them, but you need to understand how they influence you, or how other people build something around you using technology. And you get that best, and there I'm completely with Joi, if you have project-based learning in schools and you build applications, you engineer them, and you understand how an algorithm functions.

Yeah, I think there is also kind of a problem in our curricula, that we spend a lot of time teaching children things that computers are already better at, so really kind of analytical problems. I think we should be focusing education much more on the things that computers are bad at, because then we can really have an added value from technology and probably achieve more and have a positive outcome.

That's actually maybe a really good moment to slowly wrap up, and fortunately I can actually build on the conversation we've had here. One thing: we had fifty or so people in this church for a week who were working on technology projects, and they spoke 35 or 38 languages, but they were experts in many, many more things, and it was really fascinating to throw them together into these groups where they would discover things about themselves, but also things that they could share with other people. And I think it was exactly this idea of learning where you need to bring those different perspectives together and have people build things together; that's when they really understand the other perspective, and also more about the technology.

I want to end on maybe one positive example of AI that I saw in the week, one that didn't get built, to my disappointment. On Monday, the workshop had this first day that was field trips: all of the tracks went out into Berlin and they explored companies, they visited spaces, they went to museums, and they met interesting people. I was going along on one of these field trips, again with Erika, sorry, Eric, for constantly referencing you, and we were reflecting on how refreshing it was that we could see the sense of unlimited possibility in the young people who were part of this workshop. Everyone was talking about, oh, we could do a start-up in this area, or we could be working here at the university, or we could be starting a nonprofit; there were all these possibilities.

One of the teams had considered building an AI that would reconnect you with your younger self, where you could have a conversation with yourself when you were maybe 18 years old, and your AI could say, hey, Phillip, remember that time when we went on holiday in Italy and you fell into the water, or you stole the sailboat, or you met this wonderful person and had an amazing conversation. Your AI would know these things about you and could kind of have a conversation with you. For me that was a beautiful vision: an AI that is transparent, that I control, that adds something to my reality that's actually helpful to me. It did give me that sense of wonder and of possibility, and Eric and I both came to a moment in our conversation where we were like: we still have these possibilities. So for me, that's what we hope to get out of this workshop: a sense of possibility, and a group of friends that we can do these things together with.

I want to thank everyone, I want to thank my panelists, and I want to thank all of the participants and all of you for coming, for really a fantastic week together. So thank you, and there'll be drinks and food.

2018-08-19 18:54

