Neil Davison: "New Technologies of Warfare: A Humanitarian Approach" | Talks at Google

Neil Davison:


Thanks, Birkin, for the invitation. It came through our communications department back in April, at a time of intense debate within and outside Google about the military application of AI. Since then there have been some significant developments, both from your side, with Google's AI principles, and at the international level, which I'll touch on: governments increasingly recognize the need to retain human control over weapons and over decisions on the use of force. So it's a pleasure to be here. I think it's actually a critical time for discussion of this issue. Militaries are heavily invested in the development of robotic and digital technologies, including the software that underpins both, and increasingly, of course, AI and machine learning. I think it's also critical that technology developers and the tech industry are involved in the conversations going on at both the national and the international level.

Now, I want to be totally clear at the outset: I'm not going to solve any of your personal or collective ethical dilemmas, and I would never claim or try to. But what I can do is explain a little of how the ICRC, the International Committee of the Red Cross, approaches the issues of new technologies of warfare.

Before I get into that, I'm going to say a bit more about the bigger picture, so I'll talk more broadly than AI and ethics. I'll talk about conflict, about the role of the ICRC, and about new technologies, and then I'll get into the implications, because I think it's important to understand the bigger picture.

So first of all, quickly: what is the International Committee of the Red Cross? For those who don't know, we're a neutral, independent and impartial humanitarian organization. We work to assist and protect victims of armed conflict and other situations of violence around the world. By victims I mean predominantly civilians, but also combatants, fighters who are injured or detained in armed conflict. We have the status of an international organization, but we have no government membership or participation, so we're independent.

Our mandate comes from the Geneva Conventions. International humanitarian law, the rules of war, is essentially the basis for all our work, and I'll get more into that later.

Quickly, this is where we work: wherever there's armed conflict, you'll find us. We're in about 87 countries, with over fourteen and a half thousand staff, predominantly, as I say, where there's armed conflict, but we're also in capitals around the world for humanitarian diplomacy and preventive work, working with governments and others. There should be a small red dot on here somewhere out in Silicon Valley: as of this year there's someone working out of our Washington office to engage with tech companies in Silicon Valley.

This is a funding slide, but the point is not the funding; it's just to illustrate where our major field operations are. These are our ten biggest operations this year, and as you can see they're where the major conflicts are taking place, or in major post-conflict countries. The funding slide has disappeared, but essentially the largest portion of that graph is governments, which provide most of our funding. The second largest slice is the European Commission, and we're increasingly looking to improve funding from private donors.

So what do we do? Very briefly, one of our major areas of work is protection. We work to protect those affected by armed conflict, whether by collecting information on how conflicts are conducted and raising our concerns with the authorities and the parties to the conflict, by visiting detainees, or by helping to restore family links between those who have been separated during a conflict.

There's also assistance. This covers emergency humanitarian assistance, but also, I would say, more long-term capacity building for essential services, whether that's water and habitation, medical care and medical services, or things like risk awareness. The picture on the right shows risk awareness work with the civilian population about the dangers of unexploded munitions.

The other major aspect of our work is prevention, and this really is about long-term dialogue with weapon bearers, those who carry weapons, militaries and others, with political and military authorities, and with civil society. It's long-term work, as I say, to promote and uphold the rules of war, and to carry out humanitarian diplomacy in support of our humanitarian work. The picture on the right was taken from our desk a few weeks ago in The Hague, at the meeting of the Chemical Weapons Convention.

So rather than try to explain the rules of war to you in a rather roundabout fashion, I'm going to use this short video.

[Video narration]

Since the beginning, humans have resorted to violence as a way to settle disagreements. Yet through the ages, people from around the world have tried to limit the brutality of war. It was this humanitarian spirit that led to the first Geneva Convention of 1864, and to the birth of modern international

humanitarian law. Setting the basic limits on how wars can be fought, these universal laws of war protect those not fighting, as well as those no longer able to. To do this, a distinction must always be made between who or what may be attacked, and who or what must be spared and protected. Most importantly, civilians can never be targeted; to do so is a war crime.

"When they drove into our village, they shouted that they were going to kill everyone. I was so scared, I ran to hide in a bush. I heard my mother screaming. I thought I would never see her again."

Every possible care must be taken to avoid harming civilians or destroying things essential for their survival. They have a right to receive the help they need.

"The conditions prisoners lived in never used to bother me. People like him were the reason my brother was dead. He was the enemy, and was nothing to me. But then I realized that behind bars he was out of action, and no longer a threat to me and my family."

The laws of war prohibit torture and other ill treatment of detainees, whatever their past. They must be given food and water, and allowed to communicate with loved ones. This preserves their dignity and keeps them alive. Medical workers save lives, sometimes in the most dangerous conditions.

"Fighters from both sides were wounded in a deadly battle. We were taking them to the nearest hospital. At the checkpoint, a soldier threatened us: treat his men only. We were running out of time, and I was afraid that now all of them were going to die."

Medical workers must always be allowed to do their job, and the Red Cross or Red Crescent must not be attacked. The sick or wounded have a right to be cared for, regardless of whose side they're on. Advances in weapons technology have meant that the rules of war have also had to adapt. Because some weapons and methods of warfare don't distinguish between fighters and civilians, limits on their use have been agreed. In the future, wars may be fought with fully autonomous robots. But will such robots ever have the ability to distinguish between a military target and someone who must never be attacked? No matter how sophisticated weapons become, it is essential that they are in line with the rules of war. International humanitarian law is all about making choices that preserve a minimum of human dignity in times of war, and make sure that living together again is possible once the last bullet has been shot.

Neil Davison:

So yes, it's all about limits. International humanitarian law applies in war. It doesn't say anything about whether or not a war starts, or whether it ends; it's essentially about the conduct of war, and it aims to limit the consequences.

As the video mentioned, weapons have particular consequences, and the ICRC has worked on weapons issues for over 150 years. Our two core areas of work here are to look at the humanitarian consequences, what we see in armed conflict, and, interlinked with that, at compliance with the rules of war, which includes considering whether new rules might be needed. In the past we've called for new rules banning chemical weapons, banning blinding laser weapons, banning landmines, banning cluster munitions, and for rules to regulate the arms trade.

Even last year, over a hundred and twenty governments agreed a treaty to ban nuclear weapons. And our contribution to all of this is based on either the real or the perceived potential humanitarian consequences.

In today's conflicts, this is really what kills most civilians: heavy explosive weapons used in towns and cities, and the arms trade, the transfer of weapons to those who aren't using them in compliance with the law. So this is something we're also working on, of course. But we also try to look ahead at how new technologies are changing conflict, at what the consequences in humanitarian terms may be, and at what the legal issues might be, looking at compatibility with existing law. This is a meeting we held a few weeks ago on the humanitarian consequences of cyber warfare, looking at potential risks for healthcare, for essential civilian services, and even for the core infrastructure of the internet.

I would say our focus over the last 15 years has been predominantly, as you would expect, on increasingly autonomous robotic systems, on cyber warfare, and now increasingly on the software that underpins both.

I should say we're not anti-technology; that should be clear. Military technology, even in weapon systems, can help with compliance with the rules. A precision-guided bomb that uses a laser or a GPS signal to land precisely on a legitimate target can offer better compliance with the requirement to distinguish between civilians, who can't be targeted, and legitimate military targets. But that technology is not in itself inherently good for civilians, because if that bomb lands precisely on a hospital, or precisely on your house, it doesn't matter that the technology is precise. It's also about the way it's used. This is obvious, really, but it's sometimes lost, and it's an important point: you have to look at the weapons technology and also at the way it's used.

Now, this slide is from south Lebanon in 2007, and it shows an orchard contaminated with cluster munitions. Cluster munitions had been around for a long time before they were prohibited in 2008. They were designed for things like destroying a military runway: lots of submunitions spread over a wide area. But that's not how they were always used. If they're deployed around villages where people are living and working, and they land in trees, then, as in this case, they hang down like Christmas decorations; as you can see, they have a little tether. And when someone comes into that orchard, one of those will take your arm off or kill you. So the point is that you may claim reliability, effectiveness, or safety for a certain technology, but it can't be an abstract claim: it has to hold in the real world, in the context in which the technology is used. That's something we keep in mind as we look at new technologies.

Of course, everyone has in their mind an idea of what an autonomous weapon is. My favorite nightmare at the moment, if I can put it that way, is from Black Mirror, the episode "Metalhead"; if you haven't seen it, I recommend it. Maybe we'll get there, maybe we won't. But the fact is, autonomous weapons already exist, albeit in a limited form, and they don't look anything like Black Mirror or The Terminator.
The way the ICRC defines them, an autonomous weapon is one that can select and attack targets without human intervention. It's the machine, based on its sensors and then its programming, that is triggered by its environment and self-initiates an attack. So there's a clear distinction from a remote-controlled, directly human-controlled weapon.

Critically, it's not to do with the sophistication of the technology. It doesn't matter whether it has quite simple programming or very advanced AI; what matters is the lack of human involvement in the triggering.

If there were one thing to think about with autonomous weapons, the first thing is unpredictability. By definition, with a machine triggered by its environment, in a complex environment, you have some level of unpredictability about when it's going to fire, what it's going to fire against, and even where it's going to fire. That depends a bit on the nature of the system, but there is already an inherent unpredictability, and I think that's important to remember as we move through this discussion.

So what do we have today? This is a report from the Stockholm International Peace Research Institute from last year. Most autonomous weapons today are quite constrained. They're mostly designed to attack objects, and in fact the vast majority are air defense systems. There are some, and I'll get to them in a minute, with a degree of autonomy that search for targets over a wide area, and there are some anti-personnel systems with a degree of autonomy in targeting, although they don't quite fit the definition of being autonomous yet. It's important to say that, as I said, they're constrained: most of them target objects; they're mostly used in areas where there aren't many civilians; they're mostly constrained in the time for which they operate autonomously; they're mostly constrained in the space in which they operate autonomously; and, critically, they're mostly human-supervised, so the human can often intervene and deactivate the system even before it carries out an attack.

Air defense systems, like I mentioned, are found on ships and on military installations on land. They detect incoming missiles and rockets. Even some of the systems that detect incoming rockets currently give a human soldier the ability to verify: the machine says a rocket is coming. It's very simple; it's not advanced AI. It's trajectory and speed: if the object fits the expected trajectory and speed, if it's coming at you within that envelope, then the system fires. So the operators check it if they have time, and for incoming rockets they do have time; for missiles they don't, those are going too fast. And that, from a military perspective, is one of the drivers for autonomy in general: speed.
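To make concrete how simple this kind of trigger is, here is a minimal illustrative sketch in Python. It is not drawn from any real system; the thresholds, field names and veto window are all invented for illustration. The point it demonstrates is the one above: the "decision" reduces to a trajectory-and-speed check, and human supervision is possible only when there is time for it.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A radar track of an incoming object (illustrative fields only)."""
    speed_mps: float          # measured speed, metres per second
    closing: bool             # is the trajectory converging on the protected asset?
    seconds_to_impact: float  # time remaining on the current trajectory

# Invented thresholds, purely for illustration.
MIN_THREAT_SPEED = 300.0  # below this, treat as a non-threat (e.g. a bird, a light aircraft)
VETO_WINDOW_S = 3.0       # minimum time the human supervisor needs to intervene

def should_engage(track: Track, human_veto: bool) -> bool:
    """Rule-based trigger: no learning, no image recognition.

    Fires only if the object is fast and closing, and the human
    supervisor (when there is time to consult one) has not vetoed.
    """
    is_threat = track.closing and track.speed_mps >= MIN_THREAT_SPEED
    if not is_threat:
        return False
    if track.seconds_to_impact >= VETO_WINDOW_S:
        # There is time for supervision: the human decision stands.
        return not human_veto
    # Too fast for a human to act: this is the speed-driven
    # pressure towards autonomy described above.
    return True

# A fast, closing rocket with 5 seconds to impact is held back by a
# human veto; the same rocket at 1 second is not.
print(should_engage(Track(450.0, True, 5.0), human_veto=True))  # False
print(should_engage(Track(450.0, True, 1.0), human_veto=True))  # True
```

Note that in this sketch it is the time budget, not the sophistication of the software, that determines whether a human stays in the loop.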

Loitering munitions: most of these are remote-controlled so far, in different shapes and sizes, but some of them are autonomous already. There's one that searches for radars, and it can search for up to nine hours, over hundreds of kilometers, for a radar system. It uses a very simple technical signature, the electromagnetic signature of a radar. But you don't know where it's going to land over those nine hours; you don't know when; you don't even know exactly what it's going to hit, or what's next to that radar. It illustrates the issues with autonomy.

Then there are sentry-type systems. This is the only type of deployed system, as far as I'm aware, that is directed specifically at humans; obviously you can have humans inside objects like aircraft, but this targets humans specifically. You find these in a handful of countries, at borders and perimeters. So far, as far as is understood, these systems can identify human targets automatically, using human shapes and heat signatures; again, I don't think it's particularly sophisticated. They then send a signal back to the human operator, who decides whether the weapon can fire. But of course you wouldn't even have to change the software for that to be autonomous, and in fact the manufacturers already say these could be fully autonomous; the users just haven't put them in that mode so far, or have decided to set them up differently. The point is, this is a current issue, not a far-future issue; these systems have been around for at least ten years.

The other thing to remember about autonomy in weapons is that it's not about specific categories of "killer robots". It's a function, a function that could be applied to any weapon system. That's really important to remember when you're thinking about the implications, and about how to address them.

Now, militaries, naturally for them, are investing heavily in robotic systems.

Armed and unarmed, in the air, on land and at sea, of all different shapes and sizes. This is a capture from a video explainer we did with Vox Media in the US. There's an infinite range of types of systems that could emerge. They could be remote-controlled, they could be autonomous; again, it's a function. And obviously, as everyone's well aware, you can buy one of these types of small drone in your local Fnac, the electrical shop. I don't know if you have Fnac in Zurich, probably not; what's your local electrical shop? MediaMarkt? We have that in Geneva too. I was at a conference where a company was saying, well, we could use this, we could put a few grams of explosive on it, and it could be sent off to kill people autonomously. That aside, there's already a lot of investment in small drones, and of course in swarms of drones, and there's a question about where the human role will be in that. With swarms, for example, there's a massive incentive for autonomy, because how do you control 500 small drones at the same time? Remote-controlled small drones are already being used by non-state armed groups in Syria, Iraq and Ukraine.

So the serious question, for all of these systems, is: where is the human role? To what degree will there be autonomy in the targeting? And that's what the ICRC is focused on. Flying, navigating, driving: OK, fine. But the really important point is the targeting, and that's what matters from a legal perspective, and also from an ethical one.

Now, I would never come here to lecture you about software; you know far more about it than I do. But obviously the central component of these future systems is software. That software will either be used to directly initiate a weapon, making it an autonomous weapon system, or it may just be used as a decision aid, a decision-support system for humans who then make certain decisions. I think there are different issues raised by those two cases. They're more acute with software that directly triggers a weapon, but there are also concerns where these systems are being used to inform human decisions.
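That distinction, the same software used either as a decision aid or as a direct trigger, can be made concrete in a few lines. The sketch below is hypothetical (the classifier, names and confidence threshold are invented): the legally and ethically significant difference between the two uses comes down to a single branch, whether a human confirmation sits between the software's output and the weapon's activation.

```python
from typing import Callable

def classify(sensor_frame: bytes) -> tuple[str, float]:
    """Stand-in for an automatic target recognition model.

    Returns a (label, confidence) pair; a real model, and its failure
    modes, is exactly what the talk says must be interrogated.
    """
    return ("unknown", 0.0)  # placeholder output

def process_frame(sensor_frame: bytes,
                  human_confirms: Callable[[str, float], bool],
                  autonomous_mode: bool) -> bool:
    """Return True if the weapon would be triggered on this frame."""
    label, confidence = classify(sensor_frame)
    if label != "target" or confidence < 0.9:
        return False
    if autonomous_mode:
        # Autonomous weapon: the software output directly initiates the attack.
        return True
    # Decision aid: the same output only informs a human, who decides.
    return human_confirms(label, confidence)
```

As the sentry-system example earlier suggests, moving between these two modes can be a matter of configuration rather than of new software.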

Known In, the military is automatic, target recognition and this is just an example it. Could have been pulled from anywhere about. Efforts. To harness advances, in AI machine, learning in image recognition in facial recognition in, behavior, recognition, in pattern recognition for. Targeting. Like. I say it could be directly to trigger a weapon or it could be to as. A decision, support aid for for. Humans could. Be to identify object. People patterns. Could be even to predict. As. I'm sure you could tell me better than than. I could in terms of software capability. Nowadays. So. There's increasing talk in a, conflict situation about. Algorithmic. Warfare, I think. That's a fair. Fair. Description so. The. Important point I want to make is that the. Big picture is this is all about decisions. It's. About decisions, on the use of force decisions, to kill injure and destroy it's about the relationship, between humans. And machines in those decisions and, yes. For. Us the most, acute decisions, to look at now are decisions, on, whether. Someone's killed or whether the building is destroyed but. Of course these could have much wider implications. And other decisions, you, know I'm conflict, arrest, and detention, which. Type of military operation, is carried out even, Nuclear, Posture strategic. Decisions. And. There. Are parallels obviously with the rest of society what, discussions. In other areas, in transport. Medicine. Finance. Criminal. Justice perhaps one of the best examples, where, decisions. Affecting. Human lives are already being. Influenced, by. Algorithmic. Data-driven, systems. But. Decisions, which have this. Kind of consequence. So. I. See. I see we've. Been trying to get a handle on how the technology, is developing in order to inform a kind of legal and ethical assessment. And. Like I said one of our concerns in general about autonomy, is unpredictability, and then, we want to learn more about AI. And machine learning and. One. Thing that at. Least to. Us not really. Be interested to discuss this after, is we see a potential, problem of inherent, unpredictability. With. Machine learning algorithms, a lack. Of transparency, in, how they function. Maybe. A knowledge of the input and the output but, not what happens in the middle the. Questions of bias. Questions. Of safety. Unknown. Failures, and. Am. I critically, changing. Functioning, over time imagine, a weapon, system. Before. You introduce a weapon system in armed conflict you have to decide. Whether it you have to test it have to decide whether in the circumstances. You use it it's gonna be, operate. Within the law well if it changes its functioning, over time then you can forget it because you. Know you can't assess that. But. There you know there are there. Are these technical, questions. That that. Are. Really pressing now and. Urgent. The. ICRC's, approach I mean so what to do about all of this, our. Approach is to say we, need to keep human. Control, we. Need to complete you man control over weapon systems and over decisions to use force this, is a piece, by our president, earlier this year when governments, were meeting in Geneva to. Discuss autonomous, weapons, and. It's not an easy. Question, actually. What, is, the. Required level of human control. But. It's a question that needs to be answered both from a legal and ethical perspective. Because. The loss of control. Has. Really, serious consequences, for civilians, for aid workers. For.

So what to do about all of this? The ICRC's approach is to say we need to keep human control: human control over weapon systems and over decisions to use force. This is a piece by our president earlier this year, when governments were meeting in Geneva to discuss autonomous weapons. And it's not an easy question, actually: what is the required level of human control? But it's a question that needs to be answered, from both a legal and an ethical perspective, because the loss of control has really serious consequences for civilians, for aid workers, and for fighters and soldiers as well. An example my colleague in the field mentions whenever I talk about this issue: sentry weapons. How would I negotiate access at a border or a checkpoint with a sentry weapon?

Like I say, there are important legal and ethical dimensions to this. On the legal dimension, there's been a serious misconception over many years: this idea of machines applying the law, somehow inanimate objects with legal agency. I don't know where it came from. At least from the ICRC's perspective, we've been crystal clear that the rules of war apply to humans and are applied by humans. Machines may carry out functions with different degrees of automation, but ultimately human judgments are required to comply with those laws, and that means the law already limits the degree of autonomy that's acceptable.

To explain this a bit more: when human soldiers, fighters, are carrying out attacks, they need to distinguish between civilians and combatants in that specific circumstance, not in general. They need to judge whether the risk their attack causes to civilians is proportionate: they may attack a legitimate military target that creates risk for civilians, and they must judge whether that is proportionate, there and then, not in general. They also need to take precautions, so if the situation changes over time, they may need to cancel or stop the attack. These judgments are contextual; humans need to be there. So there needs to be human involvement, and that means limits on autonomy from a legal perspective.

But of course there's also the ethical issue, and I know this is something you've been grappling with, probably in a different sense. Again, it cuts across society: the role of humans, and their relationship with machines, in decisions that affect people's lives, and of course these are the most significant types of decisions. Many governments, civil society organizations and much of the public are adamant that we cannot delegate decisions to kill to machines. But what does that mean? We need to work out what it means. For us, it means you need sufficient human intent to link the intention, in the specific context of an attack, to the consequences. This is not a generalized decision to be made years in advance; it's made at that point in time. There is a specific human role in that decision about whether people are attacked or buildings are destroyed.

There's also an issue about human agency. The ICRC's current strategy talks not only about human control over decisions but about human agency, and I think this gets at the idea of human intention. And there's also the question of human dignity: what does it mean, what does it require, in terms of human involvement in those decisions, to uphold human dignity? People can fall in different places on this, depending on their ethical perspective: are you purely looking at consequences, or are you also looking at the process? Are you looking just at whether someone was killed, or also at how they were killed, and why? This is an important point about human dignity.

So what's our role in all of this? This is a picture from the first meeting at the United Nations in Geneva, in 2014, and I've had the pleasure, sometimes the not-so-pleasurable experience, of being there for every one of them. In a multilateral sense, especially in the current sort of world we're living in, there's been some progress, but governments are completely divided, both on the scope of the problem of autonomous weapons and on what to do about it. Actually, the majority of states now want to see new law, which would either specify a requirement for human control or define a category of autonomous weapons to prohibit.

As you might have gathered from this presentation, if you're going to have any kind of regulatory approach, it's going to have to be built around the human control element, because of what I mentioned about autonomy being a function, not a specific category of weapon.

Other governments want a kind of middle approach, where they'd agree some politically binding, not legally binding, principles: we'd like to have human control, and these are some of the key elements. Other governments, some major military powers among them, have said no: we're satisfied with the existing rules, we just need to ensure they're implemented.

That's on what to do about it. Now, one way to clarify why some kind of international limits are needed, and the ICRC has for several years been calling for international limits to be agreed, without saying whether they should be new law, politically binding or otherwise, is this: when you ask different governments, different militaries or different people what sufficient human control is, they give you very different answers. One will say it means direct remote control in the specific circumstance, as with many existing armed drones. At the other end of the spectrum, some will say: at some point in time I programmed this weapon system, therefore in all future circumstances it's under my control. For me, that shows there's a need to look at what human control actually means in practice. As I say, we already have some weapon systems, used lawfully and without serious ethical concern, that aren't under direct remote control, so perhaps the answer isn't at the first end; but I'm pretty doubtful about the other end as well. So there needs to be work to determine what human control means in practice.

This is where we've been pressing governments repeatedly, most recently in November, to actually answer this substantive question, because whatever option you choose, new law, politically binding principles, whatever, you need to work out this question: what does it mean in practice? All governments have now agreed on the importance of the human element, of human responsibility, but what is required for compliance with the law, and what is required for acceptability against our values? This is a central question that needs to be answered.

Now, what's your role, as technologists working in the tech industry? Far be it from me to suggest what your role is; of course, that's up to you. But I would say that the engagement of technologists, technology developers and the tech industry is critical in these discussions. Absolutely critical. And it's not just on the technical questions, though it is important on those. If someone says to me, this weapon system, this type of technology, this AI development is going to save civilians, is going to protect lives, that needs to be interrogated. Of course it's on the developer who makes that claim to explain how the technology works, but there also needs to be some independent, critical assessment.
An assessment of how the technology actually works, not just of its capabilities. I know you're all problem solvers with new technologies, but the limitations matter too; actually, in some senses, I'm more interested in the limitations than in the capabilities. Again, not because we're anti-technology, but because it's important to have a realistic assessment, because this is not just theoretical: it is already being tested and used, so we need to know what the limits of the technology are. It's not "well, in 20 years we may reach artificial general intelligence"; it's "we are using this machine learning in this system now, what are the limits?"

I think it's been really encouraging, I would say, compared to other areas I've worked on, particularly biotechnology and the risks of biological weapons, where it's actually been quite difficult to get biologists engaged on the dual-use risks of their technologies and their research. They'd say, well, it's not relevant, we're not developing biological weapons. And that's true: they're

developing vaccines, new medicines, or whatever. But there does seem to be a difference in the tech industry, I would say, at least from my perspective as an outsider: technologists have been very involved in the discussion. There have been, of course, things like Google's AI principles. There's been Microsoft's call for a digital peace in cyberspace, and their call for regulation of facial recognition; I think someone mentioned to me earlier that people at Google have also been talking about concerns over facial recognition. There have also been open letters by scientists and by industry CEOs in robotics and AI about autonomous weapons, raising issues and concerns.

So I would suggest, respectfully, that it's important that you stay involved, and that you get more involved. That could mean working with organizations like ourselves to better understand the technologies; it could mean working with NGOs; it could be more at a company or industry level, in terms of deciding how technologies are applied and looking at the risks they raise, bringing the expertise that you have. Like I said, I've been at the last five years of discussions at the UN in Geneva, and it's predominantly governments and NGOs; there are not many technologists, not many tech companies. I think this is an area where there could be more involvement, and it could be really quite beneficial.

So with that, I'll say thank you very much for listening, and I'll be pleased to carry on the discussion with you in the time we have available.

2019-05-31 14:35


Comments:

We've taught apes such as ourselves to do so, but can we ultimately program Artificially Intelligent machines to tithe once a week, and pray five times a day? - j q t -

Just make tools instead of weapons. Much more profitable, believe me. Just give it a try; the Nirvana becomes ours both before and after.

And stop forcing the German government to go against the interests of the people they are supposed to represent... they generally do without increasing military spending.

Let's talk about technology of world peace.

Watch "Video screened at UN Human Rights Council meeting February 28 2019"
