{"id":9311,"date":"2020-02-20T08:47:42","date_gmt":"2020-02-20T08:47:42","guid":{"rendered":"https:\/\/revistaidees.cat\/?p=9311"},"modified":"2020-02-26T11:13:25","modified_gmt":"2020-02-26T11:13:25","slug":"killer-robots","status":"publish","type":"post","link":"https:\/\/revistaidees.cat\/en\/killer-robots\/","title":{"rendered":"Killer robots"},"content":{"rendered":"\n<p>The world faces a very critical choice about the future of warfare. This is not due to the growing political movement against fully autonomous weapons. 28 nations have called on the UN to ban such weapons pre-emptively. Most recently, the European Parliament voted in support of such a ban, whilst the German Foreign Minister Heiko Maas called for international cooperation on regulating autonomous weapons. And in the same week that Maas called for action, Japan gave its backing to international efforts to regulate the development of lethal autonomous weapons at the United Nations. <\/p>\n\n\n\n<p>At the end of 2018, the UN Secretary General, Antonio Guterres addressing the General Assembly offered a stark warning.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p>\u00abThe weaponization of artificial intelligence is a growing concern. The prospect of weapons that can select and attack a target on their own raises multiple alarms \u2013 and could trigger new arms races.Diminished oversight of weapons has implications for our efforts to contain threats, to prevent escalation and to adhere to international humanitarian and human rights law. Let\u2019s call it as it is. The prospect of machines with the discretion and power to take human life is morally repugnant\u00bb.<\/p><\/blockquote>\n\n\n\n<p>No, it&#8217;s not this growing political concern that illuminates the critical choice facing the planet. Nor is it the growing movement within civil society against such weapons. 
The Campaign to Stop Killer Robots, for instance, now numbers over 100 non-governmental organizations, such as Human Rights Watch, that are vigorously calling for regulation. <\/p>\n\n\n\n<p>It\u2019s also not the pressure from such NGOs to take action. Nor is it the growing concern of the public. A recent IPSOS poll shows that opposition to fully autonomous weapons has increased 10% in the last two years as understanding of the issues grows. <\/p>\n\n\n\n<p>Six out of every ten people polled in 26 countries strongly opposed the use of autonomous weapons. In Spain, for example, 65% of those polled were strongly opposed, whilst less than 20% supported their use. Opposition to and support for autonomous weapons were similar in France, Germany and other European countries.<\/p>\n\n\n\n<p>No, the reason that we face a critical choice today about the future of warfare is that the technology to build autonomous weapons is ready to cross out of the research lab (disclaimer alert: where I work) and to be implemented by arms manufacturers around the world. <\/p>\n\n\n\n<p>In March 2019, for instance, we saw the Royal Australian Air Force announce a partnership with Boeing to develop an unmanned air combat vehicle, a loyal &#8220;wingman&#8221; to take air combat to the next step of lethality. In the same week, the US Army announced ATLAS, the Advanced Targeting and Lethality Automated System, which will be a robot tank. The US Navy also announced that its first fully autonomous ship, the Sea Hunter, had made a record-breaking voyage from Hawaii to the Californian coast without human intervention.<\/p>\n\n\n\n<p>Unfortunately, the world will be a much worse place if, in a decade&#8217;s time, militaries around the world are using such lethal autonomous weapons systems (LAWS), and there are no laws regulating LAWS. 
<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<blockquote class=\"wp-block-quote is-style-large is-layout-flow wp-block-quote-is-layout-flow\"><p>The world will be a much worse place if, in a decade&#8217;s time, militaries around the world are using such lethal autonomous weapons systems and there are no laws regulating them<\/p><\/blockquote>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>The media like to use the term \u201ckiller robot\u201d rather than a wordy expression such as lethal or fully autonomous weapon. But the problem with the media\u2019s term \u201ckiller robot\u201d is that it conjures up a picture of the Terminator. And it is not the Terminator that worries me or thousands of my colleagues working in AI. It is the much simpler technologies that we see being announced right now. <\/p>\n\n\n\n<p>Take a Predator drone. This is a semi-autonomous weapon. It can fly itself much of the time. However, there is still a soldier, typically in a container in Nevada, who is in overall control. And importantly, it is still a soldier who makes the final life-or-death decision to fire one of its Hellfire missiles.<\/p>\n<\/div>\n<\/div>\n\n\n\n<p>But it is a small technical step to replace that soldier with a computer. Indeed, it is technically possible today. And once we build such simple autonomous weapons, there will be an arms race to develop more and more sophisticated versions. We can already see the beginnings of this arms race. In every theatre of war, in the air, on land, and on and under the sea, there are prototype autonomous weapons under development. <\/p>\n\n\n\n<p>This will be a terrible development in warfare. But it is not inevitable. 
In fact, we get to choose whether we go down this particular road. For over five years now, I and thousands of my colleagues, other researchers in Artificial Intelligence and Robotics, have been warning of these dangerous developments. We\u2019ve been joined by founders of AI and Robotics companies, Nobel Peace Laureates, church leaders, politicians and many members of the public. <\/p>\n\n\n\n<p>Strategically, autonomous weapons are a military dream. They let a military scale its operations unhindered by manpower constraints. One programmer can command hundreds of autonomous weapons. This will industrialise warfare. Autonomous weapons will greatly increase strategic options. They will take humans out of harm\u2019s way, opening up the opportunity to take on the riskiest of missions. You could call it War 4.0.<\/p>\n\n\n\n<p>There are many reasons, however, why the military\u2019s dream of lethal autonomous weapons will turn into a nightmare. First and foremost, there is a strong moral argument against killer robots. We give up an essential part of our humanity if we hand over to a machine the decision of whether someone should live or die. Machines have no emotions, compassion or empathy. Are machines then fit to decide who lives and who dies?<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<blockquote class=\"wp-block-quote is-style-large is-layout-flow wp-block-quote-is-layout-flow\"><p>To build a nuclear bomb requires technical sophistication. You need the resources of a nation state, and access to fissile material. You need some skilled physicists and engineers. Nuclear weapons have not, as a result, proliferated greatly. 
Autonomous weapons require none of this<\/p><\/blockquote>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>Beyond the moral arguments, there are many technical and legal reasons to be concerned about killer robots. In my view, one of the strongest is that they will revolutionise warfare. Autonomous weapons will be weapons of immense destruction. Previously, if you wanted to do harm, you had to have an army of soldiers to wage war. You had to persuade this army to follow your orders. You had to train them, feed them, and pay them. Now just one programmer could control hundreds of weapons. <\/p>\n\n\n\n<p>Lethal autonomous weapons are more troubling, in some respects, than nuclear weapons. To build a nuclear bomb requires technical sophistication. You need the resources of a nation state, and access to fissile material. You need some skilled physicists and engineers. Nuclear weapons have not, as a result, proliferated greatly. Autonomous weapons require none of this.<\/p>\n<\/div>\n<\/div>\n\n\n\n<p>Autonomous weapons will be perfect weapons of terror. Can you imagine how terrifying it will be to be chased by a swarm of autonomous drones? They will fall into the hands of terrorists and rogue states, who will have no qualms about turning them on civilians. They will be an ideal weapon with which to suppress a civilian population. Unlike humans, they will not hesitate to commit atrocities, even genocide.<\/p>\n\n\n\n<p>You may be surprised, but not everyone is on board with the idea that the world would be a better place with a ban on killer robots. \u201cRobots will be better at war than humans,\u201d they say. \u201cLet robot fight robot and keep humans out of it.\u201d Yet these arguments don\u2019t stand up to scrutiny, in my view or in that of many of my colleagues working in AI and robotics. 
Here are the five main objections I hear to banning killer robots \u2014 and why they\u2019re misguided.<\/p>\n\n\n\n<p><strong>Objection 1. Robots will be more effective than humans<\/strong><\/p>\n\n\n\n<p>They\u2019ll be more efficient for sure. They won\u2019t need to sleep. They won\u2019t need time to rest and recover. They won\u2019t need long training programs. They won\u2019t mind extreme cold or heat. All in all, they\u2019ll make ideal soldiers. But they won\u2019t be more effective. The recently leaked Drone Papers suggest nearly nine out of ten people killed by drone strikes weren\u2019t the intended target. This is when there\u2019s still a human in the loop, making the final life-or-death decision. <\/p>\n\n\n\n<p>The statistics will be much worse when we replace that human with a computer. Killer robots will also be more efficient at killing us. Terrorists and rogue nations are sure to use them against us. It\u2019s clear that, if they\u2019re not banned, there will be an arms race. It is not overblown to suggest that this will be the next great revolution in warfare after the invention of gunpowder and nuclear bombs. The history of warfare is largely one of who can more efficiently kill the other side. This has typically not been a good thing for humankind.<\/p>\n\n\n\n<p><strong>Objection 2. Robots will be more ethical<\/strong><\/p>\n\n\n\n<p>In the terror of battle, humans have committed many atrocities. And robots can be built to follow precise rules. However, it\u2019s fanciful to imagine we know how to build ethical robots. AI researchers like me have only just started to worry about how you could program a robot to behave ethically. It will take us many decades to work this out. And even when we do, there\u2019s no computer we know of that can\u2019t be hacked to behave in ways that we don\u2019t desire. 
Robots today cannot make the distinctions that the international rules of war require: to distinguish between combatant and civilian, to act proportionally, and so on. Robot warfare is likely to be a lot more unpleasant than the war we fight today.<\/p>\n\n\n\n<p><strong>Objection 3. Robots can just fight robots<\/strong><\/p>\n\n\n\n<p>Replacing humans with robots in a dangerous place like the battlefield might seem like a good idea. However, it\u2019s also fanciful to suppose that we could just have robots fight robots. There\u2019s not some separate part of the world called \u201cthe battlefield.\u201d Wars are now fought in our towns and cities, with unfortunate civilians caught in the crossfire. The world is sadly witnessing this today in Syria and elsewhere. Our opponents today are typically terrorists and rogue nations. They are not going to sign up to a contest between robots. Indeed, there\u2019s an argument that the terror unleashed remotely by drones has likely aggravated the many conflicts in which we find ourselves today.<\/p>\n\n\n\n<p><strong>Objection 4. Such robots already exist and we need them<\/strong><\/p>\n\n\n\n<p>I am perfectly happy to concede that a technology like the autonomous Phalanx anti-missile system, which sits on many naval ships, is a good thing. You don\u2019t have time to get a human decision when defending yourself against an incoming supersonic missile. But the Phalanx is a defensive system. And my colleagues and I did not call for defensive systems to be banned. We only called for offensive autonomous systems to be banned, such as the Samsung sentry robot currently active in the DMZ between North and South Korea. It will kill any person who steps into the DMZ from four kilometres away with deadly accuracy. There\u2019s no reason we can\u2019t ban a weapon system that already exists. 
Indeed, most bans, like those for chemical weapons or cluster munitions, have been for weapon systems that not only exist, but have been used in war.<\/p>\n\n\n\n<p><strong>Objection 5. Weapon bans don\u2019t work<\/strong><\/p>\n\n\n\n<p>History contradicts this argument. The 1998 UN Protocol on Blinding Lasers resulted in blinding lasers, designed to cause permanent blindness, being kept out of the battlefield. If you go to Syria today \u2014 or any of the other war zones of the world \u2014 you won\u2019t find this weapon, and not a single arms company anywhere in the world will sell it to you. You can\u2019t un-invent the technology that supports blinding lasers, but there\u2019s enough stigma associated with them that arms companies have stayed away. <\/p>\n\n\n\n<p>I hope a similar stigma will be associated with autonomous weapons. We won\u2019t be able to un-invent the technology, but we can put enough stigma in place that robots aren\u2019t weaponized. Even a partially effective ban would likely be worth having. Anti-personnel mines still exist today despite the 1997 Ottawa Treaty. But 40 million such mines have been destroyed. This has made the world a safer place and resulted in many fewer children losing their lives or limbs.<\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\">\n<blockquote class=\"wp-block-quote is-style-large is-layout-flow wp-block-quote-is-layout-flow\"><p>We won\u2019t be able to un-invent the technology, but we can put enough stigma in place that robots aren\u2019t weaponized<\/p><\/blockquote>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\">\n<p>AI and robotics can be used for many great purposes. 
Much the same technology will be needed in an autonomous car as in an autonomous drone. And autonomous cars are predicted to prevent 30,000 deaths on the roads of the United States every year. They will make our roads, factories, mines and ports safer and more efficient. They will make our lives healthier, wealthier and happier. In the military setting, there are many good uses of AI. Robots can be used to clear minefields, bring supplies in through dangerous routes, and sift through mountains of signals intelligence. But they shouldn\u2019t be used to kill.<\/p>\n<\/div>\n<\/div>\n\n\n\n<p>We stand at a crossroads on this issue. I believe it needs to be seen as morally unacceptable for machines to decide who lives and who dies. In this way, we may be able to save ourselves and our children from this terrible future.<\/p>\n\n\n\n<p>In July 2015, I helped organise an open letter to the UN calling for action, signed by thousands of my colleagues, fellow AI researchers. The letter was released at the start of the main international AI conference. Sadly, the concerns we raised in this letter have yet to be addressed. Indeed, they have only become more urgent. <\/p>\n\n\n\n<h5 class=\"wp-block-heading\">Open letter from 2015 signed by thousands of AI researchers<br><\/h5>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p><br>Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. 
Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is \u2013 practically if not legally \u2013 feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.<\/p><p>Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers by machines is good by reducing casualties for the owner but bad by thereby lowering the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. <\/p><p>Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing etc. Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.<\/p><p>Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons \u2013 and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. 
Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.<\/p><p>In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.<\/p><\/blockquote>\n\n\n\n<p>I urge you to join the global campaign to make the world a better place by banning such weapons.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>The world faces a very critical choice about the future of warfare. This is not due to the growing political movement against fully autonomous weapons. 28 nations have called on the UN to ban such weapons pre-emptively. Most recently, the European Parliament voted in support of such a ban, whilst the German Foreign Minister Heiko Maas called for international cooperation on regulating autonomous weapons. And in the same week that Maas called for action, Japan gave its backing to international efforts to regulate the development of lethal autonomous weapons at the United Nations. 
At the end of 2018, the UN\u2026<\/p>\n","protected":false},"author":6,"featured_media":9308,"parent":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"inline_featured_image":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[185],"tags":[],"segment":[],"subject":[],"class_list":["post-9311","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-military-use-ai-en"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Killer robots &#8211; IDEES<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/revistaidees.cat\/en\/killer-robots\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Killer robots &#8211; IDEES\" \/>\n<meta property=\"og:description\" content=\"The world faces a very critical choice about the future of warfare. This is not due to the growing political movement against fully autonomous weapons. 28 nations have called on the UN to ban such weapons pre-emptively. Most recently, the European Parliament voted in support of such a ban, whilst the German Foreign Minister Heiko Maas called for international cooperation on regulating autonomous weapons. And in the same week that Maas called for action, Japan gave its backing to international efforts to regulate the development of lethal autonomous weapons at the United Nations. 
At the end of 2018, the UN\u2026\" \/>\n<meta property=\"og:url\" content=\"https:\/\/revistaidees.cat\/en\/killer-robots\/\" \/>\n<meta property=\"og:site_name\" content=\"IDEES\" \/>\n<meta property=\"article:published_time\" content=\"2020-02-20T08:47:42+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2020-02-26T11:13:25+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1\" \/>\n\t<meta property=\"og:image:width\" content=\"2000\" \/>\n\t<meta property=\"og:image:height\" content=\"800\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"author\" content=\"Guille Velasco\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Guille Velasco\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"14 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/\"},\"author\":{\"name\":\"Guille Velasco\",\"@id\":\"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f\"},\"headline\":\"Killer robots\",\"datePublished\":\"2020-02-20T08:47:42+00:00\",\"dateModified\":\"2020-02-26T11:13:25+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/\"},\"wordCount\":2754,\"image\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1\",\"articleSection\":[\"Military Use of 
AI\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/\",\"url\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/\",\"name\":\"Killer robots &#8211; IDEES\",\"isPartOf\":{\"@id\":\"https:\/\/revistaidees.cat\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1\",\"datePublished\":\"2020-02-20T08:47:42+00:00\",\"dateModified\":\"2020-02-26T11:13:25+00:00\",\"author\":{\"@id\":\"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f\"},\"breadcrumb\":{\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/revistaidees.cat\/en\/killer-robots\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage\",\"url\":\"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1\",\"contentUrl\":\"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1\",\"width\":2000,\"height\":800,\"caption\":\"Araya Peralta\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/revistaidees.cat\/en\/killer-robots\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Inici\",\"item\":\"https:\/\/revistaidees.cat\/en\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Killer robots\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/revistaidees.cat\/#website\",\"url\":\"https:\/\/revistaidees.cat\/\",\"name\":\"IDEES\",\"description\":\"Contemporary global 
issues\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/revistaidees.cat\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Person\",\"@id\":\"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f\",\"name\":\"Guille Velasco\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g\",\"caption\":\"Guille Velasco\"}}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Killer robots &#8211; IDEES","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/revistaidees.cat\/en\/killer-robots\/","og_locale":"en_US","og_type":"article","og_title":"Killer robots &#8211; IDEES","og_description":"The world faces a very critical choice about the future of warfare. This is not due to the growing political movement against fully autonomous weapons. 28 nations have called on the UN to ban such weapons pre-emptively. Most recently, the European Parliament voted in support of such a ban, whilst the German Foreign Minister Heiko Maas called for international cooperation on regulating autonomous weapons. And in the same week that Maas called for action, Japan gave its backing to international efforts to regulate the development of lethal autonomous weapons at the United Nations. 
At the end of 2018, the UN\u2026","og_url":"https:\/\/revistaidees.cat\/en\/killer-robots\/","og_site_name":"IDEES","article_published_time":"2020-02-20T08:47:42+00:00","article_modified_time":"2020-02-26T11:13:25+00:00","og_image":[{"width":2000,"height":800,"url":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","type":"image\/jpeg"}],"author":"Guille Velasco","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Guille Velasco","Est. reading time":"14 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#article","isPartOf":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/"},"author":{"name":"Guille Velasco","@id":"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f"},"headline":"Killer robots","datePublished":"2020-02-20T08:47:42+00:00","dateModified":"2020-02-26T11:13:25+00:00","mainEntityOfPage":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/"},"wordCount":2754,"image":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","articleSection":["Military Use of AI"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/","url":"https:\/\/revistaidees.cat\/en\/killer-robots\/","name":"Killer robots &#8211; 
IDEES","isPartOf":{"@id":"https:\/\/revistaidees.cat\/#website"},"primaryImageOfPage":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage"},"image":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage"},"thumbnailUrl":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","datePublished":"2020-02-20T08:47:42+00:00","dateModified":"2020-02-26T11:13:25+00:00","author":{"@id":"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f"},"breadcrumb":{"@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/revistaidees.cat\/en\/killer-robots\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#primaryimage","url":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","contentUrl":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","width":2000,"height":800,"caption":"Araya Peralta"},{"@type":"BreadcrumbList","@id":"https:\/\/revistaidees.cat\/en\/killer-robots\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Inici","item":"https:\/\/revistaidees.cat\/en\/"},{"@type":"ListItem","position":2,"name":"Killer robots"}]},{"@type":"WebSite","@id":"https:\/\/revistaidees.cat\/#website","url":"https:\/\/revistaidees.cat\/","name":"IDEES","description":"Contemporary global 
issues","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/revistaidees.cat\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Person","@id":"https:\/\/revistaidees.cat\/#\/schema\/person\/adfa7c9b46b4f5aba1a2db263fdfd38f","name":"Guille Velasco","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g","url":"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/629007751c4a3e3bc4a875f83b1492bf27b7e7eff053528d6942b03ce18e75ad?s=96&d=mm&r=g","caption":"Guille Velasco"}}]}},"jetpack_featured_media_url":"https:\/\/i0.wp.com\/revistaidees.cat\/wp-content\/uploads\/2020\/02\/AAFF-ARMAS-AUTONOMAS-2000-X-800.jpg?fit=2000%2C800&ssl=1","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/posts\/9311","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/comments?post=9311"}],"version-history":[{"count":5,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/posts\/9311\/revisions"}],"predecessor-version":[{"id":9663,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/posts\/9311\/revisions\/9663"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/media\/9308"}],"wp:attachment":[{"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/media?parent=93
11"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/categories?post=9311"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/tags?post=9311"},{"taxonomy":"segment","embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/segment?post=9311"},{"taxonomy":"subject","embeddable":true,"href":"https:\/\/revistaidees.cat\/en\/wp-json\/wp\/v2\/subject?post=9311"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}