The Artificial Intelligence Arms Race: Trends and World Leaders in Autonomous Weapons Development

Justin Haner
Northeastern University
Denise Garcia
International Committee for Robot Arms Control and
Northeastern University

Abstract

Autonomous weapons technologies, which rely on artificial intelligence, are advancing rapidly and without sufficient public debate or accountability. Oversight of increased autonomy in warfare is critically important because this deadly technology is likely to proliferate rapidly, enhance terrorist tactics, empower authoritarian rulers, undermine democratic peace, and is vulnerable to bias, hacking, and malfunction. The top competitors in this arms race are the United States, China, Russia, South Korea, and the European Union.

Spending soars as progress stalls: where is the public debate on lethal artificial intelligence and autonomous weapons systems development?

As United Nations member states have made little progress in discussions of lethal autonomous weapons systems (AWS) over the last five years, the nature of warfare is transforming before our eyes. This change is occurring without proper accountability or public scrutiny as a handful of countries continue to make massive investments in increasing the autonomy of their weapons systems paired with advanced artificial intelligence (AI). Greater debate is urgently needed as lethal AWS are likely to proliferate rapidly, enhance terrorist tactics, empower authoritarian rulers, undermine democratic peace, and are vulnerable to bias, hacking, and malfunction. A proper public debate concerning the ramifications of 'killer robots' should start in earnest.

Technological advances in autonomy are often incremental and come with a low political profile. Autonomy is being added to different parts of existing weapons systems, and meaningful human control, from target planning to mission execution, will be gradually lost without proper consideration of the moral dilemmas this raises (Roff, 2014; Sharkey, 2017; Schwarz, 2018). This largely unaccounted-for development of AWS is taking place as the worldwide market expands rapidly: global military spending on AWS and AI, narrowly defined, is projected to reach $16 billion and $18 billion respectively by 2025 (Sander and Meldon, 2014; Research and Markets, 2018).

This article proceeds as follows. The following section uses current events to project the future impacts that increasingly autonomous weapons, if left unchecked, could have on international security. The second section uses all publicly available data to establish and rank the top five world leaders according to their intent to develop autonomous technology, their capacity to develop AWS hardware, and their level of AI expertise. The final section highlights the importance of ongoing efforts to restrict or ban the use of AWS and of setting global norms under international law now by these leading states, before it is too late.

Projections from current trends in lethal AI and AWS development

Autonomous weapons are poised for rapid proliferation. At present, this technology is concentrated in a few powerful, wealthy countries that have the resources required to invest heavily in advanced robotics and AI research. However, Moore's Law and the declining costs of production, including 3D printing, will soon enable many states and non-state actors to procure killer robots (Scharre, 2014). Today, quadcopter drones cost as little as $25, and a small $35 Raspberry Pi computer can run AI advanced enough to defeat United States Air Force fighter pilots in combat simulations (Cuthbertson, 2016). Accountability will become increasingly difficult as more international actors are able to acquire lethal AWS. Houthi rebels are using weaponized drones, and both ISIS and Boko Haram have adapted drones for use as improvised explosive devices (Olaifa, 2018; Nissenbaum and Strobel, 2019). Proliferation to groups using terrorist tactics is particularly worrying because perceived costs drop as AWS increases the distance between the attacker and the target, making it less likely that they will be caught, while simultaneously increasing perceived benefits by allowing more precise target selection in order to maximize deadly impact and fear. Such offensive capabilities are likely to proliferate faster than lethal AI-based defensive systems because protective technology carries higher relative costs, given the increased need to differentiate and safely engage targets.

Within states, lethal autonomy will likely empower authoritarian regimes and may intensify the global trend of democratic backsliding, because AI-enhanced monitoring systems and robotic 'soldiers' are the perfect tools of despotic rule. While fully autonomous killer robots may not replace all traditional soldiers any time soon, focusing exclusively on such future trends risks missing the dystopian effects that AI is having in authoritarian states today. China has imprisoned millions of ethnic Uighurs and uses AI-based surveillance with facial recognition software not only to monitor Xinjiang province but to keep a close eye on all of its citizens (Human Rights Watch, 2019). Such surveillance systems have been deployed to over 70 Chinese cities and are being used to shape behavior via the 'social credit' system (Scharre, 2019). The authoritarian marketplace for such capabilities is strong: China has held training sessions with officials from more than 30 countries seeking advanced tools for monitoring and controlling public opinion (Scharre, 2019).

Between states, full autonomy in war may undermine democratic peace (Altmann and Sauer, 2017). Immanuel Kant's theory of democratic peace relies upon the public not supporting unnecessary wars, as they will be the ones called upon to fight in them. When many countries transitioned from a conscription or draft-based military to an all-volunteer force, public opposition to ongoing wars declined (Horowitz and Levendusky, 2011). This trend will likely be exacerbated, further lowering the threshold for the use of force, when even fewer human soldiers are needed to fight wars (Cole, 2017; Garcia, 2014). Semi-autonomous systems have already begun to erode the fundamental norms of international law against the use of force (Bode and Huelss, 2018; Edmeades, 2017; Garcia, 2016).

Public oversight and accountability are particularly important because lethal AI-based systems are vulnerable to bias, hacking, and computer malfunction. As many as 85 per cent of all AI projects are expected to produce erroneous outcomes due to bias in the algorithm, biased programmers, or bias in the data used to train them (Gartner, 2018). AI bias tends to be particularly discriminatory against minority groups and could lead to over-targeting or false positives as facial recognition systems become further integrated into the weapons of war (West et al., 2019). Beyond bias, AI systems can be hacked, contain unintended coding errors, or otherwise act in ways programmers never could have predicted. One such coding error occurred in autonomous financial trading systems in May 2010, causing a 'flash crash' that wiped out $1 trillion of stock market value in just a few minutes (Pisani, 2015).

Even when correctly coded, many AI programs exploit unforeseen loopholes to achieve their programmed goals, such as a Tetris-playing AI that would pause the game just before the last block fell so that it could never lose, or another AI program that would delete the answer key against which it was being tested so that it could receive a falsely inflated perfect score (Scharre, 2019). The consequences of such bias and errors as AI is added to increasingly autonomous weapons could be devastating.

Top five world leaders in lethal autonomous weapons development

While most military spending figures are classified, our evaluation of the top leaders in AWS development uses what data is publicly available as a proxy to empirically gauge their relative standings. Table 1 displays the metrics we were able to use to rank each major actor and focuses on what we see as the three critical components of lethal AWS development. First, we examine the specific intent of countries to develop lethal AWS based on their official policies, actions, and public opinion. Second, we assess the general capacity and record of each actor with regard to developing cutting-edge lethal automated weapons hardware. Finally, we evaluate the most critical component: cultivating the artificial intelligence software expertise needed to enable AWS to carry out the complex tasks of war. The top five competitors in lethal AI and AWS development are the United States, China, Russia, South Korea, and the European Union (EU).

The United States

With a defense budget greater than the military spending of China, Russia, South Korea, and all 28 EU member-states combined, it is no surprise that the United States is the world leader in the development of lethal AWS (SIPRI, 2019). Autonomy has been an official component of United States national security strategy since 2012 with the release of Department of Defense (DoD) Directive 3000.09. This policy was the first of its kind and allows for semi-autonomous systems to engage targets pre-selected by human operators, as well as for fully autonomous weapons to select and engage targets after senior-level DoD approval (Department of Defense, 2012). Further support for autonomy in war can be seen in the United States' 'Third Offset Strategy', where it is listed as one of the main pillars. However, despite being a clear policy priority for defense officials, not all Americans support this effort. Only 25 per cent of American citizens trust AI, and some employees at major companies have resisted developing AI for military purposes, as seen in Google's internal rebellion against Project Maven, an AI development contract for the United States military (Ipsos, 2018).

The United States is the outright leader in autonomous hardware development and investment capacity. By 2010, the United States had already invested $4 billion into researching AWS with a further $18 billion earmarked for autonomy development through 2020 (Boulanin and Verbruggen, 2017). Despite already owning over 20,000 autonomous vehicles, the United States is projected to spend $17 billion on drones through 2021, including 3,447 new unmanned ground, sea, and aerial systems (Gettinger, 2018; Statista, 2019).

In the military AI expertise race, the United States started before the opening gun even went off, investing $1 billion in 'strategic computing' back in 1983, and it has consistently outspent its competitors since (Boulanin and Verbruggen, 2017). In addition to having the most AI companies in the world, the United States has the most AI-related publications for a single country, the most AI patent applications and accepted AI patents, and the largest pool of talented AI researchers, including more researchers in the top ten per cent of their field than any other single country in the world (CISTP, 2019a, 2019b, 2019c; IPlytics GmbH, 2019; Shoham et al., 2018).

China

China is the clear rising contender in lethal AWS and AI development and has outlined in its 'Next Generation Artificial Intelligence Development Plan' that it intends to utilize AI on the battlefield in association with AWS (China State Council, 2017; Kania, 2017). With a combination of 70 per cent citizen trust in AI (the highest of the 24 countries surveyed) and the heavy pressure it can exert on companies to transfer technology to the state, it is unlikely to face significant internal resistance to AWS development (Ipsos, 2018).

China's capacity for weapons development is high, with an estimated annual budget of $250 billion and projected spending of $4.5 billion on drone technology by 2021 (SIPRI, 2019; Statista, 2019). Most impressively, Chinese companies have tested swarming technology with over 1,000 synchronized drones (Kania, 2017). However, while some countries, such as South Korea, Israel, and Japan, seek AWS development to augment their soldiers and fill near-term gaps in security, China, with the world's largest army, does not have this problem. This frees China to focus the bulk of its resources on long-term strategic investments in AI.

China publicly plans to become the world leader in AI development by 2030 (China State Council, 2017). China's controversial methods of intellectual property procurement have allowed it to make technological leaps forward in a non-linear fashion. With heavy 'civil-military fusion' investment, China's State Council estimates its AI industries will be worth $22 billion by 2020, $59 billion by 2025, and $150 billion by 2030 (China State Council, 2017; Kania, 2017). By some metrics, China has already taken the lead in AI. Despite lagging in total publications, between 2011 and 2015 Chinese scientists published 41,000 papers on AI, almost double the United States' output during the same period (Baker, 2017). Further, Chinese investment and financing in AI projects between 2013 and 2018 is estimated at 60 per cent of the entire world's funding of such projects, again more than double United States investment during the same period (CAICT and Gartner, 2019). However, China does face a problem of top expertise flight: despite having over 18,000 talented AI developers, when it comes to those who rank among the world's best, the United States and EU each have more than five times as many top experts (CISTP, 2019a).

Russia

Despite scoring low across several capacity and expertise metrics, Russia is a leader in the lethal AWS race because it is the technology's most brazen supporter. Russia is openly looking to remove humans from the decision-making loop and does not intend to comply with any international efforts to curtail or ban AWS use in combat (Bendett, 2017; Tucker, 2017). In accordance with the Russian programs 'Creation of Prospective Military Robotics through 2025' and 'Concept for Deployment of Robotic Systems for Military Use until 2030', Russia plans to have autonomous systems guarding its weapons silos by 2020 and aims for 30 per cent of its combat power to be partially or fully autonomous by 2030 (Bendett, 2017; Moscow Times, 2014).

Russia is acutely focused on near-term hardware development. Despite a comparatively low annual GDP and total budget for defense, Russia intends to spend almost as much as China on drones by 2021, has a military robotics-focused rearmament budget of $346 billion, and hosts annual conferences on the roboticization of its armed forces (Bendett, 2017; Sputnik News, 2013; Statista, 2019). Its autonomous Uran-9 robotic tank has already been deployed to Syria (Mizokami, 2018).

President Vladimir Putin has publicly stated that whoever becomes the leader in AI will 'become the ruler of the world'; however, Russian investments in AI are significantly lacking (Bendett, 2017). Even basic AI statistics on Russia are hard to come by, and one potential explanation may be that significant development is not happening on a comparable scale. Despite having at least ten research centers dedicated to AI use in warfare, Russia's domestic military spending on AI is estimated to be as low as $12.5 million annually, just 0.01 per cent of the unclassified AI budget for the United States military (Bendett, 2017, 2018). International sanctions may be part of the problem, as Russia was forced to cut its defense budget by 7 per cent in 2017 and 3.2 per cent in 2018, with a further cut of 4.8 per cent estimated for 2019 (Kofman, 2017).

South Korea

South Korea is a disproportionately strong player in the development of lethal AWS and the world leader when it comes to autonomous sentry weapons. While only 17 per cent of South Koreans trust AI, most have no problem with robots in general (Ipsos, 2018). With a ratio of 631 robots to every 10,000 human workers, South Korea has the highest concentration of robots in the world (Peng, 2018). Facing slowing population growth, it is looking for automation to move beyond industrial use and into the military realm, as it currently relies on mandatory conscription to fill the ranks of its army.

Despite being under an American security umbrella, South Korea's own weapons development capacity remains high, with $41 billion spent on defense annually (SIPRI, 2019). With a credible threat to its north, its primary concern has been the development of static, defensive AWS. The world's first stationary autonomous robotic turret, the Samsung SGR-A1, was developed in South Korea in 2006 (Parkin, 2015). Beyond that, Korean arms manufacturer DoDAAM developed the Super aEgis II, a long-range sentry gun turret that can autonomously detect, track, and engage targets – and in a troubling trend of proliferation, has reportedly sold this technology to the United Arab Emirates and Qatar (Parkin, 2015).

AI expertise cultivation is a major focus in South Korea. Two days after its world champion Go player, Lee Sedol, was defeated by Google's DeepMind AlphaGo AI system, South Korea pledged nearly $1 billion towards AI research (Peng, 2018). With close to 70,000 AI patents, more than 50,000 publications on AI, and more than 2,000 AI experts, South Korea is a major player on the world stage (Bode and Huelss, 2018; CISTP, 2019a, 2019c; Shoham et al., 2018). South Korea aims to remain globally competitive by opening six new AI-focused schools by 2020 (Peng, 2018).

The European Union

The EU, with a combined GDP soon to be the largest in the world and several member-states that are leading weapons manufacturers, has the potential to become the global leader in AWS development (World Bank, 2019). At present, the EU's focus has been on industrial AI and robotics. The largest impediment to the EU becoming a dominant actor in lethal AWS development is the mixed intent of its members. France, Germany, the United Kingdom (UK), Sweden, and Italy are all developing autonomous military robotic systems; however, some members remain undecided, and Austria has even joined calls for a ban on AWS use (Boulanin and Verbruggen, 2017; Campaign to Stop Killer Robots, 2018). Further complicating the issue, the European Parliament holds the position that humans must always maintain decision-making control over lethal weapons systems (European Parliament, 2018).

With the second largest combined defense budget in the world, totaling $281 billion, and projected spending on drone procurement of at least $8 billion by 2021, the EU has the capacity to develop world-class AWS hardware (SIPRI, 2019; Statista, 2019). Some individual EU member-states are heavyweight contenders in their own right. France alone outspends Russia and South Korea on defense, with more than $63 billion budgeted annually, and has stated that AI will be a major part of its military strategy (Peng, 2018; SIPRI, 2019). Germany spends $49 billion annually on defense and has produced an 'Active Defense System' with an automated reaction time of less than a millisecond (Boulanin and Verbruggen, 2017; SIPRI, 2019). Autonomy is a core component of the Italian army modernization plan, and Italy spends more than $27 billion on defense annually (Nones and Marrone, 2012; SIPRI, 2019). The UK is the second largest exporter of weapons in the world, spends more than $49 billion annually on defense, and is investing heavily in swarming drone technology (Boulanin and Verbruggen, 2017; SIPRI, 2019).

The EU has surpassed even the United States on some AI metrics. EU member-states have published the most AI-related papers, with more than 425,000 in total, and have the second highest figure for AI patent applications, with more than 233,000 (CISTP, 2019c; IPlytics GmbH, 2019). Further, the EU has the most AI talent and top talent in the world, with over 41,000 and 5,100 respectively, coming from Germany, France, the UK, Spain, and Italy alone (CISTP, 2019a). If the combined expertise and capacity of the member-states can be pooled effectively through the new European Defense Fund, the EU could emerge as the dominant actor in the AI and AWS arms race.

Ending the artificial intelligence arms race with a ban on killer robots

As the AI arms race rages on, the stakes remain high, yet public debate is lacking. Sixty-one per cent of citizens polled across more than twenty countries oppose the development of lethal AWS, and yet billions of their tax dollars are spent on developing such weapons each year (CSKR, 2019). France, Germany, and others have advocated using the Convention on Certain Weapons (CCW) process to develop 'Possible Guiding Principles' as a code of conduct to keep AWS development in accordance with existing international law (Convention on Certain Weapons, 2018). Beyond that, 28 states have called for a ban on killer robots, and the Non-Aligned Movement and a group of African states wish to negotiate a new international treaty to set limits on robotic killing. Previous weapons bans, from chemical and biological weapons to landmines and cluster munitions, have been effective policy tools that significantly curtailed the use of these problematic weapons. While the United States is not currently in a position to lead, given its ill-fated 'America First' policy, the EU and other forward-thinking countries should attempt to set solid global norms and push for a ban on the use of AWS now. China announced last year that it wishes to ban the battlefield use of AWS, but not their development and production. This could serve as a basis for coalition negotiations with the rest of the world and would represent a key step forward in preventative security governance (Garcia, 2018).

Note

1. Other notable contenders in the AI arms race are India, Israel, and Japan; see 'Dark Horses in the Lethal AI Arms Race' for more information (Haner, 2019).

References

Altmann, J. and Sauer, F. (2017) ‘Autonomous Weapon Systems and Strategic Stability’, Survival, 59 (5), pp. 117–142.

Baker, S. (2017) Which Countries and Universities are Leading on AI Research? Times Higher Education, World University Rankings [online]. Available from: https://www.timeshighereducation.com/data-bites/which-countries-and-universities-are-leading-ai-research [Accessed 17 May 2019].

Bendett, S. (2017) Red Robots Rising: Behind the Rapid Development of Russian Unmanned Military Systems, The Strategy Bridge [online]. Available from: https://thestrategybridge.org/the-bridge/2017/12/12/red-robots-rising-behind-the-rapid-development-of-russian-unmanned-military-systems [Accessed 17 May 2019].

Bendett, S. (2018) In AI, Russia Is Hustling to Catch Up, Defense One [online]. Available from: https://www.defenseone.com/ideas/2018/04/russia-races-forward-ai-development/147178/ [Accessed 23 May 2019].

Bode, I. and Huelss, H. (2018) 'Autonomous Weapons Systems and Changing Norms in International Relations', Review of International Studies, 44 (3), pp. 393–413. https://doi.org/10.1017/s0260210517000614.

Boulanin, V. and Verbruggen, M. (2017) Mapping the Development of Autonomy in Weapon Systems, Stockholm International Peace Research Institute [online]. Available from: https://www.sipri.org/sites/default/files/2017-11/siprireport_mapping_the_development_of_autonomy_in_weapon_systems_1117_1.pdf [Accessed 23 May 2019].

CAICT and Gartner (2019) Global AI Investment and Funding Share, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/941446/ai-investment-and-funding-share-by-country/ [Accessed 23 May 2019].

Campaign to Stop Killer Robots (CSKR) (2018) Country Views on Killer Robots, Campaign to Stop Killer Robots [online]. Available from: https://www.stopkillerrobots.org/wp-content/uploads/2018/11/KRC_CountryViews22Nov2018.pdf [Accessed 23 May 2019].

Campaign to Stop Killer Robots (2019) Global Poll Shows 61% Oppose Killer Robots, Campaign to Stop Killer Robots [online]. Available from: https://www.stopkillerrobots.org/2019/01/global-poll-61-oppose-killer-robots/ [Accessed 17 May 2019].

China State Council (2017) A Next Generation Artificial Intelligence Development Plan [online]. Available from: https://www.newamerica.org/cybersecurity-initiative/digichina/blog/full-translation-chinas-new-generation-artificial-intelligence-development-plan-2017/ [Accessed 17 May 2019].

CISTP (Center for International Strategy, Technology, and Policy) (2019a) AI Talent by Country, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/941479/ai-experts-by-country/ [Accessed 21 May 2019].

CISTP (2019b) Global AI Enterprises by Country, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/941054/number-of-ai-companies-worldwide-by-country/ [Accessed 21 May 2019].

CISTP (2019c) Global AI Paper Publications, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/941037/ai-paper-publications-worldwide-by-country/ [Accessed 21 May 2019].

Cole, C. (2017) 'Harm to Global Peace and Security', in R. Acheson, M. Bolton, E. Minor and A. Pytlak (eds.), The Humanitarian Impact of Drones. Women's International League for Peace and Freedom, pp. 48–59. Available from: https://reliefweb.int/sites/reliefweb.int/files/resources/humanitarian-impact-of-drones.pdf [Accessed 23 May 2019].

Convention on Certain Weapons (CCW) Group of Governmental Experts (2018) 'Report of the 2018 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems'. United Nations Office at Geneva [online]. Available from: https://www.unog.ch/80256EDD006B8954/(httpAssets)/20092911F6495FA7C125830E003F9A5B/$file/CCW_GGE.1_2018_3_final.pdf [Accessed 23 May 2019].

Cuthbertson, A. (2016) An Algorithm Powered by this $35 Computer Just Beat a Human Fighter Pilot, Newsweek [online]. Available from: https://www.newsweek.com/artificial-intelligence-raspberry-pi-pilot-ai-475291 [Accessed 17 May 2019].

Department of Defense (DoD) (2012) Directive 3000.09, United States of America Department of Defense [online]. Available from: https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/300009p.pdf.

Edmeades, A. (2017) 'International Law Perspectives', in R. Acheson, M. Bolton, E. Minor and A. Pytlak (eds.), The Humanitarian Impact of Drones. Women's International League for Peace and Freedom, pp. 101–114. Available from: https://reliefweb.int/sites/reliefweb.int/files/resources/humanitarian-impact-of-drones.pdf [Accessed 23 May 2019].

European Parliament (2018) Texts Adopted – Autonomous Weapon Systems, European Parliament Resolution. Available from: http://www.europarl.europa.eu/doceo/document/TA-8-2018-0341_EN.html?redirect [Accessed 23 May 2019].

Garcia, D. (2014) 'The Case Against Killer Robots – Why the United States Should Ban Them', Foreign Affairs. Available from: https://www.foreignaffairs.com/articles/united-states/2014-05-10/case-against-killer-robots [Accessed 23 May 2019].

Garcia, D. (2016) ‘Future Arms, Technologies, and International Law: Preventative Security Governance’, European Journal of International Security, 1 (1), pp. 94–114.

Garcia, D. (2018) ‘Lethal Artificial Intelligence and Change: The Future of International Peace and Security’, International Studies Review, 20 (2), pp. 334–341.

Gartner (2018) Gartner Says Nearly Half of CIOs Are Planning to Deploy Artificial Intelligence, Gartner Newsroom. Available from: https://www.gartner.com/en/newsroom/press-releases/2018-02-13-gartner-says-nearly-half-of-cios-are-planning-to-deploy-artificial-intelligence [Accessed 16 May 2019].

Gettinger, D. (2018) Summary of Drone Spending in the FY 2019 Defense Budget Request. Center for the Study of the Drone at Bard College. Available from: https://dronecenter.bard.edu/files/2018/04/CSD-Drone-Spending-FY19-Web-1.pdf [Accessed 17 May 2019].

Haner, J. K. (2019) Dark Horses in the Lethal AI Arms Race. Available from: JustinKHaner.com/AIArmsRace [Accessed 4 July 2019].

Horowitz, M. C. and Levendusky, M. S. (2011) ‘Drafting Support for War: Conscription and Mass Support for Warfare’, The Journal of Politics, 73 (2), pp. 524–534. https://doi.org/10.1017/s0022381611000119.

Human Rights Watch (2019) China's Campaign of Repression Against Xinjiang's Muslims, Human Rights Watch [online]. Available from: https://www.hrw.org/report/2018/09/09/eradicating-ideological-viruses/chinas-campaign-repression-against-xinjiangs [Accessed 20 May 2019].

IPlytics GmbH (2019) AI Patent Applications by Country Worldwide, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/1007944/number-of-ai-patents-worldwide-by-country/ [Accessed 21 May 2019].

Ipsos (2018) Trust in Artificial Intelligence by Country 2018, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/948531/trust-artificial-intelligence-country/ [Accessed 21 May 2019].

Kania, E. B. (2017) Battlefield Singularity: Artificial Intelligence, Military Revolution, and China's Future Military Power [online]. Available from: https://www.cnas.org/publications/reports/battlefield-singularity-artificial-intelligence-military-revolution-and-chinas-future-military-power [Accessed 17 May 2019].

Kofman, M. (2017) The Russian Defense Budget and You, Russia Military Analysis [online]. Available from: https://russianmilitaryanalysis.wordpress.com/2017/03/17/the-russian-defense-budget-and-you/ [Accessed 17 May 2019].

Mizokami, K. (2018) Russia's Tank Drone Performed Poorly in Syria, Popular Mechanics. Available from: https://www.popularmechanics.com/military/weapons/a21602657/russias-tank-drone-performed-poorly-in-syria/ [Accessed 17 May 2019].

Moscow Times (2014) Battle Robots to Guard Russian Missile Silos by 2020, Moscow Times [online]. Available from: http://themoscowtimes.com/articles/battle-robots-to-guard-russian-missile-silos-by-2020-38460 [Accessed 17 May 2019].

Nissenbaum, D. and Strobel, W. (2019) Mideast Insurgents Enter the Age of Drone Warfare, The Wall Street Journal [online]. Available from: https://www.wsj.com/articles/mideast-insurgents-enter-the-age-of-drone-warfare-11556814441 [Accessed 17 May 2019].

Nones, M. and Marrone, A. (2012) The Transformation of the Armed Forces: The Forza NEC Program. Rome: IAI Research Paper (Istituto Affari Internazionali).

Olaifa, B. (2018) Boko Haram Now Engages Foreign Fighters, Says Army, The Nation, Nigeria [online]. Available from: http://thenationonlineng.net/boko-haram-now-engages-foreign-fighters-says-army/ [Accessed 17 May 2019].

Parkin, S. (2015) Killer Robots: The Soldiers that Never Sleep [online]. Available from: http://www.bbc.com/future/story/20150715-killer-robots-the-soldiers-that-never-sleep [Accessed 17 May 2019].

Peng, T. (2018) South Korea Aims High on AI, Pumps $2 Billion into R&D, Medium [online]. Available from: https://medium.com/syncedreview/south-korea-aims-high-on-ai-pumps-2-billion-into-r-d-de8e5c0c8ac5 [Accessed 17 May 2019].

Pisani, B. (2015) What Caused the Flash Crash? CFTC, DOJ Weigh In, CNBC. Available from: https://www.cnbc.com/2015/04/21/what-caused-the-flash-crash-cftc-doj-weigh-in.html [Accessed 17 May 2019].

Research and Markets (2018) Artificial Intelligence in Military Market by Offering (Software, Hardware, Services), Technology (Learning & Intelligence, Advanced Computing, AI Systems), Application, Platform, Region – Global Forecast to 2025, Research and Markets [online]. Available from: https://www.researchandmarkets.com/research/z8tfh7/18_8_billion?w=4 [Accessed 17 May 2019].

Roff, H. M. (2014) 'The Strategic Robot Problem: Lethal Autonomous Weapons in War', Journal of Military Ethics, 13 (3), pp. 211–227. https://doi.org/10.1080/15027570.2014.975010.

Sander, A. and Meldon, W. (2014) BCG Perspectives: The Rise of Robotics [online]. Available from: http://image-src.bcg.com/Images/The_Rise_of_Robotics_Aug_2014_tcm9-82495.pdf [Accessed 17 May 2019].

Scharre, P. (2014) Robotics on the Battlefield Part II: The Coming Swarm, CNAS [online]. Available from: https://www.cnas.org/publications/reports/robotics-on-the-battlefield-part-ii-the-coming-swarm [Accessed 17 May 2019].

Scharre, P. (2019) 'The Real Dangers of an AI Arms Race', Foreign Affairs, 98 (3).

Schwarz, E. (2018) The (Im)possibility of Meaningful Human Control for Lethal Autonomous Weapon Systems, Humanitarian Law & Policy [online]. Available from: https://blogs.icrc.org/law-and-policy/2018/08/29/im-possibility-meaningful-human-control-lethal-autonomous-weapon-systems/ [Accessed 7 June 2019].

Sharkey, N. (2017) 'Why Robots Should Not be Delegated with the Decision to Kill', Connection Science, 29 (2), pp. 177–186. https://doi.org/10.1080/09540091.2017.1310183.

Shoham, Y., Perrault, R., Brynjolfsson, E., Clark, J., Manyika, J., Carlos, J. et al. (2018) The AI Index 2018 Annual Report. AI Index Steering Committee, Human-Centered AI Initiative, Stanford University, Stanford, CA.

SIPRI (2019) Stockholm International Peace Research Institute (SIPRI) Military Expenditure Database [online]. Available from: https://www.sipri.org/databases/milex [Accessed 17 May 2019].

Sputnik News (2013) Russia to Focus on Robotic Weaponry in Arms Procurement, Sputnik News [online]. Available from: https://sputniknews.com/military/20131211185469570-Russia-to-Focus-on-Robotic-Weaponry-in-Arms-Procurement/ [Accessed 17 May 2019].

Statista (2019) Global Drone Spending by Country, Statista [online]. Available from: https://www-statista-com.ezproxy.neu.edu/statistics/757608/global-drone-spending/ [Accessed 21 May 2019].

Tucker, P. (2017) 'Russia to the United Nations: Don't Try to Stop Us From Building Killer Robots', Defense One [online], pp. 1–2. Available from: https://www.defenseone.com/technology/2017/11/russia-united-nations-dont-try-stop-us-building-killer-robots/142734/ [Accessed 23 May 2019].

West, S. M., Whittaker, M. and Crawford, K. (2019) Discriminating Systems: Gender, Race, and Power in AI [online]. Available from: http://cdn.aiindex.org/2018/AI Index 2018 Annual Report.pdf [Accessed 20 May 2019].
World Bank (2019) GDP Data, The World Bank Dataset [online]. Available from: https://data.worldbank.org/indicator/ny.gdp.mktp.cd [Accessed 21 May 2019].

Author Information

Justin Haner is a doctoral candidate in the Department of Political Science at Northeastern University, specializing in global governance and international security. His work focuses on the transformative power that international law and institutions can have on solving complex global and regional security problems.

Denise Garcia is a Northeastern University professor, International Panel for the Regulation of Autonomous Weapons member, and vice-chair of the International Committee for Robot Arms Control. She studies robotics and artificial intelligence, global governance of security, and the formation of new international norms and their impact on peace and security.