Book chapter
Military Emerging Disruptive Technologies: Compliance with International Law and Ethical Standards
Marco Marsili (Marsili, M.);
Book Title
Intelligent and Autonomous: Transforming Values in the Face of Technology
Year (final publication)
2023
Language
English
Country
Netherlands
More Information
Web of Science®

This publication is not indexed in Web of Science®

Scopus

This publication is not indexed in Scopus

Google Scholar

This publication is not indexed in Google Scholar

Abstract
The major powers focus on science and technology development in order to build military power with strategic impact. High-technology weapons, also available to non-state actors, are assumed to shape the nature of warfare in the twenty-first century. Semiconductors, cloud computing, robotics, and big data are among the components needed to develop the AI that will model and define the future battlespace. Artificial intelligence will be applied to nuclear, aerospace, aviation and shipbuilding technologies to provide future combat capabilities. The incorporation of AI into military systems and doctrines will shape the nature of future warfare and, implicitly, will decide the outcome of future conflicts. Before fielding a weapon system, military and political leaders should think about how it can be used and whether it should be used in a certain manner. A strong and clear regulatory framework is needed. The use of automatic processing of plans and orders (automatic control) requires policy control. Autonomous machines need some level of human control and accountability. Imagine what could happen if a system like HAL 9000 or the WarGames supercomputer could make an autonomous decision. Some fictional stories have imagined a dystopian future in which machine intelligence grows until it surpasses human intelligence and machines exert control over humans. As Freedman concludes in The Future of War, most claims by military futurists are wrong, but they remain influential nonetheless. Humans tend to give more and more responsibility to machines in collaborative systems. In the future, the automatic design and configuration of military operations will increasingly be entrusted to machines. Given human nature, if we grant machines autonomy, we cannot expect anything better from them than the behavior of their creators. So why should we expect a machine to 'do the right thing'?
In the light of what has been discussed here, it could be argued that some military applications of EDTs may jeopardize human security. The total removal of humans from the navigation, command and decision-making processes in the control of unmanned systems, and thus from participation in hostilities, makes humans obsolete and dehumanizes war. Because of the nature and technological implications of automated weapons and AI-powered intelligence-gathering tools, boots on the ground are likely to become an exception. The cyber soldier will probably be a human vestige behind the machine. The rules that will apply to the battlespace are unknown. Increased machine autonomy in the use of lethal force raises ethical and moral questions. Is an autonomous system safe from error? Who will bear responsibility and accountability for a wrong decision: politicians, law-makers, policy-makers, engineers, or the military? Guidelines are needed, and ethical and legal constraints should be considered. A lexicon and definitions of terms are essential, and the international community should find common, undisputed and unambiguous legal formulations. The distinction between conventional/unconventional, traditional/non-traditional, kinetic/non-kinetic, and lethal/non-lethal seems outdated. A knife, a broken bottle neck (if it cuts your jugular), even a fork, a hammer, a baseball bat, or a stone (according to the biblical story, David kills Goliath by hurling a stone from his sling and striking him in the center of the forehead) are all unconventional, kinetic, and potentially lethal weapons. Nevertheless, distinguishing between weapons, their effects and their consequences is necessary in order to avoid a cascade effect and undesirable outcomes. LAWS can accelerate a new arms race and lead to proliferation among illegitimate actors (non-state actors and terrorist groups), to cyber-attacks and hacking, and to a lowering of the threshold for the use of force.
The debate on the application of technology to warfare should cover international law, including IHL, as well as ethics, neuroscience, robotics and computer science; it requires a holistic approach. It is necessary to investigate whether the new domains are actually comparable to the classical ones, and whether current rules are applicable or new ones are necessary. Further considerations arising from the extension of the battlefield to the new domains of warfare concern the use of artificial intelligence in the decision-making process, which, in a fluid security environment, needs to be on target and on time in both the physical and virtual informational spaces. It is not just a legal debate but also a moral and ethical one that should be deepened. A multi-disciplinary approach would be useful for designing the employment framework for new warfare technologies.
Acknowledgements
--
Keywords
emerging and disruptive technologies, EDTs, international law, IHL, international humanitarian law, ethics, morals, law of war, armed conflict, Geneva Conventions, human rights, fundamental human rights, artificial intelligence, AI, lethal autonomous weapons, arms, machine learning, drones, autonomous systems, unmanned systems, technology, war, warfare, directed energy weapons, laser weapons, big data, quantum computing, military
  • Law - Social Sciences
  • Political Science - Social Sciences
  • Philosophy, Ethics and Religion - Humanities
  • Other Humanities - Humanities
Funding records
Funding reference Funding Entity
SFRH/BD/136170/2018 Fundação para a Ciência e a Tecnologia (FCT), Portugal
