Book chapter
Military emerging disruptive technologies: Compliance with international law and ethical standards
Marco Marsili (Marsili, M.);
Book Title
Intelligent and autonomous: Transforming values in the face of technology
Year (definitive publication)
2023
Language
English
Country
Netherlands
More Information
Web of Science®: This publication is not indexed in Web of Science®
Scopus: This publication is not indexed in Scopus
Google Scholar: This publication is not indexed in Google Scholar
Abstract
The major powers focus on science and technology development in order to build military power with strategic impact. High-technology weapons, now available to non-state actors as well, are assumed to shape the nature of warfare in the twenty-first century. Semiconductors, cloud computing, robotics, and big data are among the components needed to develop the AI that will model and define the future battlespace. Artificial intelligence will be applied to nuclear, aerospace, aviation and shipbuilding technologies to provide future combat capabilities. The incorporation of AI into military systems and doctrines will shape the nature of future warfare and, implicitly, decide the outcome of future conflicts. Before fielding a weapon system, military and political leaders should consider how it can be used and whether it should be used in a certain manner. A strong and clear regulatory framework is needed. The automatic processing of plans and orders (automatic control) requires policy control. Autonomous machines need some level of human control and accountability. Imagine what could happen if a system like HAL 9000 or the War Games supercomputer could make an autonomous decision. Some fictional stories have imagined a dystopian future in which machine intelligence grows until it surpasses human intelligence and machines exert control over humans. As Freedman concludes in The Future of War, most claims made by military futurists are wrong, but they remain influential nonetheless. The human tendency in collaborative systems is to give ever more responsibility to machines. In the future, the automatic design and configuration of military operations will be entrusted more and more to machines. Given human nature, if we grant machines autonomy, we cannot expect anything better from them than the behavior of their creators. So why should we expect a machine to ‘do the right thing’?
In the light of what has been discussed here, it could be argued that some military applications of EDTs may jeopardize human security. The total removal of humans from the navigation, command and decision-making processes in the control of unmanned systems, and thus from participation in hostilities, makes humans obsolete and dehumanizes war. Given the nature and technological implications of automated weapons and AI-powered intelligence-gathering tools, boots on the ground are likely to become an exception. The cyber soldier will probably be a human vestige behind the machine. The rules that will apply to the battlespace are unknown. Increased machine autonomy in the use of lethal force raises ethical and moral questions. Is an autonomous system safe from error? Who will bear responsibility and accountability for a wrong decision: politicians, lawmakers, policy-makers, engineers, or the military? Guidelines are needed, and ethical and legal constraints should be considered. A lexicon and definitions of terms are essential, and the international community should find common, undisputed and unambiguous legal formulations. The distinctions between conventional/unconventional, traditional/non-traditional, kinetic/non-kinetic, and lethal/non-lethal seem outdated. A knife, the neck of a broken bottle (if it cuts your jugular), even a fork, a hammer, a baseball bat, or a stone (according to the biblical story, David kills Goliath by hurling a stone from his sling and hitting him in the center of the forehead) are all unconventional, kinetic, and potentially lethal weapons. Nevertheless, distinguishing between weapons, their effects and their consequences is necessary in order to avoid a cascade effect and undesirable outcomes. LAWS can accelerate a new arms race, lead to proliferation among illegitimate actors (non-state actors and terrorist groups), facilitate cyber-attacks and hacking, and lower the threshold for the use of force.
The debate on the application of technology to warfare should cover international law, including IHL, as well as ethics, neuroscience, robotics and computer science; it requires a holistic approach. It is necessary to investigate whether the new domains are actually comparable to the classical ones, and whether current rules are applicable or new ones are necessary. Further considerations deriving from the extension of the battlefield to the new domains of warfare concern the use of artificial intelligence in the decision-making process, which, in a fluid security environment, needs to be on target and on time in both the physical and virtual informational spaces. The debate is not just legal but also moral and ethical, and it should be deepened. A multi-disciplinary approach would be useful for designing the employment framework for new warfare technologies.
Acknowledgements
--
Keywords
Emerging and disruptive technologies, International law, Ethics, Law of war, Armed conflict, Geneva Convention, Human rights, Artificial intelligence, Arms machine, Drones, Technology, War, Warfare, Military, Big data, Quantum computing
  • Law - Social Sciences
  • Political Science - Social Sciences
  • Philosophy, Ethics and Religion - Humanities
  • Other Humanities - Humanities
Funding Records
Funding Reference Funding Entity
SFRH/BD/136170/2018 Fundação para a Ciência e a Tecnologia
