By Jean Brunet
Jean Brunet, Systems of Systems Architect and Director at NAEJ Systems Arkitect, has over 15 years of experience at Capgemini and is an ethical expert in European Defence Fund projects. His expertise includes complex systems architecture, big data, AI, defence robotics, security systems, climate solutions, and ethical supply chain optimisation. He also co-founded Elter, focusing on AI systems architecture and validation.
Before the advent of robots, artificial intelligence and sophisticated weapon systems, basic weapons were already subject to ethical scrutiny. Examples include the cross-shaped bayonet, which inflicted more severe wounds than a double-edged bayonet; landmines, which endangered civilian populations by exploding long after conflicts ended; and depleted uranium shells, whose radioactive dust affects populations over wide areas for extended periods. The history of warfare teaches us that the analysis of a system's ethical behaviour must be conducted over short, medium, and long time frames.
The existence of binding ethical rules in weapon systems seems, at first glance, incompatible with their operational effectiveness. Critics argue that such principles might undermine trust in a system's functionality, that their implementation could stifle research and innovation, and that they could ultimately weaken the country that adopts them. However, this is not the case, and we shall explore why later in this paper.
Ethical decision making: A critical element in complex system architecture
It is essential to remember that, for a long time now, the decision to open fire has been subject to very precise command protocols, leaving no room for approximation. Consider, for example, the acquisition of a target and the command to fire within European military headquarters. The doctrine for the use of weapons is subject to a rigorous decision-making and command process that depends on, among other criteria, the nature and type of mission, the level of engagement, the context, and the balance of forces.
Before firing a shell at a target, for instance, an observer must have a clear view of the target, using binoculars or optronic means that link them directly and visually to the target, supplemented by acoustic means or light intensifiers. The target must be observed continuously, without interruption, for a given period. If the target is masked even once, the identification process must begin anew. No matter how complex the systems become, the ethical decision-making process, and how the architect of complex systems implements it, remain paramount.
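The continuity requirement above can be sketched as a small state machine. This is a minimal illustration only: the class, the required observation window, and the visible/masked signal are hypothetical names chosen for the sketch, not elements of any real fire-control doctrine.

```python
from dataclasses import dataclass

@dataclass
class ObservationTracker:
    """Tracks continuous visual contact with a target.

    Encodes the rule described above: if the target is masked
    even once, identification must restart from zero.
    """
    required_seconds: float          # continuous observation needed
    observed_seconds: float = 0.0    # accumulated uninterrupted contact

    def tick(self, target_visible: bool, dt: float) -> None:
        if target_visible:
            self.observed_seconds += dt
        else:
            # Any masking invalidates the identification entirely.
            self.observed_seconds = 0.0

    def identification_complete(self) -> bool:
        return self.observed_seconds >= self.required_seconds
```

The key design point is that masking does not merely pause the timer; it resets it, which is exactly the "begin anew" requirement stated above.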
Integrating Ethics by Design in Weapon System Architecture
“Ethics by Design” is ideally suited to architects, particularly those specialising in the design of complex digital systems. In computer science, architecture refers to a holistic model describing the overall structure of a computer system: the organisation of its elements (software, hardware, human, and informational) and their interrelationships. This structure follows from a set of strategic decisions taken during the design of all or part of the system, through the exercise of a technical and industrial discipline, also called architecture, overseen by the IT architect.
The architect’s responsibility is to create this holistic model and communicate it to all the stakeholders involved in the system’s design. When a system is complex, the architect leads and coordinates a team of architects responsible for the architecture of certain subsystems, such as Artificial Intelligence.
By understanding how a weapon system is built, we can appreciate the architect’s role in ensuring that ethical considerations are accounted for in the system. However, achieving “Ethics by Design” in complex systems presents several challenges.
Balancing Ethics and Performance
Is it reasonable to apply the concept of a “complex system” in every context? A complex system is composed of many interacting entities whose integration enables the achievement of a higher-order common goal. Such systems are characterised by emergent properties, which exist only at the system level and are not observable at the level of individual constituents.
We must also acknowledge with humility that complex systems are so named because their behaviour can sometimes defy control, producing unpredictable results that deviate from an initially defined ethical framework. This becomes especially true in the case of partially or fully autonomous weapon systems, equipped with artificial intelligence (AI) engines.
Integrating ethical rules into the behaviour of a complex system is a daunting task for any architect to support and implement completely, because there are often contradictions between performance, efficiency, and the ethical functioning of the system. Ethics constantly involves resolving dilemmas and making compromises, which can become complex in today's systems, and even unbearable for the architect responsible for the system. This is why we recommend involving a senior, independent architect (one who is not directly part of the project but is familiar with the system) throughout the development process, particularly during the design phase.
Practical steps towards ethical system design
How can an architect ensure that a system adopts ethical behaviour? In the case of a system incorporating Artificial Intelligence, the architect analyses how certain critical components are integrated: algorithms for data generation, augmentation, and annotation; detection of missing critical cases; test case generation engines; and simulation engines. More than ever, it is essential to adopt a method that is both holistic and sufficiently systemic to allow the identification of potential ethical loopholes.
In general, the architect responsible for ethics must develop a comprehensive approach to building complex systems. For autonomous systems, this architect will always encounter flaws in the architecture, including:
- Security: Weapons systems can be vulnerable to cyber-attacks. A security breach could allow malicious actors to take control of the system or cause it to malfunction, with potentially catastrophic ethical consequences.
- Reliability: Autonomous systems must operate reliably in complex and unpredictable environments. A flaw in their programming or design could lead to fatal errors, such as engaging the wrong target.
- Transparency: The opacity of the algorithms used in Autonomous Weapon Systems makes it difficult to verify compliance with ethical and legal standards. An overly complex or poorly documented architecture can hide critical failures.
- Human monitoring and control: Autonomous weapon systems must allow sufficient human oversight to permit intervention in the event of a malfunction or error. An architecture that does not guarantee this could be deemed ethically unacceptable.
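The four dimensions above can be captured in a simple review structure. This is an illustrative sketch only: the field names and the pass/fail model are assumptions made for the example, not an established assessment framework.

```python
from dataclasses import dataclass, fields

@dataclass
class EthicsReview:
    """One boolean per flaw category listed above.

    True means the architecture addresses the concern; any False
    flags the design for further review before acceptance.
    """
    security_hardened: bool           # resistant to hostile takeover
    reliability_validated: bool       # tested in unpredictable environments
    architecture_transparent: bool    # algorithms documented and auditable
    human_oversight_guaranteed: bool  # an operator can always intervene

    def open_concerns(self) -> list[str]:
        # Names of all dimensions still marked as unresolved.
        return [f.name for f in fields(self) if not getattr(self, f.name)]

    def acceptable(self) -> bool:
        return not self.open_concerns()
```

A structure like this makes the review explicit rather than implicit: a single unresolved dimension is enough to block acceptance, mirroring the text's position that each flaw alone can render a design ethically unacceptable.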
Contrary to some beliefs, embedding ethical principles in weapon systems can enhance their operational effectiveness when used to serve a just cause. Without ethics, we risk becoming indistinguishable from our adversaries. This might not only cause confusion among the people of Europe but also demoralise soldiers, who may feel they are betraying the values they are meant to defend.
Consider, for example, the use of lethal drones. If we allow a drone to strike its target without regard for the confidence level of automatic target recognition, and if communication with the operator is impossible, then we have failed. Furthermore, environmental considerations, such as energy consumption, are becoming increasingly important. As energy resources grow scarcer due to the intensity of combat and the distance from supply bases, learning to operate with minimal or no energy consumption will give soldiers a significant advantage in the field.
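The drone example can be stated as an explicit decision rule. This is a hedged sketch, not a real engagement doctrine: the threshold value, the function, and the decision categories are hypothetical names introduced for illustration.

```python
from enum import Enum

class Decision(Enum):
    ENGAGE = "engage"
    DEFER_TO_OPERATOR = "defer_to_operator"
    ABORT = "abort"

# Illustrative threshold only; a real doctrine would set this contextually.
MIN_RECOGNITION_CONFIDENCE = 0.95

def strike_decision(recognition_confidence: float,
                    operator_link_available: bool) -> Decision:
    """Decide whether a strike may proceed.

    Encodes the rule from the text: a strike must never proceed
    on low recognition confidence, and when confidence is
    insufficient the decision must be escalated to a human
    operator. If no operator link exists, the strike aborts.
    """
    if recognition_confidence >= MIN_RECOGNITION_CONFIDENCE:
        # Still subject to human authorisation upstream in the
        # command chain, per the fire-command protocols above.
        return Decision.ENGAGE
    if operator_link_available:
        return Decision.DEFER_TO_OPERATOR
    return Decision.ABORT
```

Making the rule explicit is the point: a design that cannot answer "what happens when confidence is low and the operator is unreachable" has exactly the failure the text describes.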
In conclusion, the ethics of warfare, when applied to complex weapon systems, requires architects to consider the consequences of these systems' use throughout the design process. Technical or programming flaws in the architecture of these systems can have serious repercussions and demand particular attention to ensure that their deployment aligns with the fundamental ethical principles upheld by the European community.
The texts published by C&V Consulting only bind their authors. They do not bind C&V Consulting or the institutions to which they belong in any way.