Chapter 18 Roboethics and Telerobotic Weapons Systems

 



Abstract

The abstract states that telerobotic weapons technology is used ethically only when it is controlled intelligently to achieve morally good ends. The chapter therefore asks whether telerobotic weapons can in fact be controlled intelligently, argues that at present such control is not fully achieved, and proposes ways to improve the situation.


18.1 Introduction

The introduction explains that using a technology ethically means controlling it intelligently to achieve a moral good. Philosopher Carl Mitcham outlines three requirements for such control: 1) understanding the purpose or goal of using the technology; 2) knowing the outcomes of its use before actually using it; and 3) acting on that knowledge. Applied to telerobotic weapons, ethical use therefore requires intelligent control directed at morally good action. The chapter does not discuss when warfare itself is just; it focuses on whether telerobotic weapons can meet Mitcham's criteria for intelligent control. At present it is doubtful that these criteria are fully met, and the author proposes ways to improve the situation.


18.2 Problems with the Intelligent Control of Telerobotic Weapons Systems

18.2.1 Telepistemological Distancing

This section examines the impact of telerobotic systems on military operations, focusing on telepistemological distancing and the ethical implications it carries.

Telepistemological Distancing: This term refers to the physical and psychological distance between the operator of a telerobot (like a military drone) and the location of military activity. This distancing can affect how operators perceive and understand the situations they are navigating. The primary purpose of military robotics is to remove human soldiers from direct danger on the battlefield.

Types of Robots: The text distinguishes between autonomous robots, which operate independently without direct human input, and telerobots, which are controlled by human operators. Examples of telerobots include the NASA Mars Rovers and various military drones; they range from directly teleoperated to semi-autonomous.

Autonomy in Telerobots: While telerobots have human operators, they also contain autonomous systems over which the operator may have limited control. This blend of human and machine control can lead to ethical challenges, especially in high-stress, fast-paced environments like battlefields.

Impact on Ethical Decision Making: Operating a telerobot can alter an operator's perception and decision-making. Seeing the world through the sensors and cameras of a robot creates a telepistemological experience in which beliefs about a situation are formed remotely. This raises epistemological questions about the accuracy and reliability of the information received from the robot.

Challenges of Telerobotic Perception: Ensuring that the information a telerobot provides to its operator is epistemologically reliable is crucial for both effective and ethical operation. Operators must be able to discern accurate information even in complex and deceptive environments, such as a battlefield.

Comparison with Other Technologies: The text compares telerobots to simpler technologies like binoculars, noting that while binoculars provide a reliable view of the world based on physical optics, telerobots' digitally enhanced or altered images may not be as epistemologically reliable.

Sociological Prejudgment: The operator's social, political, and ethical biases can further complicate the interpretation of information received from a telerobot, adding another layer of epistemic noise.

Political and Ethical Consequences: The distance and perceived safety provided by telerobots can make military force more politically palatable and more likely to be used, potentially perpetuating warfare. The selective presentation of successful missions can create an illusion of clean, precise warfare, influencing public opinion and political decisions.

In summary, while telerobotic systems offer significant advantages in terms of safety and operational capabilities, they also present complex ethical and epistemological challenges that must be addressed to ensure their responsible and ethical use in military contexts.


18.2.2 The Normalization of Warfare

This section discusses how telerobotic systems, such as unmanned military drones, are contributing to the normalization of warfare in everyday life.

Distance from the Battlefield: Operators of telerobots, such as drone pilots, are often located far from the battlefield, sometimes thousands of miles away. For example, some U.S. Air Force drone operators work near Las Vegas, operating drones on military missions during shifts and then returning to their normal lives.

Normalization of Warfare: The routine and shift-work nature of operating these telerobots is leading to the normalization of warfare. Fighting a war becomes a regular job, where operators use deadly force and then return to everyday activities, like family life. This raises ethical concerns about the psychological impact on the operators and their families.

Political Motivation for Telerobotic Warfare: Politically, there is a strong push to expand the use of telerobots in military operations, since doing so promises fewer casualties among one's own forces and less lifestyle disruption for military personnel. As long as these wars target countries without similar capabilities, they might go largely unnoticed by the general public.

Increasing Autonomy and Technical Simplicity: As telerobotic systems become more autonomous and easier to operate, the need for highly trained military professionals may decrease. This could lead to the use of younger, less experienced operators or even the outsourcing of these roles to military contractors.

Ethical Implications: These trends could distort the ethical understanding of warfare. If conducting war becomes akin to a regular office job, there may be less urgency to bring conflicts to an end. Using telerobotic weapons without careful consideration of their ethical implications could thus perpetuate unethical situations rather than resolve them.

In summary, the use of telerobotic systems in military operations is leading to a shift in how warfare is perceived and conducted, with significant ethical implications both for the operators and the broader public.


18.2.3 The Perceived Antiseptic Layer of Telerobotics

This section discusses how telerobots, such as unmanned military drones, contribute to the perception of warfare as being "surgical" and impersonal, impacting the way enemies are viewed in conflict.

Myth of Surgical Warfare: Telerobots foster the myth of surgical warfare: the idea that precise, remote-controlled weapons make war clean and free of significant collateral damage.

Dehumanization of the Enemy: The telepistemological distancing created by telerobotics, where operators are physically and emotionally removed from the battlefield, can lead to the dehumanization of the enemy. Operators may see their targets merely as thermal images on a screen, while those being targeted see only the remote-controlled machines.

Moral Agency and Hatred: This type of warfare can diminish the recognition of enemies as fellow moral agents and potentially deepen hatred, contrary to many ethical theories which demand that all moral agents, including enemies, be treated with special regard. Ethical warfare, if it exists, should aim for a swift resolution of conflict, treating enemies as moral agents and striving for a return to peaceful, respectful political relations.

Difficulty in Achieving Peace: Telerobotic warfare complicates the achievement of peaceful relations post-conflict. Victims of telerobotic attacks often view these weapons as cowardly and blame them for civilian casualties, whether or not these perceptions are entirely accurate. This viewpoint can hinder the reconciliation process.

Ethical Implications: If telerobotic weapons foster intergenerational hatred, their use becomes ethically questionable. Intelligent control of technology requires considering these long-term impacts on relationships between peoples and nations.

In summary, the perceived antiseptic layer of telerobotics not only creates a false sense of precision and cleanliness in conflict but also significantly shapes the moral considerations and long-term consequences of using such technology in warfare.


18.3 Mitigation Strategies

This section presents strategies for mitigating the ethical issues associated with telerobotic weapon systems, emphasizing that their ethical use depends on intelligent control.

Enhanced Remote Sensing: The design of the weapon system's remote sensing capabilities must be improved. The system must provide information clear enough for ethical decision-making, including the ability to identify human agents as such rather than objectifying them through the sensor's mediation. If this level of identification is not possible, the machine should not be used as a weapon.

Full Moral Agent Control: A moral agent must have complete control over the weapon, beyond just an abort button. Every aspect of the decision to engage or not should be made by a moral agent. While this agent doesn't necessarily have to be human (an artificial moral agent, or AMA, could suffice), the current technology for AMAs is not yet sufficiently advanced. Until it is, human involvement in decision-making is essential.

Training in Just War Theory: Since the operator can introduce biases (epistemic noise), operators must be thoroughly trained in just war theory. Currently only military officers receive this training, which suggests that only they should control armed telerobots. If this level of training cannot be ensured for all operators, these machines should not be used.

Avoid Normalizing War: The use of telerobotic weapons should not trivialize or normalize war. Shift-work fighting should be avoided, and control centers should not be placed near civilian populations, since doing so risks making those civilian areas legitimate military targets.

Preventing Hatred and Conflict Intensification: Telerobotic weapons must be used in ways that do not prolong or intensify conflict-induced hatred. They should be deployed ethically only if they contribute to a swift return to peaceful relations.

In summary, these recommendations focus on ensuring that telerobotic weapons systems are used in a manner that upholds ethical principles, respects human dignity, and contributes to the resolution rather than the escalation of conflicts.
