“Lethal autonomous robots”

Interesting article here about the development of intelligent drones that would make better moral calls on the battlefield than humans. Excerpt:

A year after seeing the Apache helicopter video in 2005, Mr. Arkin, the Georgia Tech roboticist, won a three-year grant from the U.S. Army Research Office for a project with a stated goal of producing “an artificial conscience” to guide robots in the battlefield independent of human control. The project resulted in a decision-making architecture that Mr. Arkin says could potentially lead to ethically superior robotic warriors within as few as 10 to 20 years, assuming the program is given full financial support.

“I’m not talking about replacing war fighters one for one,” he says. “I’m talking about designing very narrow, very specific machines for certain tasks that will work alongside human war fighters to carry out particular types of operations that humans don’t do particularly well at, such as building-clearing operations.”


2 thoughts on ““Lethal autonomous robots””

  1. Daniel Alexander

    Very interesting. I wonder: if a robot could attain a human-like consciousness, would it choose not to fight, or refuse to follow orders (assuming it can comprehend feelings, subjectivity, etc.)? Where is Isaac Asimov when you need him!?

  2. Spencer Harris

    I personally commend the level of thought being given to this issue, despite the lack of general awareness of it. The consequences of any action, for or against, on the subject of robotic warfare are meaningful because they involve the loss of human lives. It is intrinsically worthy of deep consideration.

    The inevitability of robotic warfare can be confirmed by the prisoner’s dilemma. Whichever side declines to exploit the advantages that robotic warfare bestows will suffer if its opponent does exploit them, even though all parties would be best off if everyone refused to participate.
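    The arms-race logic above can be sketched as a payoff table. The numbers below are purely illustrative assumptions (they appear nowhere in the article); the point is only the structure: “build” is the best response to either opponent choice, so both sides build even though mutual restraint would leave everyone better off.

    ```python
    # Prisoner's-dilemma sketch of the autonomous-weapons arms race.
    # Payoff values are illustrative assumptions, not from the article.
    # Each side chooses to "build" autonomous weapons or to "refrain".

    PAYOFFS = {
        # (side_a, side_b): (payoff_a, payoff_b)
        ("refrain", "refrain"): (3, 3),  # mutual restraint: best collective outcome
        ("build",   "refrain"): (5, 0),  # unilateral advantage for the builder
        ("refrain", "build"):   (0, 5),  # unilateral disadvantage for the refrainer
        ("build",   "build"):   (1, 1),  # arms race: worse for both than restraint
    }

    def best_response(opponent_choice: str) -> str:
        """Return the choice that maximizes side A's payoff, given B's choice."""
        return max(("build", "refrain"),
                   key=lambda mine: PAYOFFS[(mine, opponent_choice)][0])

    # "Build" dominates: it wins against either opponent choice,
    # so rational sides end up at (build, build) despite its lower payoff.
    print(best_response("refrain"))  # build
    print(best_response("build"))    # build
    ```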

    This is partly alluded to by the last paragraph in the article which says:

    “If it’s your job to be concerned about the security of the United States, and that’s what the president has told you to do, then you’ve got to try to understand this stuff,” Mr. Allenby says. “Because if you don’t, and then China does, or Russia does, or India does, or Brazil does, then you haven’t done your job. You’ve failed.”
    Mr. Wallach wisely says in this article that decision-making machines are not an “…appropriate form of warfare. It becomes comparable to biological weaponry, gas warfare, lasers on the battlefield, other things that have now been declared immoral, inappropriate in warfare.”

    However, cornered animals do desperate things…as do wounded animals eager to dispatch a dangerous threat. See General Grant in the Civil War. Or the bombing of industrial centers in WWII. Strategically it is just as important to remove the enemy’s industrial capability as it is to kill enemy soldiers in combat.

    Rachel Maddow’s book “Drift: The Unmooring of American Military Power” does a good job of arguing that a necessary impediment to unnecessary warfare is public disapprobation. Part of what instigates public disapproval is the personal prospect of friends and family suffering in a war the public does not think is worth the cost. Part of her argument is that by maintaining a professional army, by not drafting unwilling civilians to be soldiers, we have disconnected a majority of the American populace from the consequences of war.

    This assertion is supported by the lack of attention the war in Afghanistan receives to this day. Those who have friends and family in harm’s way are profoundly interested in policy concerning Afghanistan…while everyone else resides in a state of apathy born of having no personal stake there.

    My fear is that as robots and computers take a larger and larger role in warfare, modern society will have even fewer impediments to using war to solve disputes than we do right now. Without friend and family casualties, how much attention would anyone pay to the wars we fight? How much easier would it be for America to participate in an unnecessary war?

    However, robotics in warfare is promising in many regards. I am a sergeant in the infantry and a newly certified Raven UAV operator. In Afghanistan I can remember instances in which, had no American lives been at stake, we could have taken more time to analyze the situation and reduced the amount of collateral damage. I have seen first-hand the benefits of robots in combat, benefits a peacenik and a warfighter can both appreciate, but I am wary of the unseen consequences of relying upon machines to conduct warfare in a manner that coincides with the American conscience.

