Philosophy@Utah State


“Lethal autonomous robots”





Interesting article here about the development of intelligent drones that could make better moral calls on the battlefield than humans. Excerpt:

A year after seeing the Apache helicopter video in 2005, Mr. Arkin, the Georgia Tech roboticist, won a three-year grant from the U.S. Army Research Office for a project with a stated goal of producing “an artificial conscience” to guide robots in the battlefield independent of human control. The project resulted in a decision-making architecture that Mr. Arkin says could potentially lead to ethically superior robotic warriors within as few as 10 to 20 years, assuming the program is given full financial support.

“I’m not talking about replacing war fighters one for one,” he says. “I’m talking about designing very narrow, very specific machines for certain tasks that will work alongside human war fighters to carry out particular types of operations that humans don’t do particularly well at, such as building-clearing operations.”



  1. Daniel Alexander says:

    Very interesting. I wonder: if a robot could attain a human-like consciousness, would it choose not to fight or not to follow orders (assuming it can comprehend feelings, subjectivity, etc.)? Where is Isaac Asimov when you need him!?


  2. I personally commend the level of thought being given to this issue, despite the lack of general awareness of it. The consequences of any action, for or against, on the subject of robotic warfare are meaningful because they involve the loss of human lives. It is intrinsically worthy of deep consideration.

    The inevitability of robotic warfare can be confirmed by the prisoner’s dilemma. Whoever forgoes the advantages that robotic warfare bestows will suffer if an opponent seizes them, even though it would be best if all parties refused to participate.
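    The prisoner’s-dilemma structure invoked here can be sketched with a toy payoff table. The numbers below are hypothetical utilities chosen only to illustrate the logic; they do not come from the article:

```python
# Hypothetical payoffs for two states deciding whether to develop
# robotic weapons ("arm") or refrain ("refrain"). Higher is better.
# Tuple order: (row player's payoff, column player's payoff).
payoffs = {
    ("refrain", "refrain"): (3, 3),  # mutual restraint: best joint outcome
    ("refrain", "arm"):     (0, 5),  # the unarmed side suffers most
    ("arm",     "refrain"): (5, 0),
    ("arm",     "arm"):     (1, 1),  # arms race: worse than mutual restraint
}

def best_response(opponent_choice):
    """Return the choice that maximizes a state's own payoff,
    given what its opponent does."""
    return max(["refrain", "arm"],
               key=lambda c: payoffs[(c, opponent_choice)][0])

# Arming is the dominant strategy whatever the opponent does,
# even though (refrain, refrain) beats (arm, arm) for both sides.
print(best_response("refrain"))  # -> arm
print(best_response("arm"))      # -> arm
```

    This is why restraint is unstable without an enforceable agreement: each side’s individually rational choice leads both to the jointly worse outcome.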

    This is partly alluded to by the last paragraph in the article which says:

    “If it’s your job to be concerned about the security of the United States, and that’s what the president has told you to do, then you’ve got to try to understand this stuff,” Mr. Allenby says. “Because if you don’t, and then China does, or Russia does, or India does, or Brazil does, then you haven’t done your job. You’ve failed.”
    Mr. Wallach wisely says in this article that decision-making machines are not an “…appropriate form of warfare. It becomes comparable to biological weaponry, gas warfare, lasers on the battlefield, other things that have now been declared immoral, inappropriate in warfare.”

    However, cornered animals do desperate things…as do wounded animals eager to dispatch a dangerous threat. See General Grant in the Civil War, or the bombing of industrial centers in WWII. Strategically it is just as important to remove the enemy’s industrial capability as it is to kill enemy soldiers in combat.
    Rachel Maddow’s book “Drift: The Unmooring of American Military Power” argues well that a necessary impediment to unnecessary warfare is public disapprobation. Part of what instigates public disapproval is the personal prospect of friends and family suffering in a war not thought worth the cost. Part of her argument is that by maintaining a professional army, rather than drafting unwilling civilians to be soldiers, we have disconnected a majority of the American populace from the consequences of war. The lack of attention the war in Afghanistan receives to this day supports this assertion: those who have friends and family in harm’s way are profoundly interested in policy concerning Afghanistan, while everyone else resides in a state of apathy born of having no personal stake there.

    My fear is that as robots and computers take a larger role in warfare, modern society will have even fewer impediments to settling disputes by war than we do right now. Without friends and family among the casualties, how much attention would anyone pay to the wars we fight? How much easier would it be for America to participate in an unnecessary war?

    However, robotics in warfare is promising in many regards. I am a sergeant in the infantry and a newly certified Raven UAV operator. In Afghanistan I can remember instances in which, had no American lives been at stake, we could have taken more time to analyze the situation, which could have reduced the amount of collateral damage. I have seen first-hand the benefits of robots in combat, benefits a peacenik and a warfighter can appreciate, but I am wary of the potential unseen consequences of relying upon machines to conduct warfare in a manner that coincides with the American conscience.

