Week 5 Post

Summary

Killer robots such as autonomous drones, guns, and bombs are on the rise. These weapons substitute for human soldiers in combat, and they are controversial because they use artificial brains to decide on their own whether to attack and kill. In the article "Killer Robots Aren't Science Fiction. Calls to Ban Such Arms Are on the Rise," Adam Satariano, Nick Cumming-Bruce, and Rick Gladstone describe a U.N. meeting in Geneva that was followed intently by experts in artificial intelligence, military strategy, disarmament, and humanitarian law. Its purpose was to determine what should be done, if anything, to regulate or ban these weapons. Most of the nations that belong to the Convention on Certain Conventional Weapons said they want to slow the development of killer robots, but they were opposed by the United States and Russia, both of which are developing such weapons.

These weapons cause suffering, they do not distinguish between civilians and fighters, and the C.C.W. has no provisions covering them. Weapons of this sort make decisions with little or no human involvement. By contrast, the drones the U.S. used in Afghanistan and Iraq were operated remotely by people who chose targets and decided whether to shoot, so they are not considered robots. Killer robots are attractive to war planners because they keep soldiers out of harm's way and make decisions faster than a human would, while at the same time shifting more wartime responsibility to autonomous systems, such as pilotless drones and driverless tanks, that decide independently when to strike.

Critics consider these weapons risky because they are skeptical that a robot can differentiate between a soldier and a civilian, an adult and a child, or someone surrendering and someone hostile. In this respect, the robot's artificial brain can be seen as technology that is not necessarily for the better. If it could aim at the correct target successfully and consistently a high percentage of the time, then it could possibly be technology for the better. If nations agree to use these weapons, we will be looking at a new way of evolving our weapons systems and a new form of combat on the battlefield.

One reply on “Week 5 Post”

Krste,

I think your blog post frames the AI problem fairly well. As with so many revolutionary technologies that have come before, it is difficult for policymakers to fully anticipate all of the possible applications or to think through the moral, ethical and military issues associated with them. Clearly, however, there are substantial and important issues that require careful examination and discussion before such systems are implemented. If not, once we cross these technological thresholds, it will be extremely difficult to go back. As with the case of nuclear weapons, once a country knows how to use AI technology in military systems, it cannot and will not “unlearn” it. –Professor Wallerstein
