Autonomous weapons are destiny, start programming them to do good

Jacob Knutson

Since the dawn of humanity there’s been warfare, and since the dawn of warfare, humanity, through advances in military technology, has gone to extensive lengths to separate itself from its horrendous creation.

With the invention of the bow and arrow, the attacker could stand several feet away from his opponent instead of confronting him face-to-face.

The invention of the rifle extended this distance to several hundred feet, and artillery, planes and rocketry lengthened it to miles.

If we denounce a weapon as immoral because it physically separates the human operator from its atrocious actions, as autonomous weapons do, then we must deem all weapons immoral, because every weapon creates a physical divide between the operator and the victim.

The fighter jet separates the pilot from the enemies he’s just bombed, the submarine separates the sailors from the ship they’ve just torpedoed, and the long-range rifle separates the sniper from the enemy in her scope.

Deeming autonomous weapons immoral misses the point. At fault is the inherently immoral activity humans just can’t get enough of: warfare. All weapons are merely tools for satisfying humanity’s incredible lust for destruction and control.

According to Chris Hedges, a writer for the New York Times, of the past 3,400 years that humans have populated the earth, only 268 have been without warfare, defined as an active conflict that claims more than 1,000 lives yearly.

Hedges explains humanity’s aggressive history as a product of biology and environment. Because war is often regarded as honorable and noble, it can be viewed as a contest between nations, a chance to compete, win and claim the spoils of others for one’s own country.

Hedges’ estimate reflects two horrible characteristics of humans: We naturally resort to violence when faced with opposition, and we have failed to erect institutions that eliminate the need for warfare.

Time and again, our institutions have failed to regulate warfare. For example, last week, global institutions failed to effectively punish North Korea, which successfully tested a new ballistic missile, and Russia, which deployed new cruise missiles.

Both actions were clear violations of international law, but, handicapped by the risk of inciting a new conflict, the rest of the world could muster only public denouncement and economic sanctions against the transgressors.

Until we create institutions that productively regulate global weapon systems and eliminate the desire for warfare, we must use the weapons available to us, or others will eventually use them against us, because once a technology is invented, it becomes impossible to contain.

Not only will banning autonomous weapons technology fail, it also won’t solve the clear problem at hand: humanity’s propensity to use technology for evil.

Because morality is irrelevant when judging weapons, we should be asking if autonomous weapons can be programmed to perform more ethically than armed humans in conflicts.

And because autonomous weapons are on the horizon and will inevitably become a cornerstone of future warfare, we should find a means of doing so before it is too late.

Jacob Knutson is a sophomore journalism and political science major from Rapid City, S.D.