They're Almost Talking


The state of Israel has begun selling the Harpy, the world's most intelligent drone, to China, India, South Korea and other countries. It can kill presumed enemies without awaiting orders or weighing collateral damage.

The drones used by the U.S. government to eliminate alleged Islamic rebels and innocent bystanders require personnel on the ground — generally dozens of technicians who supervise the remote-controlled drone in order to minimize collateral damage. The Harpy kills these alleged rebels and innocent bystanders without asking permission and wreaks as much havoc as possible.

The Harpy is programmed to attack any radar signal coming from anti-aircraft batteries that does not register as "friendly" in its database. The drone takes no other details into account.

What happens if the anti-aircraft battery’s radar is located on the roof of a hospital? Well, that would be part of the collateral damage. Furthermore, it is well known that the enemy puts radars in hospitals and community centers to make the drones look bad.

The Fog of War

Among the many excuses used by the military to justify errors that result in civilian deaths is “the fog of war.” A battleground is not a chessboard. Therefore, humans are prone to all sorts of mistakes.

But what happens when decisions are made by robots rather than by humans? Well, that has certain advantages. It is highly unlikely, though technically possible, that a military tribunal would try the pilots or technicians who control the latest generation of drones. And it is absolutely absurd to prosecute the drones themselves.

Drones have other advantages over humans: They are much cheaper than the systems flown by flesh-and-blood pilots; they suffer neither wounds nor mental trauma; they feel no fear, guilt or exhaustion; and until they fall into disuse, they are immortal.

How I Learned To Love the Bomb

It is estimated that there are currently more than 70 countries in possession of drones, and technicians from many of those countries are exploring ways of making them more autonomous.

The dangers of this tactic were elucidated in Stanley Kubrick's "Dr. Strangelove," in which a series of miscalculations leads to the brink of nuclear war between the U.S. and the Soviet Union.

When the U.S. president and the prime minister of the Soviet Union try to have a dialogue and overcome the impasse, they discover their utter powerlessness. Airplanes carrying atomic bombs have already taken off toward their destinations.

And since drones can carry all sorts of payloads, from conventional bombs to nuclear devices, it is possible that, at some point, that same futile dialogue will be repeated by two representatives of great world powers trying to solve the unsolvable.

Faced with that prospect, several human rights organizations and advocates of arms control will meet in the coming days in London to launch a campaign for a ban on drones and other lethal robots.

Stephen Goose, director of Human Rights Watch's Arms Division and one of the event's organizers, pointed out that these robots represent a giant leap forward in the dehumanization of war. A robot does not differentiate between friend and foe, or between someone who surrenders and someone who needs medical help.

Noel Sharkey, president of the International Committee for Robot Arms Control, recounted an incident in which U.S. soldiers in Iraq spotted a group belonging to the Iraqi resistance.

When they were about to fire, they noticed that the Iraqis were carrying a coffin and were trailed by a large group of mourners. The soldiers faced a dilemma: If they fired, the whole village where the funeral was taking place would turn against them. So they lowered their weapons and let the procession pass. Sharkey pointed out that a robot would never be able to make that sort of evaluation.

The Lesser Evil?

Are drones better or worse than humans? Ronald Arkin, director of the Mobile Robot Laboratory at the Georgia Institute of Technology, told Bill Keller, a columnist for The New York Times, that a drone could in a way be the lesser evil and could make for a more humane war. Just as drones are completely bereft of compassion, they also lack other emotions, such as hatred and the desire for revenge.

Those are the emotions that lead to atrocities. A drone has no interest in shooting a fleeing enemy in the back. And it has been known since the beginning of time that the worst mistake a soldier can make is to show his back as he flees.

That is the moment when the victor satisfies his desire for vengeance. One need only look at the killings of Patriots and Goths during the Colombian War of Independence to confirm this.

Recalling friends who fought in Vietnam, Arkin said, "My friends who served in Vietnam told me that they fired — when they were in a free-fire zone — at anything that moved." In that sense, Arkin added, "I think we can design intelligent, lethal, autonomous systems that can potentially do better than that."

When the opinions of both sympathizers and opponents of drones and other robotic systems are analyzed, one inevitably arrives at the same conclusion: The only thing preventing further violent excesses by military enemies is a balance of fear.
