In the Terminator movies, a relentless super-robot tracked and attempted to kill human targets. A few decades later, killer robots are openly sold and deployed on the battlefield. These killer robots — flying drones — are cheaper and probably far less discriminating than the movie models. The Chinese-made drones that Pakistani terrorists used to attack the Indian Air Force station in Jammu were destructive enough, but a newer generation of drones poses a greater threat. The Turkish-made Kargu-2 killer drone can allegedly track and kill specific targets autonomously, using facial recognition and Artificial Intelligence (AI).
A United Nations report claims that this model has been used to mount autonomous attacks on human targets. These drones hunted down retreating military convoys and attacked them indiscriminately, without requiring data connectivity between operator and munition: a true "fire, forget and find" capability.
The arrival and rapid proliferation of killer drones is no surprise. For decades, consumer technology has been outpacing military adoption of advanced technologies. Because a drone is essentially a smartphone with rotors attached, today's affordable consumer drones are a product of the rapid development of smartphone technologies. By making access to the third dimension essentially free, drones have opened up new commercial opportunities: they can now deliver groceries and medical supplies to your doorstep.
But endowing drones with human-like cognitive abilities, through AI, will make powerful targeted weapons available to rogue militaries, terrorists, and rampaging teenagers, at a fraction of the cost of the fancy drones that the United States (US) government flies. And unless we take steps to stop this, instructions to turn cheap off-the-shelf drones into automated killers will be posted on the internet.
To date, AI has struggled to provide accurate identification of objects and faces in the field. It is easily confused when an image is slightly modified by adding text. An image-recognition system that was trained to identify an apple as a fruit was tricked into identifying the apple as an iPod, simply by taping to it a piece of paper with the word "iPod" printed on it. Protesters in Hong Kong have used paint on their faces to confound the government's facial-recognition efforts. Environmental factors such as fog, rain, snow, and bright light can also dramatically reduce the accuracy of AI-based recognition systems.
This may allow defending forces to adopt relatively simple countermeasures to confound current drone recognition systems. But to actors who already place a low value on collateral damage, such accuracy matters far less than it does to human rights activists and others concerned about the loss of innocent lives.
The effectiveness of drones in zeroing in on targets enables their deployment as new weapons of mass destruction. A swarm of drones bearing explosives and dive-bombing a sports event or any densely populated urban area could kill numerous people and would be hard to stop.
Various companies are now selling drone countermeasure systems with different strategies to stop rogue flying objects, and advanced militaries have already deployed electronic countermeasures to interrupt the control systems of drones. But, so far, shooting down even one drone remains a challenge. Israel recently demonstrated an impressive flying laser that can vaporise drones, but shooting down an entire swarm of them is well beyond our capabilities. And simply blocking communication to the drones is not enough; it may be critical to be able to safely bring them to earth in order to avert random chaos and harm.
To a group intent on causing significant damage, autonomous drones open a field of possibilities. Imagine attacks on 100 different locations on a single day; the effects of the Mumbai or World Trade Center terrorist attacks would pale in comparison.
India is reportedly looking to procure Israeli anti-drone SMASH-2000 Plus systems, among the most advanced defensive weapons in the world. But even these are obsolete technologies: they can't protect the country from swarms of drones or from attacks launched within cities.
Asymmetrical warfare disproportionately benefits the forces of chaos rather than the forces of liberty. We require a global moratorium on killer robots of all kinds, including unmanned aerial vehicles. But this is not likely to happen because countries making this new wave of autonomous flying weapons are marketing their wares heavily. The US and China have both refused to back calls for a ban on the development and production of fully autonomous weapons, and so are providing a cover of tacit, putative legitimacy for weapons-makers and governments deploying the drones in the field.
To establish a defence against such possibilities, India must put its own scientists and innovators on a war footing. India has the skill and doesn't need to look abroad; even Indian teenagers can assemble drones and write sophisticated AI systems. The Defence Research and Development Organisation has some systems in development, but the government should dramatically increase funding for research and start-ups and have its military, industry, and academia work together. It should make the development of defensive technologies a national priority, just as China has done in developing destructive weapons and surveillance systems.
India was complacent in failing to anticipate the Covid-19 pandemic. We have long known about the dangers of genetic engineering and the possibility of lab accidents, yet did not halt China's reckless research or prepare bio-defences against it. Yes, the rest of the world was equally complacent. But if India fails to anticipate the development of artificially intelligent killer drones, that mitigating circumstance will offer no help and no excuse.
HT