Of particular interest is his focus on the essential inputs for winning wars, especially given the longstanding involvement in Afghanistan and Iraq. The key elements he identifies are people, strategy and tactics, and military hardware.
There is also considerable insight into the kinds of roadblocks that institutions might throw up against innovators.
This video is not short, but it provides some great food for thought.
In a recent post I discussed a talk by Dr Peter Singer about robotics and 21st century warfare. This use of technology raises some very big moral dilemmas, especially in the areas of law, rules of engagement, and the personal effects of this kind of warfare on both combatants and civilians.
For instance, there is currently no agreement on which body of law would govern the use of robotic devices in war. Who is to blame for errors? Is it the operator, who may be sitting a continent away? What if there is a software glitch? What happens when the device cannot tell the difference between a child or an old person and an enemy combatant? What is the machine equivalent of manslaughter?
These are not trivial questions and, rather than developing complex legislation akin to the Income Tax Act, perhaps we need a Star Trek-like ‘prime directive’? This is not science fiction, and it is not the future of war – it is already here and operating today.
As Singer said, “the fog of war is not lifting, we are still seeing mistakes”. We need to figure out accountability for “un-manslaughter” – he used the example of a drone malfunction that killed nine soldiers in a training exercise. Armed autonomous systems are becoming commonplace, and some big questions remain unanswered. What about war crimes? What about errors? How can the machines distinguish between innocents and combatants? None of these questions mattered when we were just using robots to build cars, but now they do.
As Matthew rightly points out in a comment on my previous post, increasing the distance between the killer and the victim makes killing much easier. Killing with a knife or bayonet is far more personal than pressing a button to dispatch a device to destroy a school (which may or may not have schoolchildren in it at the time) that is being used to house munitions for insurgents.
This continued depersonalisation of killing makes it easier and easier to undertake offensive action. If your own soldiers can sit safely in an office at home and simply use the equivalent of a computer game to attack, then the social and political cost is very low. But what is the personal cost to the soldiers involved? Is killing any less the delivery of death if you do it at a distance? Is going home to eat dinner with the family after dealing death all day at the office even more stressful than being on the ground in a combat zone?
We’re entering some uncharted territory with this technology in terms of its impact on society and upon our warriors, not to mention upon our enemies and upon civilians. As nation states continue to act against non-state actors (such as various insurgent groups or “terrorists”) the dilemma of what happens to innocent civilians will become even more problematic as the element of human judgement and compassion on the ground is removed from the equation.
Further, we do not know if the use of this technology will simply shift enemy action away from their own territory and onto ours. This is entirely possible because, if they can no longer kill our soldiers on the battlefield, how else will they seek to make their point and cause us damage?
He opened with a scene from Iraq: a roadside IED and an EOD team sent out to find and destroy such devices. One of the ‘soldiers’ – a robot – was blown up while trying to defuse an IED. The commander wrote home to the factory the robot came from, saying “at least when a robot dies you don’t have to write home to its mother”. An indicator of how important these EOD robots are to the insurgents is the US$50,000 bounty they have placed on the heads of EOD team members. Singer also cited the conflict between Israel and Hezbollah, in which both sides flew drones against each other – even though one was a nation state and the other a non-state actor.
Another ripple effect is that robotics makes war cheaper in human terms by replacing people with machines. This is a real benefit in societies that frown upon the deaths of individuals in the service of warfare, and the social and political cost of war for governments and politicians drops considerably. The interesting point Singer makes is that this is happening without any debate in the legislature or in the media – we are seeing a creeping change without open discussion of the efficacy and morality of this technology.
Part of this new kind of warfare is that these machines record everything they see, and this is changing the public’s view of war. We are seeing the rise of YouTube wars. Some people are turning war into entertainment, or “war porn”. When video clips of war actions are put to music and shown as entertainment, it is very easy to forget that the violence is real.
The final issue that arises from this new technology is that we are potentially turning our soldiers into war gamers rather than war fighters. Even now, some of them go to the office and direct drones from the US mainland, then go home at night to normal family life. This also has an impact on the demographics of war fighting. In the past, strength and physical fitness were key. But, as Singer recounted, one of the top US drone pilots is a 19-year-old high-school dropout. The skills required are changing, so the nature of military institutions will need to change in response. Singer also mentioned that some of these operators never meet in person – they just talk online and know each other only by their handles.
There are some huge moral, social and legal issues that remain to be resolved in relation to this new military technology – things like un-manslaughter and rules of engagement. More on this later.