Strategy - International Relations - History and Culture of War - Military Hardware. Our motto: "Knowing in order to act"
Our purpose is to promote knowledge and debate on topics related to the art and science of war. Articles are chosen to reflect all viewpoints, regardless of ideological affiliation, in order to foster critical thinking among our readers.

Friday, May 10, 2013

Towards the dehumanization of war?



Technology is making humans the weakest link in warfare
Christopher Coker, 10/05/2013

It was Thucydides who called war "the human thing" - the only definition the Greek historian was willing to offer. War and humanity have evolved together from the beginning. The problem is that 21st-century technology is changing our understanding of war in deeply disturbing ways. "The human [element] is becoming the weakest link," in the words of an unclassified report from 2003 by the Defense Advanced Research Projects Agency (Darpa), the Pentagon body responsible for the development of technologies for the US military.


The problem is that humans are fallible, and the "pilots" of drones - the unmanned aerial vehicles that form a crucial part of US national security strategy - are no different from anyone else. They sit at their consoles for hours (often eight or more at a time) analysing the video stream that comes on to their screens. This is demanding work, which is why they tend to be young - some of them are as young as 19. A few find the stress too much; chaplains and psychologists are often at hand to help them with the demands of living two very different lives: the online and offline (the world of the military and the world they go home to every night).

But from the military point of view, perhaps the greatest challenge is that drone pilots can experience "cognitive overload". The term describes a situation where the amount of information that needs to be processed exceeds the mind's capacity to store or process the information received. In such situations, either we almost instantly forget the information to hand or we are unable to see whether it contradicts or confirms the information we have stored away. Neuroscientists have begun to monitor pilots' brain activity, to make them more mindful of collateral damage. By getting them to focus on different things, they are, in effect, rewiring the functioning of their attention system. The pilots are becoming semi-autonomous.

At some point, the systems themselves will become almost fully autonomous. At Wright-Patterson Air Force Base in Ohio, they are expecting to have operational by 2015 a suite of on-board sensors that will allow drones independently to detect nearby aircraft and manoeuvre to avoid them. Quite soon, drones will be able to start processing their own video streams and in the not too distant future drones will be able to dock independently with air force tankers in mid-air. At some point, in other words, today's pilots may become tomorrow's supervisors and drones will become robots.

A lot of this is lost in the legal and moral debate about the use of drones. So is another issue with implications for the future of war. Scientists often see human beings as imperfect machines at risk of breaking down as a result of faulty software (inadequate ethical training) or defective hardware (a fragile mind). The digital world we have created may be outpacing our neurons' processing capabilities, forcing us to log off emotionally. The neurons associated with empathy, compassion and emotional stability are sited primarily in areas of the prefrontal cortex. In evolutionary terms, this is a recently developed part of the brain that is bypassed when we are stressed or over-anxious. Emotions such as empathy and compassion emerge from neural processes that are inherently slow. It takes time to understand the moral dimension of a situation.

So are humans becoming the weakest link in war? Since 2007, the US military has been trying to programme the next generation of machines with a "conscience" (a set of computer algorithms in place of the moral heuristics that are hard-wired into us by natural selection). In the not too distant future, robots may be able to evaluate the consequences of their own actions. That empathy and compassion will be beyond them will hardly matter since both will be offset - or so we are assured - by what really counts: consistency of behaviour. Robots will not have prejudices. The reduction of inhumanity will balance the loss of humanity.

The Greeks drew a rigid distinction between the organic and inorganic. We do not. They are becoming fused - if you have a pacemaker, you are already a cyborg. Darpa has a vision of "blending the best traits of man and machine". If Thucydides were alive today he might well have to revise his definition of war.

The writer is a professor at the London School of Economics and the author of 'Warrior Geeks'.
