Automatic weapons: value-sensitive or poor design?

War and conflict always express conflicting values: between countries, cultures, religions, et cetera. In recent decades these value conflicts have become more complex with the growing use of automatic and autonomous weapons, such as drones and computer-controlled bombs and rockets. Whom are we protecting, and are values taken seriously enough in this design?


A few weeks ago, I read in Elsevier that F-16 pilots prefer radar-guided rockets over GPS-guided ones, because radar gives them the option to divert the rocket when they recognize civilians near the target. This example shows that the safety of the innocent is (still) considered important, which is good to hear. We are not fighting against innocent, unarmed people. Other types of automatic weapons, however, do not always give us that opportunity. We effectively hand the wheel over to technology, whether or not that is intended by human actors.

In the Second World War, many air battles took place in which a lot of pilots found their grave in the ocean. Today, large-scale battles like that are unlikely to happen again, but on a smaller scale they still occur, especially in the Middle East. Very recently, investigators concluded that flight MH17 was brought down by a BUK missile, which is guided by radar to its target, where it explodes and spreads a hail of fragments, causing devastation to materiel but also to human life. Whether or not this was an intended action, it shows the results such weapons can have.

Automatic weapons are intended to make the pilot's task safer. They can be targeted precisely, so that mishits are largely excluded (although they still occur). In this way, the value of safety, both for pilots and for innocent people, is taken into account. On the other hand, these weapons are still designed to destroy and kill. This infringes on everyone's right to live. In war, other laws apply, but in conflict situations these weapons are, most of the time, used to destroy as much as possible. The enormous destructive potential of these weapons is not sufficiently taken into account in the design phase; if it were, it would arguably be impossible to design such weapons at all. And yet we design them anyway.


Automatic weapons also give people the feeling that they are never entirely safe, wherever they are. There is always the possibility that a drone is filming them, a rocket is targeted at them, or a bomb is aimed at them by radar or infrared. This creates a feeling of anguish, which may be why so many conflicts arise from the mere idea that the other party is planning something. These things are being thought about now, but clearly not enough earlier.

So we can state that automatic weapons have advantages for those who 'control' them, but they violate other values, such as the right to live. Is it ethical to act like this? Is it value-sensitive? Only partially.