‘It is well that war is so terrible else we should grow too fond of it.’ General Robert E Lee
On 28 June 1859, near Solferino, a Franco-Sardinian army of 150 000 encountered an Austrian army of 160 000. A bloody and inconclusive struggle ensued, and by nightfall 30 000 men lay wounded and unattended on the battlefield. This was the episode that, through the efforts of the Swiss businessman Henri Dunant, led to the foundation of the International Committee of the Red Cross and to the Geneva Convention granting neutrality to injured combatants and their attendants. Dunant's idea was not, of course, new. In the Napoleonic wars the French surgeon Larrey was famous and loved for his evenhandedness to friend and foe. In modern times it was good to see a Royal Navy physician, after the Falklands Campaign, being honoured by the Argentines as well as by the British.
Although in some armed conflicts—especially civil wars1—respect for the medical ethic and the ‘rules of war’ is patchy at best, there is a movement to put other groups under the shelter of medical neutrality. As predicted by Churchill, the wars of democracy—fought by whole populations rather than by armies—have proved worse than the wars of kings. The public health is threatened not only directly through casualties but also indirectly through dislocation of vital services; there have been suggestions that war should be addressed on a public health agenda including primary, secondary and tertiary prevention. Those undertaking conflict preventive medicine would need to be guided by medical ethical thinking, just like their fellow practitioners in the front line. One group, Peace Through Health, seeks to build on experience with ‘days of tranquillity’, when humanitarian ceasefires have been arranged to allow interventions such as polio vaccination2. As a paediatrician I discuss here the notion that, in future armed conflicts, children at least should be accorded neutral status.
For children, warfare can be seen as a form of abuse against which they require protection3. To prevent the resultant physical and psychological damage seems an obvious good. Nonetheless, it has to be asked whether a nation that has chosen to fight can legitimately seek immunity for selected citizens, however young. Moreover, little can be done to protect against weapons of mass destruction. Even in less extreme forms of warfare there are formidable obstacles to providing immunity for non-combatants—the sheer effort and organization required; how to classify them; the risk to morale of splitting family units; the risk that such groups would be targeted by combatants; the risk that those setting up and running places of safety might themselves be abusive. Not the least is how to achieve quarantine from the free-range activity that is terrorist or guerrilla warfare. But all these obstacles are surmountable. So is the ‘good’ obvious? We must also face the possibility that measures to isolate a vulnerable part of the population would do harm by making warfare more acceptable.
When we consider the child in war, a particular group to discuss is young people bearing arms—‘child soldiers’. Leaving aside the difficult definitions of child (age, development, social judgment?) and war (declared, undeclared, guerrilla, terrorist, freedom fighter?) it is easy to applaud the decision by the British Army to limit combat to those 18 years old and above. However, note the irony: society now insists on rights and empowerment in certain areas of young life but undermines autonomy in others, as here. The distinction between self-determined and peer-pressured behaviour is illusory.
I conclude that, although there is scant chance of a general move to protect children against war, various individual actions can be taken—not least by the military (and its medical teams) on humanitarian and peacekeeping deployments. Other vulnerable groups must be considered. Abundant evidence exists for the mentally traumatizing effects of warfare on children; what is remarkable is the notion that these cease to occur when an individual becomes adult. When the question of women serving in the front line was being debated recently, a correspondent to a national newspaper noted that we should be trying to make the front line a no-go area for men, let alone women. This becomes the priority now that the front line may be a Passover Seder or the Church of the Nativity, the shock troops a regiment of adolescent suicide bombers or terrified young reservists. Approaches to protect certain vulnerable groups seem to me worthy of support, provided that these measures do not detract from the main imperative—to prevent warfare in general. Though often recommended, a public-health strategy has never been tried; there is a powerful argument for doing so, building on the links between peace and health2. Every combatant is somebody's child: warfare ‘makes the impossible possible. It liberates what should be repressed and represses what should be free’4. Nobody knows this better than the military, especially the members of its medical services who have served so bravely and honourably.
Note These ideas were developed during preparation of my presidential lecture ‘The Child and War’ delivered to the United Services Section on 4 October 2001.