Mortality and morbidity patterns in modern conflicts

The nature of conflict has changed considerably over the past hundred years. No longer do young men don uniforms and fight each other far from major population centers, and today's battles do not take their toll only on soldiers. Today, 90 percent of the victims of conflict are civilians, whether specifically targeted or innocent bystanders caught up in battles through no intention of their own (see Table 11.1). It is in civilian groups that most excess morbidity and mortality occurs, and under-five mortality rates, specifically, are generally two to three times higher than those in the general population. A recent survey in the Democratic Republic of Congo (DRC), including its eastern regions, to which peace has never returned since the Rwandan genocide more than a decade ago, is an excellent demonstration of the consequences of war. In this study, mortality rates were measured in the entire population and in children under five years of age. Deaths were attributed not to the diseases that might be their proximate cause, but rather to the presence or absence of violence in the health zones in which they occurred (Coghlan et al., 2006).

Unfortunately, violence is, and always has been, part and parcel of human existence. According to one source, in early 2007 there were 41 separate violent conflicts under way, some of which had begun as long ago as the 1960s (Globalsecurity.org, 2007). The provision of humanitarian assistance to the victims of conflict has an equally long history: the Red Cross movement was founded after Henri Dunant, a Swiss merchant, witnessed the suffering of wounded soldiers at the Battle of Solferino. More recently, one commentator has compared the relationship between humanitarian agencies and war to that of coral divers and water - difficult environmental conditions are the fundamental nature of the business (Slim, 2004).

It is important to point out, however, that few deaths in the Democratic Republic of Congo study cited above were due to trauma resulting from violence. Although the direct consequences of war can be substantial, with many civilians killed - as they were in Rwanda and in the Balkans during the 1990s,

Table 11.1 Crude mortality and under-five mortality in conflict zones

                                        Crude mortality rate,     Under-five
                                        deaths per 10,000         mortality rate
                                        per day (95% CI)          (95% CI)
Health zones reporting violence         3.1 (2.6-3.4)             6.4 (5.7-7.2)
Health zones not reporting violence     1.7 (1.5-1.9)             3.1 (2.7-3.5)

Source: Coghlan et al. (2006).

in Mozambique during the 1980s, and in Iraq in the first decade of the twenty-first century - it is more common, as was the case in the Democratic Republic of Congo, for the vast majority of deaths to be from other causes. From a biomedical standpoint, most deaths in the DRC study were due to the conditions responsible for most deaths throughout the developing world: infectious diseases such as pneumonia, diarrhea, malaria, and measles, compounded by malnutrition. In this part of the world, at this time, some may also have been due to AIDS. Most striking, however, is the disparity between health zones in which violence was occurring and those in which it was not. Although it has always been intuitively understood that a population's health must suffer in times of conflict, this study provides direct, quantitative evidence.

Although it is not necessarily a prominent feature of the war in the DRC, one of the more constant features of conflict, and one that contributes importantly to increased morbidity and mortality, is the displacement of the affected civilian population. In their attempt to flee violence and perceived danger to themselves and their families, people abandon their homes, their belongings, and their land to seek refuge in nearby areas. Sometimes certain members of the community - adult males, for example - stay behind in an attempt to protect their economic interests, leaving the displaced population demographically skewed.

Forced migration results in two distinct groups of people. Under international law, a "refugee" is defined as someone who, "owing to well-founded fear of being persecuted for reasons of race, religion, nationality, membership of a particular social group, or political opinion, is outside the country of his nationality and is unable, or owing to such fear, is unwilling to avail himself of the protection of that country" (United Nations, 1950). According to the Office of the United Nations High Commissioner for Refugees (UNHCR), the organization explicitly charged with protecting and assisting refugees, there were 8.4 million refugees in the world at the start of 2006 - a decline of 12 percent from the previous year. Indeed, during the first half of the first decade of the twenty-first century the number of refugees fell by one-third, reaching its lowest level since 1980 (UNHCR, 2006).

Although this trend may at first seem encouraging, it should not be taken as such. It is true that many refugees have been able to return to their countries of origin (though not necessarily to their land holdings) owing to at least temporary improvements in the security situation. Such is the case in Afghanistan, to which almost three-quarters of a million refugees who had been living in Pakistan returned following the establishment of an elected government, and in Liberia, to which approximately 70,000 refugees returned from neighboring countries following the cessation of hostilities. On the other hand, the nature of today's conflicts is such that many people who would become refugees are prevented from doing so. For example, in the wake of the first Gulf War, in March 1991, several hundred thousand - perhaps as many as a million - Kurds in northern Iraq, justifiably fearing retribution from the Saddam Hussein regime for their support of its opponents, fled into the Kandil Mountains between Iraq and Turkey, hoping to find refuge in southern Turkey. However, because of longstanding animosity between the government of Turkey and secessionist Kurds in the east, the border between the two countries was effectively closed with barbed wire and land mines. Similarly, Haitians fleeing increasing disorder and lawlessness in their home country - a situation that has produced widespread abuses of human rights, including killings, arbitrary arrests, human trafficking, and sexual violence - have been prevented by the United States Coast Guard from seeking refuge in the United States and returned, without review of their situation, to Haiti.

As a result of such practices, the number of "internally displaced persons" (IDPs) - people who meet all of the criteria for refugee status except that they have not crossed, or been able to cross, an internationally recognized border - has increased by 22 percent, to approximately 23.7 million (UNHCR, 2006). The plight of IDPs is serious. They are afforded no protection under international law; indeed, their welfare is considered the responsibility of the government of the country of which they are citizens - in many cases the same government whose abuses they are seeking to flee. For UNHCR, both refugees and IDPs are "people of concern," but providing assistance to the former is far easier than to the latter, because of legal complexities as well as operational difficulties. Even the number of IDPs is difficult to determine, as many are "absorbed" by relatives or friends, imposing additional hardships on hosts who are not themselves displaced.

In addition to displacing millions of people, conflict takes an important toll on both agrarian and industrial economies. Although there is some dispute as to the magnitude of the impact on national economies, some estimate that a 15-year civil war would reduce gross domestic product by as much as 30 percent (Collier, 1999; Imai and Weinstein, 2000). Local economies may be even more devastated, with more immediate consequences for the health of the population - particularly through food shortages resulting in a high prevalence of under-nutrition. In rural areas, where a relatively high proportion of food is derived from subsistence farming, farmers may be physically unable to plant as much as they would in the absence of conflict. Where even small-scale commercial food production is a way of life, farmers may be prevented from bringing their produce to market. The resulting scarcities drive up food prices, which, combined with the loss of jobs and currency inflation, leave families with less food. More direct effects of conflict, such as the destruction of irrigation systems and the theft of produce by soldiers, have also been observed (Macrae and Zwi, 1994). Where conflict is exacerbated by drought, pest infestation, or other detrimental factors, famine can result.

For city dwellers caught up in conflict, the food situation can become particularly problematic. Without the ability to revert to subsistence agriculture, to find alternative food sources, or to adopt "coping" mechanisms such as consuming seed stock, those whose nutrition depends entirely on a healthy marketplace can rapidly become deprived. For example, in 1992, during the Balkan war, residents of Sarajevo, accustomed to receiving 270 tonnes of food per day, were forced to survive on a total of 216 tonnes - an amount that translates to barely three-quarters of the minimum daily caloric requirement per person (Toole et al., 1993).

Another serious threat to the health of civilians caught up in conflict is the destruction of utilities and other elements of the physical infrastructure, whether intentional, accidental, or through an inability to carry out essential maintenance. As with food, the burden falls mostly, and most frequently, on those living in urban environments, but even in rural areas the disruption of irrigation systems and local water supplies has taken an important toll. In one setting, soldiers intentionally destroyed hand pumps in rural parts of southern Sudan (Dodge, 1990). In some cities caught up in warfare, water supplies have been particularly compromised. In 1993, for example, residents of Sarajevo had only 5 liters of water per person per day - well below the minimum of 15 liters per person per day suggested by the Sphere Project (Sphere Project, 2004). Not only was the amount of water severely restricted, but attempts to access water also put people at serious physical risk. Particularly notorious was the so-called "sniper's alley," where civilians were shot at from the surrounding hillsides while fetching water from outdoor fountains after their indoor taps ran dry. Similar situations prevailed in other urban areas of Bosnia, resulting in outbreaks of hepatitis A, watery diarrhea, and bacillary dysentery.

In the same vein, disruption of the power supply can have serious consequences for health. Clinic and hospital services are severely curtailed, and surgical procedures become riskier, if they can be performed at all. Drugs, vaccines, blood, and other products that require refrigeration are likely to perish. In colder climates, the lack of electricity and/or fuel for heating not only increases caloric requirements, but also raises rates of acute respiratory ailments and of death from exposure.

Finally, violent conflict has a major effect on the availability of health services, and on the ability of a population to use those services that remain available. In many countries, of course, especially in sub-Saharan Africa, health services are severely limited even in the absence of war, but in a violent environment even the most basic primary health care is usually denied to needy populations. At times, clinics and schools, even in the most peripheral areas, are intentionally targeted by factions intent on disrupting all public functions. The actions of Renamo, in Mozambique, and of the Contras, in Nicaragua, provide good examples of these destructive strategies (Cliff and Noormahomed, 1988; Garfield and Williams, 1992). In Afghanistan today, utilization of available health services is distinctly lower in areas where personal safety has been compromised by Taliban threats against those using government services than in areas where the government has been able to re-establish health care and people can travel safely from their homes to health facilities.

The safety of health workers themselves is also an important issue in times of conflict. Physicians, nurses, and other health professionals may be better able than the rest of the population to flee dangerous areas, should they choose to do so, thereby compromising the delivery of health services. More tragically, warring groups may target health workers, and they or their family members may be kidnapped and/or killed. Even health workers engaged by humanitarian agencies to provide relief to war-beleaguered populations are at increased risk. During the 1990s and 2000s, employees of the International Committee of the Red Cross were killed in Afghanistan, Chechnya, Burundi, the DRC, and Sierra Leone, among other places. Health professionals working for the International Rescue Committee, Action Against Hunger, Médecins Sans Frontières, and CARE have been killed in Pakistan, Sri Lanka, Afghanistan, and Iraq, respectively, and many other humanitarian organizations have suffered losses of life elsewhere. This is a particularly disturbing trend, with consequences not only for the health of the populations these individuals and organizations were serving, but also for the way, and the extent to which, humanitarian relief programs will be implemented in the future.

If, as discussed above, the biomedical causes of death in complex emergencies are the same as in developing countries not experiencing conflict, and only the circumstances differ, then the distinguishing feature of emergencies is that deaths from these common diseases occur at far higher rates. In the late 1970s, epidemiologists began to measure crude and age-specific mortality rates in emergency settings and to compare them with estimated baseline rates for developing countries. For example, in the so-called "death camps" of Thailand, to which Cambodian refugees fled in 1979 to escape the genocide perpetrated by the Khmer Rouge, the crude mortality rate was 9.1 per 10,000 per day during the first week of resettlement. The incorporation of epidemiological evidence into the health planning process enabled those responsible for providing assistance to establish healthcare priorities, contributing to a fall in mortality to 0.71 per 10,000 per day by the fifth week of the relief effort (Glass et al., 1980). A similarly rapid decline in crude mortality was documented among Kurdish refugees on the Turkey/Iraq border in the wake of the first Gulf War, in 1991 (CDC, 1991).
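The crude mortality rates quoted in this chapter follow a simple formula: deaths divided by person-days of exposure, scaled to 10,000. A minimal sketch of the calculation; the camp size, death count, and observation window below are hypothetical, chosen only to illustrate a rate of the same magnitude as the Thai-border figure:

```python
def crude_mortality_rate(deaths, population, days):
    """Crude mortality rate expressed as deaths per 10,000 population per day."""
    return deaths / (population * days) * 10_000

# Hypothetical illustration: 191 deaths in a camp of 30,000 over one week.
rate = crude_mortality_rate(191, 30_000, 7)
print(round(rate, 1))  # 9.1
```

The same function serves for age-specific rates if the deaths and population are restricted to the age group of interest (e.g., children under five).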

In Africa, crude mortality rates were measured in Somalia in the early 1980s, after the Ogaden War, combined with drought, drove more than 500,000 ethnic Somalis from Ethiopia into some 30 refugee camps, and in camps in eastern Sudan, to which Ethiopian nationals, caught up in a complex web of circumstances that included drought, inequitable land-reform practices, and violent conflict, were forced to flee. In each case, high mortality rates were recorded in the early stages of the emergency, and in each case these rates declined relatively rapidly toward baseline levels during the first 6-12 months of the international relief effort (CDC, 1992). Occasionally, less favorable outcomes have been documented. In the Hartisheik refugee camp in Ethiopia, in 1988-89, crude mortality rates actually rose during the first nine months that humanitarian assistance was being delivered (Toole and Bhatia, 1992).

The application of epidemiological methods and measurements has not been free of problems, however. While two-stage cluster sample surveys have become the method of choice for determining crude and age-specific (usually under-five) mortality rates in a population, they can be difficult to conduct for a variety of technical and operational reasons. All of the results presented above, for example, come from surveys conducted in camp settings, where the affected population can usually be reasonably well enumerated and located - the selection of clusters is relatively simple, and access to the selected sample is practical and safe. Increasingly, however, the population in need of assistance is scattered, difficult (if not impossible) to count, and sometimes very difficult to reach. In these conditions, especially if violence is raging all around, the "epidemiological space" is contracted, and obtaining useful information to guide a relief effort is seriously compromised. Such has been the case, in recent years, in the eastern DRC, in Darfur (Sudan), and in Iraq.
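The two-stage design referred to above can be sketched in a few lines: clusters are first drawn with probability proportional to estimated population size, then households are drawn at random within each chosen cluster. This is a simplified illustration, not the protocol of any survey cited here; the village names, sizes, and parameters are invented for the example:

```python
import random

def two_stage_cluster_sample(village_sizes, n_clusters=30, hh_per_cluster=30, seed=0):
    """Stage 1: select clusters with probability proportional to size (PPS).
    Stage 2: select households at random within each chosen cluster.

    village_sizes maps a village name to its estimated number of households.
    PPS sampling is done with replacement, so a large village may contribute
    more than one cluster, as in the classic 30 x 30 design.
    """
    rng = random.Random(seed)
    names = list(village_sizes)
    weights = [village_sizes[v] for v in names]
    chosen = rng.choices(names, weights=weights, k=n_clusters)
    sample = []
    for village in chosen:
        k = min(hh_per_cluster, village_sizes[village])
        households = rng.sample(range(village_sizes[village]), k)
        sample.append((village, households))
    return sample

# Illustrative use with made-up village sizes (number of households)
frame = {"A": 1200, "B": 300, "C": 2500, "D": 800}
clusters = two_stage_cluster_sample(frame, n_clusters=5, hh_per_cluster=10)
```

The sketch also makes the operational difficulty plain: both stages presuppose a sampling frame - an enumeration of communities and their sizes - which is exactly what a scattered, fast-moving, or inaccessible population fails to provide.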

In addition to the difficulties encountered in ascertaining and following trends in mortality, which include problems in determining both the number of deaths and the size of the target population, many humanitarian actors lack the proper training and experience to conduct and interpret epidemiological surveys. A review of more than 20 mortality surveys conducted in Somalia in the early 1990s by a wide variety of non-governmental organizations revealed gross inconsistencies in all aspects of sample design, conduct, analysis, and use of the data (Boss et al., 1994). More recently, an evaluation of two-stage cluster sample surveys used to determine the nutritional status of children in Ethiopia also found serious methodological and analytical problems with the vast majority of surveys conducted by the humanitarian community responding to a perceived food shortage problem there (Spiegel et al., 2004).

The problems of understanding and interpreting data from mortality surveys conducted in emergencies in general, and in conflict settings in particular, are perhaps best illustrated by the controversy surrounding estimates of civilian mortality in Iraq since the start of the war in 2003. Skepticism regarding the results of the surveys, which demonstrated levels of mortality far higher than those reported from other, non-epidemiological sources, focused on the two-stage cluster sampling method itself - a method well suited to determining mortality that has repeatedly been shown, even in conflict settings, to give accurate and reliable results (Roberts et al., 2004; Burnham and Roberts, 2006; Burnham et al., 2006). Despite the controversy, the determination of crude and age-specific mortality rates using population-based studies has appropriately become an essential first step in determining the magnitude of an emergency, in designing appropriate humanitarian interventions, and in evaluating the degree to which they are being successfully implemented. Guidelines for interpreting the results of mortality surveys have recently been published (Checchi and Roberts, 2005).

The most common specific diseases causing excess mortality are closely linked to the circumstances - the impact of a violent environment can, in most cases, be shown to contribute to higher rates of morbidity and of disease-specific mortality. The connection between conflict and food scarcity has been discussed above. The natural consequence of food shortages in a population is malnutrition, and indeed some of the highest rates of malnutrition documented by population-based anthropometric surveys have occurred in populations affected by war. In Somalia in 1992, following the overthrow of the dictator Siad Barre, inter-clan violence disrupted the food supply throughout the country. At the start of the relief effort, the prevalence of malnutrition among internally displaced children under five years of age living in camps was as high as 75 percent. In south Sudan, when food shortages were compounded by a violent environment in 1994, malnutrition was measured at nearly 40 percent in the under-five population. In both the DRC in 1996 and Angola in 1995, malnutrition prevalence hovered around 30 percent (Toole et al., 2001).
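The anthropometric surveys behind these prevalence figures typically classify a child as acutely malnourished when the weight-for-height z-score falls below -2 against a reference population. A minimal sketch of the prevalence calculation, using made-up z-scores (field definitions also count edema cases, omitted here for brevity):

```python
def gam_prevalence(whz_scores):
    """Proportion of children with weight-for-height z-score below -2
    (global acute malnutrition, edema cases excluded for simplicity)."""
    if not whz_scores:
        raise ValueError("no observations")
    return sum(1 for z in whz_scores if z < -2) / len(whz_scores)

# Made-up sample of ten z-scores from a hypothetical survey
scores = [-3.1, -0.4, -2.5, 0.2, -1.8, -2.1, -0.9, -3.4, 0.0, -1.2]
print(gam_prevalence(scores))  # 0.4
```

In a real cluster survey the proportion would be weighted by the sampling design and reported with a confidence interval, but the underlying indicator is this simple threshold count.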

Interestingly, although the prevalence of malnutrition is certainly made worse by the presence of conflict, there are instances in which it has actually increased during the relief effort. In Iraq in 1991, for example, malnutrition in children aged 12-23 months rose from 5 percent to 13 percent over the several months following an epidemic of cholera in the general population - and while the population was receiving considerable external humanitarian assistance. In the Hartisheik refugee camp in Ethiopia, cited above, malnutrition prevalence increased from less than 10 percent to almost 25 percent in 1988, with a concomitant rise in mortality, for the simple reason that the relief effort provided the refugees with less than an adequate daily food ration.

Micronutrient deficiencies have also been documented frequently during complex emergencies. Scurvy, a disease that has become quite rare under normal circumstances, was documented on several occasions in refugee camps in Somalia and Sudan during the 1980s; in every case the refugees were receiving external assistance, but the relief foods did not contain an adequate source of ascorbic acid (Desenclos et al., 1989). An outbreak of almost 20,000 cases of pellagra occurred in 1990 among refugees in camps in Malawi who had fled Renamo-instigated violence in Mozambique. Again, the outbreak was traced to a lack of niacin in the ration provided by the international relief community (Malfait et al., 1993). Pellagra also occurred among internally displaced persons in war-torn Angola in 2000 (Baquet et al., 2000).

There is a clearly established relationship between nutritional status and mortality from communicable diseases, especially in children, even though severe malnutrition may not be the immediate cause of death (Pelletier et al., 1995). High case-fatality rates from measles, traditionally one of the most important causes of death in refugees and internally displaced persons, are probably due to a combination of malnutrition and vitamin A deficiency. Although it has now become an almost reflexive early element of humanitarian response, measles vaccination, which has been available as a safe and effective public health tool since the 1960s, was not routinely implemented until the 1980s. Epidemiologists had identified measles as the leading cause of death in Somalia in the late 1970s, and again in Wad Kowli and other refugee camps in Sudan in 1985. In the latter instance, measles was responsible for more than 50 percent of all mortality in both children and adults. By the end of the 1980s, due to the widespread adoption of a policy making mass vaccination one of the earliest interventions of relief efforts, measles had almost disappeared as a cause of death in complex emergencies to which vaccine could be supplied in adequate quantities. Exceptions occurred when refugees and/or IDPs were "absorbed" into local communities, beyond the reach of the international relief agencies, and were dependent for vaccination on weak national immunization programs.

As mentioned above, diarrhea has always been among the most common causes of mortality, and its incidence and case-fatality rates are also exacerbated by conflict, forced migration, and malnutrition. Diarrhea has been responsible for between 25 and 85 percent of mortality in complex emergencies, and is always among the leading causes of clinic visits. Diarrhea in emergencies commonly occurs in three forms: acute watery diarrhea, usually of viral origin, and the two epidemic forms - cholera, caused by Vibrio cholerae, and dysentery, due to Shigella dysenteriae type 1. The cholera outbreak in Goma in July 1994, described in the opening section of this chapter, was atypical in its virulence, but cholera is a common occurrence where public health infrastructure has broken down, where the population depends on primitive and insufficient water supplies prone to contamination, and where sanitation is rudimentary.

Acute respiratory infections, most commonly pneumonia, are also frequently recorded among the most common causes of mortality in emergencies, although the documentation is not consistent. No studies of pneumonia have been conducted in emergency settings, and knowledge of its epidemiology in these circumstances is extremely limited. Pneumonia is, however, the leading cause of death in non-emergency settings (Black et al., 2003), and it is intuitive that overcrowding and inadequate shelter and clothing are likely to increase both the incidence of and mortality from respiratory infections (Connolly et al., 2004).

Malaria, especially in Africa, is another leading cause of mortality. Forced migration has been a major risk factor: the movement of people with low levels of immunity from areas of low endemicity into hyper-endemic areas has occurred on numerous occasions. Increased exposure to the elements, from inadequate shelter and clothing, increases the number of potentially infective mosquito bites, and overcrowding increases the number of bites per person. All of these risk factors are compounded by growing resistance to chloroquine, which renders the effective treatment of malaria far more expensive than it once was. The provision of artemisinin-based combination therapy, the currently recommended treatment for Plasmodium falciparum in most parts of the world, puts a serious strain on traditional levels of funding for humanitarian assistance in the health sector.

The control of outbreaks of meningococcal meningitis, as well as that of other common infectious diseases such as tuberculosis and HIV/AIDS, and other locally endemic conditions, is rendered more difficult in areas where conflict occurs. As discussed above, all health services in a conflict zone are likely to be seriously disrupted and, even if they continue to be offered, the ability of the population to use them is greatly restricted.
