Uganda has followed a systematic approach to planning and implementation of the IDSR strategy since 2000. Periodic reviews of the indicators monitoring core and support functions were recommended by WHO AFRO, the US Centers for Disease Control and Prevention (CDC), ministries of health and all implementing partners as a means of assessing national surveillance systems (WHO AFRO 1999; WHO AFRO 2000; WHO 2002). We evaluated the progress of IDSR implementation in Uganda at national, district and health facility levels from 2001 to 2007.
At national level we evaluated the progress of IDSR implementation through analysis of the national core indicators, adapted from the standard WHO AFRO IDSR indicators, together with funding for implementation and a comparison of costs before and after IDSR implementation. Analysis of the core indicators focused on completeness and timeliness of weekly epidemiological data, and on morbidity and mortality data for cholera and meningococcal meningitis as tracer conditions for IDSR performance from 2001 to 2007. Completeness was computed as the proportion of districts, and of health units per district, that submitted weekly and monthly reports in a calendar year. Timeliness was computed as the proportion of districts that submitted timely reports in a calendar year; timely submission was defined as receipt of district epidemiological data by the Thursday following the end of the previous epidemiological week. Attack rates (AR) and case fatality rates (CFR) were computed from the district-level weekly and monthly reports of cases, deaths and at-risk populations for each evaluation year. Cholera and meningococcal meningitis were selected from among the epidemic-prone diseases under surveillance as tracer diseases for regular monitoring of attack and case fatality rates.
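As an illustration, the indicator computations described above reduce to simple proportions and rates. The sketch below uses hypothetical district counts and case figures, not the study's actual surveillance data.

```python
# Illustrative computation of the IDSR indicators described above.
# All counts below are hypothetical, not actual Uganda surveillance data.

def proportion(numerator, denominator):
    """Proportion expressed as a percentage, one decimal place."""
    return round(100.0 * numerator / denominator, 1)

# Completeness: proportion of districts submitting weekly reports in a year
districts_total = 56
districts_reporting = 49
completeness = proportion(districts_reporting, districts_total)   # 87.5

# Timeliness: reports received by the Thursday after the epidemiological week
districts_on_time = 42
timeliness = proportion(districts_on_time, districts_total)       # 75.0

# Attack rate (per 100 000 at risk) and case fatality rate for a tracer disease
cases, deaths, population_at_risk = 320, 16, 1_200_000
attack_rate = 100_000 * cases / population_at_risk
case_fatality_rate = proportion(deaths, cases)                    # 5.0
```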
IDSR core indicators and their targets
Key IDSR performance indicators at national level, Uganda, 2001–07
The adapted IDSR core indicators and their targets were included in the national surveillance databases in order to routinely monitor progress with essential surveillance functions, such as detection and reporting of priority diseases, analysis and interpretation of data, investigation (including laboratory confirmation of suspected outbreaks) and response to epidemic threats, and provision of feedback.
Denominators were determined by the number of units for a given indicator that existed in the country. For example, the total number of health units in a district constituted the denominator for timeliness or completeness of reporting in a given period, and the total number of outbreaks reported in the country constituted the denominator for timely outbreak investigation and response. The designation of denominators therefore constituted the unit base, in the form of structures such as health units, health workers or health events.
We also examined national-level funding for IDSR implementation from 2000/2001 to 2007/2008, and conducted a cost analysis comparing funding for surveillance before (1996–99) and during IDSR implementation, using annual budgets and financial reports. Data on funding from 1996 to 1999 are aggregated because the national-level vertical programmes then conducted surveillance functions directly at district and regional levels; decentralization of financial management had not yet taken effect. From 2000 onwards, funds for IDSR implementation were disbursed directly to national, regional and district levels. The mean costs analysed corresponded to the key resources involved in implementation of IDSR.
Comparison of costs before (1996–99) and during (2000–07) IDSR implementation
At district and health facility levels, an evaluation of the performance of IDSR core and support functions was carried out in 2004. To allow comparison and determine progress, the 2004 IDSR evaluation was based on the same parameters as the 2000 IDSR baseline assessment. In the 2000 assessment, eight of the then existing 45 districts of Uganda were sampled through multi-stage stratified sampling: four regions were purposively selected to ensure regional representation; in each region, two districts were selected by simple random sampling, giving a total of eight districts; and in each district, eight health facilities were selected, again by simple random sampling. Because the 2000 baseline assessment was based on a small sample, comparison with the 2004 evaluation is a limitation.
The evaluation focused on the IDSR structures at national, district and health facility levels, and on the core and support functions, inputs, processes, outputs and outcomes of IDSR, based on the key interventions from the 2000 IDSR Plan of Action. We used similar methods to collect data, including structured questionnaires administered to personnel at health facility, district and national levels; observation checklists; and key informant interviews. The sampled districts and health facilities had been part of IDSR implementation from its inception in 2000 to the time the evaluation was conducted in 2004.
We analysed the 2004 evaluation survey results to compare the performance of district and health-facility levels in 2004 with the 2000 baseline (CDC 2000). The 2004 evaluation of the district and health-facility levels was based on a three-stage sampling technique. In the first stage, two districts were selected by random sampling from each of the 10 geographical regions, giving a total of 20 districts for the evaluation out of the 56 in the country. In the second stage, two Health Sub-Districts (HSDs) were selected by simple random sampling from each district, giving a total of 40 HSDs from the 20 districts. HSDs are sub-operational levels with a health facility capable of handling outpatient and inpatient health care services, and an oversight role over lower-level health facilities in their catchment area. In the third stage, to assess performance at the health facility level, a total of 217 health units (both government and private, at different levels) were selected through multi-stage random sampling.
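The first two sampling stages can be sketched as follows; the region, district and HSD names are invented for illustration, and the third stage (selection of the 217 health units) would proceed analogously within each HSD.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Ten geographical regions, each with five hypothetical districts
regions = {f"region_{r}": [f"district_{r}_{d}" for d in range(1, 6)]
           for r in range(1, 11)}

# Stage 1: two districts selected at random per region -> 20 districts
districts = [d for district_list in regions.values()
             for d in random.sample(district_list, 2)]

# Stage 2: two Health Sub-Districts (HSDs) per district -> 40 HSDs
hsds = [f"{d}_hsd_{h}" for d in districts
        for h in random.sample(range(1, 5), 2)]

# Stage 3 (not shown): health units sampled within each HSD
print(len(districts), len(hsds))  # 20 40
```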
The evaluation used both quantitative and qualitative methods, with a standardized questionnaire for most aspects and key informant interviews at national and district levels. The tools were administered to health facility staff who were the in-charges of the facilities, purposively sampled to collect information on the IDSR organizational structure, flow of information, the list of diseases and conditions under surveillance, availability of guidelines and standards, feedback mechanisms, and training in IDSR.
Data management and analysis
Data entry screens with consistency and logic checks were developed using Epi Info Version 3.4, and analysis followed a pre-developed analysis plan using the same package (CDC 2008). In addition, we used qualitative data from key informant interviews to supplement the quantitative data and to explain issues related to the successes and challenges of IDSR implementation.
The cost analysis data were entered and analysed in Microsoft Excel 2003 (Microsoft Corp., Redmond, WA) to calculate means and standard deviations and to compare costs before and after institution of IDSR. To determine the cost of implementing IDSR, all the different cost items were identified.
We calculated the costs using annual budgets and actual remittances to the regional, district and health centre levels. Annual financial resources provided by the Ugandan government for IDSR implementation were obtained from the annual government-approved budget of the Epidemiological Surveillance Division of the Ministry of Health (MOH), the focal unit for IDSR implementation, and were entered into a Microsoft Excel sheet for comparison over time. Population estimates were obtained from the Uganda Bureau of Statistics. We used these data to compute the mean annual cost per capita for all IDSR activities over the two comparative periods/systems, using the population estimates and mean annual costs.
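The per-capita figure is a simple quotient of mean annual cost over population. The sketch below uses assumed values for both inputs, not the study's actual budget or population figures.

```python
# Mean annual cost per capita = mean annual IDSR cost / population estimate.
# Both inputs below are assumptions for illustration only.

mean_annual_cost_usd = 1_500_000     # assumed mean annual IDSR cost (US$)
population_estimate = 25_000_000     # assumed mid-period population

cost_per_capita = mean_annual_cost_usd / population_estimate
print(f"US${cost_per_capita:.3f} per capita per year")  # US$0.060 per capita per year
```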
We analysed the costs involved in implementing the strategy and compared them with the costs before its introduction. The mean costs analysed corresponded to the key resources involved in IDSR implementation. The costs for 2000–07 were annualized and then averaged to obtain the mean annual cost.
For comparison, the costs for 1996–99 were likewise summed and averaged to obtain mean annual costs. Some costs for this period were estimated using proxy figures from the previous year and from similar programmes at the MOH. The costs of equipment, supplies, processes and feedback on IDSR were considered at national, regional and district levels. Capital costs were depreciated at 5% annually over a 10-year useful-life horizon; the depreciated costs for vehicles and office equipment were summed over the 7-year period (2000–07), and the same was done for 1996–99.
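Under the stated rule, 5% of purchase cost per year over a 10-year useful life, the capital charge for the 7-year implementation period can be sketched as follows; the vehicle and equipment purchase costs are assumed for illustration.

```python
# Straight-line depreciation at 5% of purchase cost per year, as stated
# above; vehicle and office equipment purchase costs are hypothetical.

DEPRECIATION_RATE = 0.05   # 5% per year over a 10-year useful life
PERIOD_YEARS = 7           # 2000-07 implementation period

def annual_depreciation(purchase_cost, rate=DEPRECIATION_RATE):
    """Annual depreciation charge for one capital item."""
    return purchase_cost * rate

vehicle_cost = 30_000          # assumed purchase cost (US$)
office_equipment_cost = 8_000  # assumed purchase cost (US$)

total_capital_charge = PERIOD_YEARS * (
    annual_depreciation(vehicle_cost) + annual_depreciation(office_equipment_cost)
)
print(total_capital_charge)
```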