Int Health. Author manuscript; available in PMC 2010 December 10.
Published in final edited form as:
PMCID: PMC3000595

Computer aided learning to link evidence to paediatric learning and practice: a pilot in a medical school in a low income setting


Bridging the gap between research evidence and practice is problematic in low income settings. We report medical students' experience with a pilot computer aided learning (CAL) program developed to enable students to explore the research evidence supporting national guidelines. We asked 50 students to enter data from pre-set clinical scenarios, classify the severity of pneumonia/asthma and suggest treatment, and then compare their diagnosis and treatment with those suggested by a computer algorithm based on the guidelines. Links to evidence supporting the guideline-suggested diagnosis and treatment were provided. Brief evidence summaries and video clips were accessed by 92% of students and full-text articles by 86%. The majority of students showed an interest in the CAL approach and suggested that its scope be expanded to other illnesses. Such a system might provide one means to help students understand the link between research and policy and, ultimately, influence practice.

Keywords: Computer-aided learning, Evidence-based medicine

1. Introduction

Computer aided learning (CAL) in undergraduate medical schools has been used for some years but is relatively new in sub-Saharan Africa. When used, emphasis is often on helping students learn new concepts or in reinforcing previously learned concepts. Randomized controlled trials on its effects in Europe and Asia produced mixed but generally positive results.1-4 In a study introducing health workers to guidelines such as the Integrated Management of Childhood Illness (IMCI)5 in one low-income setting, Uganda, such training was reported to be more cost effective than traditional approaches.6 The credibility of any training increasingly also relies on demonstrating links between the teaching and the evidence. At the same time there is growing concern over the gap between research and practice,7 especially in low income settings, and increasing effort is being channeled towards bridging this gap.8

Here we report our experience with a pilot CAL approach developed with two main aims: (i) to give students an opportunity to explore the research evidence supporting the guidelines, in a setting where students have little formal introduction to the principles of evidence-based medicine and where access to current research evidence is sometimes difficult;9 and (ii) to reinforce national assessment, management and treatment guidelines.

2. Methods

2.1. The computer assisted learning tool

The pilot tool focused exclusively on the Kenyan government guidelines for pneumonia and asthma.10 These are a local adaptation of WHO guidance on these topics11 and comprise specific algorithms directing assessment, classification and management for these two, sometimes overlapping, conditions. The CAL also took advantage of a standard paediatric admission record (PAR) developed in Kenya.12 Data collection was carried out between 24 May and 6 June 2008.

For the purposes of this exercise we provided six clinical scenarios on pre-filled PARs.13 These included one case each of pneumonia alone with clinical signs indicative of three levels of severity: outpatient pneumonia, severe pneumonia and very severe pneumonia; and one case each where both pneumonia and asthma were present at the same degree of severity. The participants, medical students in this case, were invited to choose three of the six scenarios then key in the symptoms and signs onto a laptop computer using an interface based on the PAR. After this the students were asked to review the information in the scenario and proceed to (i) classify the severity of the pneumonia (and asthma) based on the clinical signs they had entered, and (ii) prescribe the treatment they thought was appropriate for the case(s) entered based on the age/weight given in the scenario. Once these steps were completed, the students were prompted to compare their diagnosis/severity classification and their prescription with that generated by an internal computer algorithm based on the Kenyan guidelines10 and using the entered data (see an illustration in Figure 1).
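The classify-then-compare step above can be sketched in code. The sign names, respiratory-rate thresholds and severity labels below are simplified illustrations in the spirit of WHO/IMCI-style algorithms, not the actual Kenyan Basic Paediatric Protocols logic used by the pilot (which was implemented in PHP):

```python
# Illustrative sketch of a guideline-style severity classifier.
# Thresholds and sign names are simplified stand-ins, not the
# pilot's actual algorithm.

def classify_pneumonia(age_months, resp_rate, chest_indrawing, danger_signs):
    """Return a severity class derived from the entered clinical signs.

    danger_signs: True if any danger sign (e.g. central cyanosis,
    inability to drink) was recorded in the scenario.
    """
    if danger_signs:
        return "very severe pneumonia"
    if chest_indrawing:
        return "severe pneumonia"
    # Age-dependent fast-breathing threshold (illustrative values).
    fast_breathing = resp_rate >= (50 if age_months < 12 else 40)
    if fast_breathing:
        return "pneumonia (outpatient)"
    return "no pneumonia"

def compare(student_class, entered_signs):
    """Mirror the CAL step: contrast the student's own classification
    with the one the internal algorithm derives from the same data."""
    algo_class = classify_pneumonia(**entered_signs)
    return {"student": student_class,
            "algorithm": algo_class,
            "agree": student_class == algo_class}
```

Because the comparison runs on the data the student entered, a student who omits a key sign sees the algorithm's classification diverge from the guideline answer, which is itself a teaching point.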

Fig. 1
A flow chart showing the operation of the computer aided learning tool.

Once this on-screen comparison was provided, the user was offered further links to pages of the site housing materials that supported the guideline. The information available was classified into three levels. We considered the choice to access the different levels, represented by the frequency of access episodes (hits to specific pages of the site), to reflect the trade-off between inertia (nothing accessed, no hits) and the combination of a level's ease of access and the attractiveness of the information offered. In Level 1 (easy to access) we provided brief summaries of the evidence behind steps of the guidelines (for example how respiratory rate thresholds were selected) and videos displaying clinical signs that are included in the guidelines. These could be accessed through single-step inbuilt buttons and links. Information in Level 2 (less easy to access) was accessed in two steps and contained several key original (open access) references in portable document format (PDF) that could be browsed on-screen. Finally Level 3 (hardest to access) was accessed through Level 2 and opened a request page for a printed copy of the PDF, although for this pilot test no printed copies were actually provided.
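Measuring access episodes as described above requires logging each hit against the student's anonymous identifier and the level of the page hit. A minimal sketch of such a log is below; the table and field names are illustrative, not the pilot's actual MySQL schema:

```python
# Minimal sketch of the access logging implied above: each click on a
# Level 1-3 resource is stored as one "hit" against the student's
# anonymous login identifier, so uptake per level can be summarised.
# Schema and names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE access_log (
    student_id TEXT,      -- anonymous login identifier
    level      INTEGER,   -- 1 = summaries/videos, 2 = on-screen PDF, 3 = print request
    page       TEXT       -- which resource page was hit
)""")

def record_hit(student_id, level, page):
    conn.execute("INSERT INTO access_log VALUES (?, ?, ?)",
                 (student_id, level, page))

record_hit("S01", 1, "rr-thresholds-summary")
record_hit("S01", 2, "reference-article.pdf")

# Number of distinct students who accessed each level at least once.
per_level = dict(conn.execute(
    "SELECT level, COUNT(DISTINCT student_id) FROM access_log GROUP BY level"))
```

Logging raw hits rather than a single accessed/not-accessed flag is what allows both summary measures reported in the Results (proportion accessing a level, and hits per accessing participant) to be computed afterwards.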

2.2. Technical development

The tool was developed on a web based platform suitable for use on a standalone computer or, ultimately, on the internet. We used open source tools: PHP to create the interface, linked to a MySQL database. The videos were created in a web-embeddable Flash video format. The CAL ran successfully in Mozilla and Internet Explorer browsers. We opted for a custom platform as opposed to an off-the-shelf alternative as this allowed for future compatibility with a pilot electronic medical record also being locally developed and allowed us to engineer the application around the experiment.

2.3. Participants

For this pilot work we included fourth year medical students at the College of Health Sciences, University of Nairobi, who were undertaking their eleven-week paediatric rotation. After a two-week pre-pilot with a different set of students to debug the system, the CAL was formally piloted by introducing the tool to the approximately 100 students rotating in paediatrics in May/June 2008. Recruitment was carried out during the working week for two weeks by placing a laptop and a facilitator in a side room on the paediatric wards. The first 50 students available to try the tool were required to key in the pre-filled PAR records as a way of reinforcing understanding of the essential signs for the two overlapping diseases. The facilitator identified and introduced students to the program and provided them with the standard scenarios and a unique, anonymous personal identifier for login. The facilitator was present from 11:00 – 17:00 (after routine rounds for students were completed), did not prompt students, but kept a brief record of students' spoken comments. Only one laptop was available, so only one student at a time could use the system. Students were not given a fixed time limit and had to fit use of the tool around other teaching or ward activities – in this sense use was opportunistic. Use of the tool was not regarded as part of the official student rotation.

Approval for this pilot work was provided by the Department of Paediatrics of the University of Nairobi and the Kenyatta National Hospital.

3. Results

Four of 50 students keyed in insufficient clinical data from the standard scenarios and therefore the computer could not generate a diagnosis based on its internal algorithms. Of the 46 students successfully using the CAL and invited to explore three standard scenarios, 40 explored only one scenario, 6 explored two scenarios and none explored three.

The proportion of participants accessing material on the different levels of the system at least once decreased from 92% for level 1, to 86% for level 2 and to 24% for level 3. Thus students seemed interested in the evidence summaries and video clips at Level 1 and the original research reference material provided on-screen in Level 2. Based on the proportion of students accessing the levels there was least interest in obtaining printed copies of original research publications (Level 3). However, the average number of hits per participant accessing a level increased from 2.8 at Level 1 to 3.1 at Level 2 and to 6.9 at Level 3 (Table 1). This suggests that, although a smaller proportion of students were interested in requesting printed copies of original research articles, those who were interested requested almost seven research papers each.
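The two summary measures used above can be re-derived from a hits log: the proportion of all participants who accessed a level at least once, and the mean number of hits among only those participants who accessed that level. The hit counts below are synthetic, purely to show the calculation:

```python
# Re-deriving the Results' two summary measures from a hits log.
# The data here are synthetic, not the study's actual counts.
from collections import defaultdict

# (student, level) for each recorded hit
hits = [("S01", 1), ("S01", 1), ("S02", 1),
        ("S02", 3), ("S02", 3), ("S02", 3)]
n_participants = 3  # includes S03, who accessed nothing

per_student_level = defaultdict(int)
for student, level in hits:
    per_student_level[(student, level)] += 1

def summarise(level):
    """Return (proportion accessing, mean hits per accessing participant)."""
    counts = [n for (s, lvl), n in per_student_level.items() if lvl == level]
    proportion = len(counts) / n_participants
    mean_hits = sum(counts) / len(counts) if counts else 0.0
    return proportion, mean_hits
```

Note that the denominators differ: the proportion is over all participants, while the mean is conditional on access, which is why a rarely accessed level (like Level 3 here) can still show a high mean hit count.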

Table 1
The proportion of participants, number of hits, average number of hits per participant and the range of hits per participant at each level.

In this pilot test we had very little power to explore associations with an individual's choice to access any of the levels. Bearing this in mind, we were unable to demonstrate an association between the number of times an individual accessed any of the three levels (measured as the number of 'hits' per individual per level) and: date of participation (week 1 vs week 2); time of participation (morning vs afternoon); or number of scenarios started (1 vs >1).

The students' written and verbal comments indicated that they valued an easy-to-use interface in which minimal effort is required to access materials. They also found the videos very valuable for understanding specific clinical signs, and many observed that the tool would be considerably more useful and interesting if it spanned a greater range of common childhood diagnoses.

4. Discussion

The results showed that the majority of students were interested in the CAL approach and the reading materials, but this interest may be somewhat superficial, as illustrated by their choice to complete only one scenario and to browse the on-screen materials reasonably quickly. However, there did appear to be a sub-group of students, almost one quarter, who delved further into the system, accessed all levels and requested print copies of many of the available information materials. In this limited pilot study we did not set out to explore in detail why students might or might not want information. We are therefore unable to determine whether the sometimes limited exploration of the tool was truly a result of limited inherent interest or simply reflects limited opportunity, in terms of the time available to a student user, to explore the system. Such questions should be addressed by further work. Other studies have, however, reported that the mode or level of integration of CAL into the curriculum influenced uptake by students,14,15 and such integration might have prompted greater engagement with the tool in our study.

The comments from students encourage us to feel that this approach has considerable potential, if expanded in scope, to introduce students and potentially qualified health workers to new guidelines and the evidence that underlies them, perhaps improving their credibility and implementation.


The authors would like to thank the medical students, the Kenyatta National Hospital and College of Health Sciences, University of Nairobi for assistance in conducting this work.

Financial support: Funds from a Wellcome Trust Senior Fellowship awarded to Dr. Mike English (#076827) made this work possible. The funders had no role in the design, conduct, analyses or writing of this study nor in the decision to submit for publication.


Conflicts of interest: There are no conflicts of interest.

Ethical approval: This work was conducted as part of an evaluation of possible new training approaches with the Department of Paediatrics, University of Nairobi in collaboration with the Kenyatta National Hospital.


1. Bissell V, McKerlie RA, Kinane DF, McHugh S. Teaching periodontal pocket charting to dental students: a comparison of computer assisted learning and traditional tutorials. Br Dent J. 2003;195:333–6. [PubMed]
2. Chaikoolvatana A, Haddawy P. Evaluation of the effectiveness of a computer-based learning (CBL) program in diabetes management. J Med Assoc Thai. 2007;90:1430–4. [PubMed]
3. Howerton WB, Jr, Enrique PR, Ludlow JB, Tyndall DA. Interactive computer-assisted instruction vs. lecture format in dental education. J Dent Hyg. 2004;78:10. [PubMed]
4. Karnath BM, Das Carlo M, Holden MD. A comparison of faculty-led small group learning in combination with computer-based instruction versus computer-based instruction alone on identifying simulated pulmonary sounds. Teach Learn Med. 2004;16:23–7. [PubMed]
5. WHO. Integrated Management of Childhood Illness (IMCI). World Health Organization; Geneva: 2010. health/topics/prevention care/child/imci/en/index.html [accessed 21 January 2010]
6. Tavrow P, Rukyalekere A, Maganda A, Ndeezi A, Sebina-Zziwa A, Knebel E. A comparison of computer-based and standard training in the Integrated Management of Childhood Illness in Uganda Operations Research Results. U.S. Agency for International Development (USAID); Bethesda, MD: 2002. [accessed 21 January 2010]
7. Kazdin AE. Evidence-based treatment and practice: new opportunities to bridge clinical research and practice, enhance the knowledge base, and improve patient care. Am Psychol. 2008;63:146–59. [PubMed]
8. Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Computer-based teaching is as good as face to face lecture-based teaching of evidence based medicine: a randomised controlled trial. BMC Med Educ. 2007;7:23. [PMC free article] [PubMed]
9. Gituma A, Masika M, Muchangi E, Nyagah L, Otieno V, Irimu G, et al. Access, sources and value of new medical information: views of final year medical students at the University of Nairobi. Trop Med Int Health. 2009;14:118–22. [PMC free article] [PubMed]
10. Ministry of Health, Kenya. Basic Paediatric Protocols. Ministry of Health; Nairobi: 2005. [accessed 28 January 2010]
11. WHO. Pocket book of hospital care for children. World Health Organization; Geneva: 2005. [accessed 28 January 2010]
12. Mwakyusa S, Wamae A, Wasunna A, Were F, Esamai F, Ogutu B, et al. Implementation of a structured paediatric admission record for district hospitals in Kenya-results of a pilot study. BMC Int Health Hum Rights. 2006;6:9. [PMC free article] [PubMed]
13. iDOC-AFRICA Scenarios for Paediatric Admission Records. 2010. [accessed 28 January 2010]
14. Davis J, Chryssafidou E, Zamora J, Davies D, Khan K, Coomarasamy A. Effective e-learning for health professionals and students-barriers and their solutions. A systematic review of the literature-findings from the HeXL project. Health Info Libr J. 2005;2:20–32. [PubMed]
15. Hege I, Ropp V, Adler M, Radon K, Masch G. Experiences with different integration strategies of case-based e-learning. Med Teach. 2007;29:791–7. [PubMed]