J Gen Intern Med. 2007 March; 22(3): 327–331.
Published online 2007 January 10. doi: 10.1007/s11606-006-0055-6
PMCID: PMC1824748

The Impact of an Evidence-Based Medicine Educational Intervention on Primary Care Physicians: A Qualitative Study

Kerem Shuval, MPH (corresponding author),1 Aviv Shachak, PhD,2 Shai Linn, MD, DrPH,1 Mayer Brezis, MD, MPH,4 Paula Feder-Bubis, PhD,3 and Shmuel Reis, MD, MHPE2,5

Abstract

Background

Attitudes toward and barriers to implementing evidence-based medicine (EBM) have been examined extensively, but scant evidence exists regarding the impact of EBM teaching on primary care physicians’ point-of-care behavior.

Objective

To gain insight into the behavioral and attitudinal changes of facilitators and participants during a multifaceted EBM educational intervention.

Design, setting, and participants

A qualitative study of primary care physicians and facilitators from a large HMO; the physicians were selected from the intervention arm of a parallel controlled trial using purposeful sampling. We conducted focus groups with 13 facilitators and 17 physicians and semi-structured interviews with 10 facilitators and 11 physicians.

Results

Both facilitators and participants believed EBM enhanced the quality of their practice. The intervention affected attitudes and knowledge but had little impact on physicians’ ability to utilize pre-appraised resources at the point of care. Using EBM resources during consultation was perceived as a complex task and impractical in a busy setting. In contrast, a positive impact on the use of medication databases was noted: these were perceived as easy to use during consultation, a task in which the benefits outweighed the barriers. The intervention also prompted physicians to write down clinical questions more frequently and to search for answers at home.

Conclusions

This study underlines the need not only to enhance EBM skills, but also to improve the ease of use of EBM resources at the point of care. Tasks should be simplified by tailoring evidence-based information retrieval systems to the busy clinical schedule. Participants’ recommendations to establish an HMO decision support service should be considered.

Key words: evidence-based medicine, primary care, medical education

Introduction

Evidence-based medicine (EBM) has become an integral part of undergraduate and postgraduate medical education.1, 2 Coomarasamy and Khan (2004) suggest that although stand-alone and clinically integrated EBM teaching both improve knowledge, clinically integrated teaching is more likely to facilitate sustained changes in clinical behavior.3 Moreover, attitudes and barriers to implementing EBM have been examined both quantitatively and qualitatively.3–8 Physicians generally had positive attitudes towards EBM: they considered evidence helpful in decision making and agreed it improved patient outcomes, but also felt that EBM clashed with “the art of medicine,” thereby reducing clinical autonomy.5 Physicians believed intuition plays a vital role in primary care and that evidence should be considered alongside patient preferences and clinical judgment.8 In addition, family physicians questioned the applicability of research findings to general practice. Local specialists, rather than the medical literature, were important sources and interpreters of evidence and were trusted because of previous success with joint patient care.6 The main barriers to integrating EBM into day-to-day clinical practice were lack of time because of heavy workloads, lack of familiarity with evidence-based resources, difficulty in retrieving information, and limited access to the web in clinics.4, 7

Though ample research has been conducted on physicians’ perceptions of EBM and barriers to its implementation, scant evidence exists regarding the impact of EBM teaching on primary care physicians’ (PCPs) clinical behavior.3, 9–12 Most studies assessing the impact of teaching EBM have focused on medical students and residents, rather than on practicing physicians.3, 9, 13 To gain insight into knowledge acquisition and attitudinal and behavioral changes of PCPs during a multifaceted EBM intervention, we conducted a qualitative study (March 2004 to July 2005) nested in a controlled trial (CT). The present study examines both participants’ and course facilitators’ reactions to a year-long, 21-hour EBM intervention.

Methods

Design

We designed a qualitative study nested in a CT to evaluate the impact of an EBM intervention on the knowledge, attitudes, and clinical behavior of course participants and facilitators, as well as their perceptions of EBM and the barriers and enablers to its implementation. The primary objective of the CT was to examine whether an EBM educational intervention could enhance appropriate drug prescribing and test ordering by intervention physicians (n = 70) in comparison to control physicians (n = 74). The intervention consisted of three 5-hour workshops and 6 academic detailing sessions held in participants’ primary care clinics. It focused on teaching EBM via the “user’s mode,” emphasizing formulating clinical questions, information retrieval, and integrating the best evidence in practice.14 Physicians were encouraged to utilize pre-appraised resources during consultation, such as the Cochrane Library, Bandolier, National Guideline Clearinghouse, TRIP, ATTRACT, DARE, EBM On-Call, CATs, and Drug Search databases. Inclusion criteria for the CT were practicing in a primary care clinic serving more than 4,000 patients and providing consent.

To keep the participant-to-facilitator ratio low, approximately 20 accredited family physicians with various levels of EBM expertise and teaching experience were approached to facilitate the intervention. No monetary compensation was offered, only CME accreditation. Thirteen physicians accepted the challenge and participated in a 12-hour pre-intervention EBM course.

Participants

All told, 28 PCPs and 13 course facilitators from the largest Health Maintenance Organization (HMO) in Israel (3.7 million members) participated in the study. The 28 PCPs were selected from the intervention arm of the CT using purposeful sampling15 to capture the physicians’ diverse characteristics. Participants were selected from 2 districts to explore possible regional differences. Other characteristics taken into account in the sampling process were: gender, age, specialization, clinical experience, academic activity, and clinic setting. Facilitators were generally younger than participants (mean ages: 42.0 and 47.0, respectively) and more likely to be academically active (53.8% and 17.9%, respectively) or accredited family physicians (100.0% and 53.6%, respectively).

Focus Groups and Semi-structured Interviews

A total of 6 focus groups, each consisting of 5–7 participants and lasting 60 minutes, were conducted. To encourage openness, the 3 focus groups for trainees and the 3 for facilitators were held separately during the intervention. A total of 17 trainees participated in 1 of the 3 trainee focus groups. All 13 facilitators participated in at least 1 focus group (more than half participated in 2) held during the intervention as part of a formative evaluation process. This enabled researchers and facilitators to discuss the successes and difficulties encountered during the intervention.

While conducting focus groups, we felt group dynamics hindered some themes from emerging.16 Therefore, we decided to employ an additional data collection method, semi-structured interviews.17 Conducting interviews in the comfort of the physicians’ own clinics promoted openness and enabled researchers to gain an in-depth understanding of physicians’ perceptions of EBM and the intervention. Using both semi-structured interviews and focus groups allowed for methodological triangulation, which helps establish validity in qualitative research.18 Participants identified by their peers as key informants, along with facilitators who emerged as key informants during the focus groups, were approached and asked to participate in 90-minute interviews. Interviews with participants (n = 11) and facilitators (n = 10) were held until data saturation was reached.19, 20

All focus groups and interviews were moderated by 1 of the researchers (KS). The moderator explained the aim of the research and used a predetermined set of questions (Appendix).

Analysis

To bolster the trustworthiness of the analysis, 2 researchers (KS, AS) analyzed the data independently. Open discussions between the researchers were held to develop or recategorize themes; initial agreement was high (85% on themes), and discussions continued until 100% agreement was reached. The framework approach21 was used to analyze the data. First, the researchers read the transcriptions several times to familiarize and immerse themselves in the data. Second, a thematic framework was developed based on both predefined issues and themes emerging from the familiarization stage. Next, codes were assigned to the data, and thematic charts21 were created. After these stages, all researchers took part in the final analysis stage of mapping and interpreting the data in relation to the original objectives and emerging themes.20 In addition, we compared our findings with preliminary quantitative results from the CT.
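For readers unfamiliar with inter-coder agreement, the sketch below shows one simple way an initial percent agreement on theme assignments could be computed. It is a minimal illustration only; the coder labels, excerpt identifiers, and themes are hypothetical and are not drawn from the study’s data or its actual analysis procedure.

  # Minimal sketch: percent agreement between two independent coders assigning
  # themes to interview excerpts. All names and values are hypothetical.
  from typing import Dict

  def percent_agreement(coder_a: Dict[str, str], coder_b: Dict[str, str]) -> float:
      """Return the share of jointly coded excerpts assigned the same theme."""
      shared = coder_a.keys() & coder_b.keys()
      if not shared:
          return 0.0
      agreements = sum(coder_a[k] == coder_b[k] for k in shared)
      return agreements / len(shared)

  coder_1 = {"excerpt_01": "barriers", "excerpt_02": "attitudes", "excerpt_03": "behavior"}
  coder_2 = {"excerpt_01": "barriers", "excerpt_02": "attitudes", "excerpt_03": "knowledge"}
  print(f"Initial agreement: {percent_agreement(coder_1, coder_2):.0%}")  # prints 67%

In practice, disagreements such as the one on excerpt_03 above would be resolved through the open discussions described in the text until full agreement was reached.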

Results

The following results reflect the main themes that emerged during data analysis. To present the full spectrum of perceptions within the main themes, both majority (i.e., >70% of physicians) and minority (i.e., <30%) perspectives are reported. Many themes are shared by facilitators and trainees and are therefore presented together. However, perceptions of the impact of the intervention diverged between facilitators and participants and are consequently presented separately.

Perceptions of EBM

Most facilitators and participants believed EBM enhanced the quality of their practice (Facilitator number 6 = F#6): “EBM helps me give my patients the best available treatment.... I can’t just voice my opinion anymore; I have to state the level of evidence and justify my decisions.” In addition, many found the EBM approach helped them deal with questions regarding the efficacy of complementary medicine and empowered them when dealing with pharmaceutical company representatives (F#6): “I approach alternative medicine in the same way that I approach conventional medicine.... I can show them, in evidence based medicine, that acupuncture, for example, has evidence recommending it for arthritis but not for other problems”; (Participant number 19 = P#19): “I ask the drug reps to show me Absolute Risk Reduction or Numbers Needed to Treat, rather than Relative Risk Reduction. It (EBM) enables me to look at the data critically and discuss research findings more intelligently.”
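For readers less familiar with the measures named in the last quote, the short sketch below works through absolute risk reduction (ARR), relative risk reduction (RRR), and number needed to treat (NNT). The event rates are invented for illustration and do not come from the study or any trial it cites.

  # Hypothetical example of the effect measures mentioned above.
  # The event rates are illustrative only, not data from the study.
  control_event_rate = 0.10    # 10% of untreated patients experience the outcome
  treated_event_rate = 0.08    # 8% of treated patients experience the outcome

  arr = control_event_rate - treated_event_rate   # absolute risk reduction
  rrr = arr / control_event_rate                   # relative risk reduction
  nnt = 1 / arr                                    # number needed to treat

  print(f"ARR = {arr:.1%}, RRR = {rrr:.0%}, NNT = {round(nnt)}")
  # ARR = 2.0%, RRR = 20%, NNT = 50: a "20% relative reduction" here means
  # treating 50 patients to prevent one event, which is the distinction the
  # participant draws when questioning pharmaceutical representatives.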

Although the majority was positive towards EBM, others believed it to be (P#26) “just a fad likely to disappear,” (P#20) “just not practical in a busy urban clinic,” or (F#10) “I think it (EBM) should be coined ‘population based medicine’... results from randomized controlled trials aren’t necessarily relevant to individual patients.”

Perceived Barriers and Enablers to Implementing EBM at the Point of Care

The barriers to implementing EBM at the point of care perceived by facilitators and participants were time constraints, work overload, a busy urban setting, and patients demanding unnecessary treatment (F#4): “I have 60 people in the waiting room...I’m not going to give a patient a long lecture on why it’s no longer necessary to treat every Streptococcus with an antibiotic. It takes much more time to explain EBM than to simply write out a prescription.”

Moreover, PCPs perceived constantly changing evidence as hindering the practice of EBM. Physicians felt that frequent changes in treatment recommendations as a result of new evidence created uncertainty for patients and physicians alike (F#10): “We regard the results of randomized control trials as the absolute truth. But the ‘truth’ keeps on changing. Look at Beta-Blockers or Hormonal Replacement Therapy...These ever-changing recommendations are really hard for my patients and I to swallow...It was a lot easier back then when medicine was based on instinct and experience.”

Additional barriers included textbooks bereft of EBM jargon, physicians’ scant computer and information retrieval skills, and slow computers. A number of participants felt that a physician’s advanced age was a barrier to understanding EBM concepts and utilizing online EBM resources (P#18): “I’m 59 years old. You can’t compare me with them (young physicians)... I don’t think as fast as I used to; I’m not as capable with the computer either... After the first session I was really devastated... I went home and cried.”

Enabling factors included the ease of use of medication databases and HMO incentives for better quality of care (F#5): “Doctors should be rewarded for practicing better medicine, and EBM is an integral part of that... I think financial incentives would make a real difference... Today there’s no real reward ...nothing... zilch.” Participants noted that academic teaching and writing clinical guidelines necessitate continual learning and keeping up to date with the latest evidence (P#21): “I teach residents... I don’t want to be caught unprepared if a resident asks me about my opinion regarding a paper that was just published.”

Physicians recommended both personal and organizational strategies to overcome these barriers. Personal strategies consisted of constantly keeping up to date (via medical journals and email services), meeting regularly with a colleague experienced in information retrieval to jointly search for answers to clinical questions, leaving medication databases open during consultation, and using patient handouts. The main organizational strategies suggested were a decision support service assisting in “real-time” decision making and computerized decision support systems (P#23): “It would be very helpful if there was a support service I could call to ask questions... I wouldn’t feel like I’m imposing myself and taking up the specialist’s time”; (F#9): “There might be information overload using a decision support system, but it’s more realistic and less time consuming than looking for the information myself.” Other suggested strategies included annual EBM knowledge exams and quality of care monitoring, regular staff meetings in primary care clinics, and journal clubs.

The Effect of the Intervention on Attitudes, Perceived Knowledge, and Behavior

Facilitators’ Perceptions of Changes in Themselves

Most facilitators found that teaching the course enhanced their own knowledge and skills; however, opinions differed regarding the impact on behavior. Some felt that their improved information retrieval skills influenced their ability to access EBM resources at the point of care (F#9): “Clinical questions that took me hours at night became 5-minute tasks done during the encounter. It’s a major difference.” But others noted the intervention had little impact on their ability to utilize pre-appraised resources and believed the intervention had missed the mark (F#5): “If the aim of the intervention was that while sitting with a patient I’ll be able to punch in a question and retrieve information within minutes, well we’ve failed. It just won’t happen.” Integrating EBM at the point of care was seen as more feasible through writing down clinical questions and searching for answers at home later (F#3): “I can’t do much when the patient is present, but I’m able to write down questions which I later search for at home. At our subsequent meeting, I bring printed material and show it to them. This helps.”

Facilitators’ Perceptions of Changes in Participants

Facilitators believed the intervention affected their students’ attitudes, empowered them, and improved their computer and EBM skills, but doubted the impact on their behavior (F#2): “Do I fantasize that the intervention influenced my students’ decision making? Unfortunately, no.” Others felt behavioral changes would not be detected by examining (F#5) “test referral and drug prescription rates before and after the intervention,” but rather by examining micro-changes (F#6): “I taught them how to take text out of journal articles and paste it in their patients’ charts... These facets made EBM come to life in their daily practice.” A number of facilitators noted that the intervention’s effect depended on the initial knowledge of trainees (F#10): “The course really made a difference to those who had studied EBM in family medicine residency... Doctors lacking the baseline knowledge had a difficult time.”

Participants’ Perceptions of Changes in Themselves

In contrast to facilitators’ perceptions of trainees, most participants believed the intervention had an impact not only on their attitudes and skills, but on their behavior as well. Trainees claimed their ability to retrieve information improved and reported using EBM resources more frequently (P#23): “Before I used to search for medical information unsystematically. Today I’m more equipped to go online and know where to look for relevant information.” In addition, participants reported that the course affected their utilization of medication databases at the point of care. These databases were accessed to determine dosages, side effects, generic names, and drug interactions (P#25): “As a result of the course I use Micromedex a lot. It helps me when I need info on a new drug.” Participants also reported the intervention caused them to rely on (P#23) “online journal publications rather than outdated books” for clinical decision making. Although many physicians agreed that the intervention changed their behavior, some admitted that changes were mostly perceptual (P#21): “After the course I was really juiced up about the whole EBM idea... I’m constantly thinking about it; I’m in the process of starting... but I haven’t gotten down to it yet.”

Discussion

To the best of our knowledge, no qualitative study has examined the perceived behavioral changes of PCPs participating in an EBM educational intervention. Most qualitative studies, thus far, have focused on physicians’ perceptions of and barriers to EBM, and on the impact of decision support systems or handheld computers.4–8, 22–25 The present study attempts to bridge this gap by gaining insight into both facilitators’ and course participants’ perceptions of changes resulting from an educational intervention, as well as their general perceptions of EBM. The results of the study suggest that the intervention positively influenced attitudes and knowledge but had limited effect on behavior. These findings are consistent with preliminary data from the parallel controlled trial, which indicate significant changes in knowledge but no effect on physicians’ drug prescription and test ordering performance.

Although most physicians viewed EBM positively, some found the shift from an unsystematic approach to an evidence-based approach hard to accept. Physicians’ difficulty in dealing with constantly changing evidence was also found in a study by Putnam et al.6 Van Weel and Knottnerus (1999) discussed the difficulty of extrapolating randomized controlled trial findings to individual patients amid the complexity of general practice.26 This perception was shared by some physicians in the present study, who perceived EBM as too generic and insufficiently focused on individual needs. Perceptions not previously reported are the feelings of self-confidence EBM engendered among PCPs when dealing with pharmaceutical company representatives and with patients’ queries regarding alternative medicine.

Organizational barriers such as time constraints and work overload hindered PCPs’ ability to implement evidence. These findings are consistent with previous studies, which found that the lack of personal and professional time impeded EBM implementation.4, 7, 27 Physicians nevertheless felt the intervention increased their ability to keep up to date and to search for evidence-based information at their leisure, thereby influencing clinical decision making.

This study underlines the complexity of changing clinical behavior at the point of care. While the intervention affected PCPs’ utilization of medication databases, it had little impact on the use of online EBM resources during consultation. Utilizing EBM resources at the point of care was perceived to be a complex task, requiring clinical question formulation and relevant resource selection to reach a clinical decision. Even after a year-long intervention, most PCPs found this task to be virtually impossible. In contrast, medication databases were perceived as easy to use during the patient–doctor encounter, a simple task in which the benefits (rapidly determining dosages and drug interactions) well outweighed the impediments.

Towards the final phase of the intervention, some facilitators anticipated that the intervention would not impact clinical behavior. Consequently, they perceived the intervention as having failed. In comparison, course participants viewed the intervention more positively. A possible implication of this finding is that in future interventions, the importance of attaining interim objectives (e.g., enhanced skills) should be emphasized to facilitators.

The results of the study underline the need not only to enhance physicians’ skills and perceptions of EBM, but also to improve the ease of use of evidence-based resources at the point of care. Although all physicians had access to online resources during consultation, they found the tasks complex and the organizational barriers overwhelming. Tasks should be simplified by tailoring evidence-based information retrieval systems to the busy clinic, perhaps through integration with decision support systems. Moreover, strategies to overcome organizational constraints, such as lack of consultation time and lack of incentives, should be employed. Finally, participants suggested establishing a decision support service to facilitate EBM integration at the point of care and holding journal clubs during staff meetings. These suggestions should be considered.28

The study has a number of limitations. First, we were not able to interview PCPs more than once during the intervention. Pre–post intervention interviews might have given a more accurate picture of changes throughout the intervention. Second, facilitators’ perceptions might be overrepresented in the study’s findings. Third, the fact that PCPs participated in interviews during the intervention might have influenced their behavior (i.e., Hawthorne effect).

Acknowledgments

We would like to thank the primary care physicians and course facilitators who took part in the study. In addition, we thank The Galil Center for Medical Informatics and Telemedicine for providing administrative and logistical support. This paper was presented at the Second Israeli National Convention of Qualitative Research Methodologies on June 5, 2006.

Financial Support The Israel National Institute for Health Policy and Health Services Research and the Roter Fund (Maccabi Health Services).

Potential Financial Conflicts of Interest None disclosed.

Ethical Approval An exemption was granted by the Helsinki Committee [Institutional Review Board (IRB)] of the Emek Medical Center, Afoula, Israel.

Appendix

Questions (Followed by Probes) for Focus Groups and Interviews

General Perceptions of EBM
  1. How do you perceive EBM?
  2. What’s your definition of EBM?
  3. What’s your opinion of EBM?
  4. What do you think of critical appraisal and the use of pre-appraised resources?

EBM and Clinical Practice
  1. How does EBM affect (if at all) your clinical practice?
  2. Are you able to incorporate EBM in your daily professional life?
  3. Is EBM practicable at the point of care? What’s the best way, in your opinion, to implement EBM?
  4. Do you utilize EBM resources in your practice? If so, do you do so at the point of care?
  5. If you utilize EBM resources, please elaborate on which resources you use. How long does it normally take to find an answer to a clinical question? If you aren’t able to find a clinical answer, what steps (if any) do you take to obtain an answer?
  6. Which factors, in your opinion, inhibit or promote the applicability of EBM?

Impact of the EBM Intervention
  1. What impact (if any) has the EBM course had on your practice?
  2. How has the course affected you (if at all)? Are you able to use EBM more as a result of the course? Please elaborate.
  3. Do you utilize more pre-appraised resources or other facets of EBM as a result of the intervention?
  4. Has the course influenced your behavior when dealing with barriers to EBM practice?

Attitudes Towards the EBM Intervention
  1. What did you think of the EBM course you participated in or facilitated?
  2. What’s your opinion regarding the workshop phase and one-on-one sessions? Which has influenced you more (if at all)?
  3. What’s your opinion regarding the content taught in the course?
  4. Did you encounter any difficulties during the course?

Questions for Facilitators Only

Pre-intervention:
  1. How do you perceive your role as facilitator in the intervention?
  2. What are your expectations of the intervention?

Post-intervention
  1. Have any changes (attitudes, knowledge, behavior, or other) occurred in you and your course trainees as a result of the intervention?
  2. Which difficulties and successes have you encountered during the intervention?

Additional Information
  1. Is there anything else you would like to add?

References

1. Sackett D, Rosenberg WMC, Gray JAM, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn’t. BMJ. 1996;313:71–2. [PMC free article] [PubMed]
2. Dobbie AE, Schneider FD, Anderson AD, Littlefield J. What evidence supports teaching evidence based medicine? Acad Med. 2000;75:1184–85. [PubMed]
3. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ. 2004;329:1017. [PMC free article] [PubMed]
4. McColl A, Smith H, White P, Field J. General practitioners’ perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–5. [PMC free article] [PubMed]
5. Mayer J, Piterman L. The attitudes of Australian GPs to evidence-based medicine: a focus group study. Fam Pract. 1999;16:627–32. [PubMed]
6. Putnam W, Twohig PL, Burge FI, Jackson LA, Cox JL. A quality study of evidence in primary care: what the practitioners are saying. CMAJ. 2002;166:1525–29. [PMC free article] [PubMed]
7. Al-Ansary LA, Khoja TA. The place of evidence based medicine among primary health care physicians in Riyadh region, Saudi Arabia. Fam Pract. 2002;19:537–42. [PubMed]
8. Hannes K, Leys M, Vermeire E, et al. Implementing evidence-based medicine in general practice: a focus group based study. BMC Fam Pract. 2005;6:37. [PMC free article] [PubMed]
9. Parkes J, Hyde C, Deeks J, Milne R. Teaching critical appraisal skills in health care settings. Cochrane Database Syst Rev. 2001;(3):CD001270. DOI: 10.1002/14651858.CD001270. [PubMed]
10. Taylor R, Reeves B, Ewings P, Binns S, Keast J, Mears R. A systematic review of the critical appraisal skills training for clinicians. Med Educ. 2000;34:120–5. [PubMed]
11. Straus SE. What’s the E for EBM? BMJ. 2004;328:535–6. [PMC free article] [PubMed]
12. Straus SE, Ball C, Balcombe N, Sheldon J, McAlister FA. Teaching evidence-based medicine skills can change practice in a community hospital. J Gen Intern Med. 2005;20:340–3. [PMC free article] [PubMed]
13. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence based medicine) skills: a critical appraisal. CMAJ. 1998;158:177–81. [PMC free article] [PubMed]
14. Straus SE, Green ML, Bell DS, et al. Evaluating the teaching of evidence based medicine: conceptual framework. BMJ. 2004;329:1029–32. [PMC free article] [PubMed]
15. Patton MQ. Designing qualitative studies. In: Qualitative evaluation and research methods. Newbury Park (CA): Sage; 1990. p. 145–198.
16. Kitzinger J. Qualitative research: introducing focus groups. BMJ. 1995;311:299–302. [PMC free article] [PubMed]
17. Britten N. Qualitative research: qualitative interviews in medical research. BMJ. 1995;310:1084–85. [PMC free article] [PubMed]
18. Denzin NK. The research act: a theoretical introduction to sociological methods. Chicago, IL: Aldine; 1970.
19. Glaser BG, Strauss AL. The discovery of grounded theory. Chicago: Aldine; 1967.
20. Pope C, Ziebland S, Mays N. Analysing qualitative data. BMJ. 2000;320:114–116. [PMC free article] [PubMed]
21. Bryman A, Burgess R, eds. Analysing qualitative data. London: Routledge; 1993.
22. Freeman AC, Sweeny K. Why general practitioners do not implement evidence: qualitative study. BMJ. 2001;323:110. [PMC free article] [PubMed]
23. Green ML, Ruff TR. Why do residents fail to answer their clinical questions? A qualitative study of barriers to practicing evidence based medicine. Acad Med. 2005;80:176–82. [PubMed]
24. McAlearney AS, Schweikhart SB, Medow MA. Doctors’ experience with handheld computers in clinical practice: qualitative study. BMJ. 2004;328:1162. [PMC free article] [PubMed]
25. Rousseau N, McColl E, Newton J, Grimshaw J, Eccles M. Practice based, longitudinal, qualitative interview study of computerized evidence based guidelines in primary care. BMJ. 2003;326:314. [PMC free article] [PubMed]
26. Van Weel C, Knottnerus JA. Evidence-based interventions and comprehensive treatment. Lancet. 1999;353:916-8. [PubMed]
27. Tracy CS, Dantas GC, Upshur REG. Evidence-based medicine in primary care: qualitative study of family physicians. BMC Fam Pract. 2003;4:6. [PMC free article] [PubMed]
28. Schwartz A, Millam G. A web-based library consult service for evidence-based medicine: technical development. BMC Med Inform Decis Mak. 2006;6:16. [PMC free article] [PubMed]
