This study is the first to describe in detail the types and associated cost avoidance of clinical pharmacy interventions performed by pharmacy students completing their APPEs at a state psychiatric facility. The types of interventions performed by students were primarily related to the psychiatric specialty setting and focused largely on patient education activities, order clarification, medication dosing adjustments, and laboratory monitoring. These results are not surprising given the patient population served in an inpatient psychiatric facility and the emphasis pharmacy curricula place on patient counseling and ADE prevention.22
The majority of interventions were considered to be at least moderately significant, meaning that students likely had a substantial impact on improving medication management as well as patient outcomes. The rating of intervention types as minor, moderate, or major was based largely on literature from similar studies evaluating these factors.20,21,26
However, the investigators felt that several types of interventions, most notably order clarifications, should be classified at a lower significance level than that reported in the literature. In the case of order clarification, an intervention often regarded as ADE prevention, consideration was given to the fact that APPE students at our site do not input or verify physician orders. Rather, interventions in this category are often initiated by a pharmacist and passed on to the student before team meetings. Given this, the investigators felt the APPE students' impact on patient care was much less than if they had initiated these interventions themselves.
The present study also suggests that the contributions made by students during their APPEs yielded substantial economic benefits for the host facility. Using conservative methods for calculating cost avoidance, pharmacy students saved an estimated $6,000 to $24,000 over an 8-month period. This savings benefit, while similar to values reported in other studies, is still comparatively low given the significance of most interventions.23-27
This is due primarily to the considerable subjectivity involved in calculating cost avoidance values. Because no standardized guidelines for the economic evaluation of clinical interventions exist, generalizing results from one study to another is difficult. The methods used for calculating cost savings in this study, while based on the literature, were intentionally kept conservative. The investigators felt a conservative approach, rather than evaluating costs associated with worst-case scenarios, would provide a more realistic and believable estimate of cost avoidance.
This study had several important limitations. Clinical intervention data were self-reported by students, which introduces a potential for bias in the results (ie, reporting of some intervention types may have been inflated while others may have been underreported). This may have contributed to the high physician acceptance rate of pharmacy student interventions. Given that other studies have reported physician acceptance rates of student interventions of 60% to 95%, the 97% acceptance rate observed here may be an overestimate resulting from decreased reporting of rejected interventions.23,26,27
There could be several reasons for the high acceptance rate. Students may have felt that documenting a rejected intervention was, in essence, documenting a failure on their part. Another explanation could be the perception of patient harm. For example, if an intervention was rejected but did not pose a risk to the patient, students may have been less likely to document it. With respect to the reporting of all interventions, another factor should be considered. APPE students typically spend much of the first week at a new practice site becoming oriented to the facility and may not have had a clear understanding of what constituted a clinical intervention requiring documentation, leading to decreased reporting of interventions during their first week and overall. Additionally, because documentation of interventions was a required component of this APPE and was assessed in students' midpoint and final evaluations, the potential for over-documentation existed, especially because students entered their own data into the database.
An additional limitation of the study is inherent to its retrospective design, which did not allow for the evaluation of certain outcomes, such as the impact of physician acceptance of these interventions. For this reason, the actual benefits to patient care made by pharmacy students were difficult to demonstrate. Although no definitive proof can be offered that patient care improved as a direct result of student interventions, the data presented in this study suggest a positive impact.
One final consideration is the qualitative description of student interventions. The interventions evaluated in this study were not intended to be an all-inclusive list of student activities. Pharmacy students routinely performed a variety of tasks, such as medication use evaluations, assisting pharmacists with research activities, patient care interactions with students of other health care professions, and education of pharmacy staff members through journal club and topic presentations, which could have indirectly had a positive effect on patient care but were not evaluated in this study. While valuable, these activities do not have an associated cost avoidance value and are difficult to document accurately in an intervention database.