Am J Pharm Educ. 2012 April 10; 76(3): 46.
PMCID: PMC3327244

Portfolio Use and Practices in US Colleges and Schools of Pharmacy

Abstract

Objectives. To identify the prevalence of portfolio use in US pharmacy programs, common components of portfolios, and advantages of and limitations to using portfolios.

Methods. A cross-sectional electronic survey instrument was sent to experiential coordinators at US colleges and schools of pharmacy to collect data on portfolio content, methods, training and resource requirements, and benefits and challenges of portfolio use.

Results. Most colleges and schools of pharmacy (61.8%) use portfolios in experiential courses and the majority (67.1%) formally assess them, but there is wide variation regarding content and assessment. The majority of respondents used student portfolios as a formative evaluation primarily in the experiential curriculum.

Conclusions. Although most colleges and schools of pharmacy have a portfolio system in place, few are using these systems to fulfill accreditation requirements. Colleges and schools need to carefully examine the intended purpose of their portfolio system and follow through with implementation and maintenance of a system that meets their goals.

Keywords: portfolio, assessment, evaluation, competency achievement, pharmacy practice experiences, pharmacy education

INTRODUCTION

Accrediting agencies at the national, regional, and professional levels have stressed the assessment of educational outcomes for more than a decade to improve the evaluation of student learning.1 This shift from traditional process-oriented to outcomes-oriented accreditation implies that colleges and schools must now provide evidence that learning outcomes are achieved, rather than simply have an assessment process. Even with this shift, the nature of accreditation may still result in assessment of student learning becoming another set of activities to accomplish rather than an actual demonstration of learning outcomes.2-4

A variety of approaches and tools for evaluating student learning have emerged with the advent of accreditation-based outcomes assessment. Student portfolios are one such approach, as implied by the Accreditation Council for Pharmacy Education (ACPE) Standards and Guidelines 2007: “In general, the college or school's evaluation of student learning should … demonstrate and document in student portfolios that graduates have attained the desired competencies, when measured in a variety of health care settings.”5

A difficulty in requiring student portfolios for assessment is the lack of consistency in the existing literature and research regarding approaches to summative assessment of competency and regarding the definition, role, and components of portfolio assessment. Also, legal and psychometric issues remain to be resolved in using student portfolios for summative assessment. Thus, issues and concerns about portfolio use remain under investigation.6-9

Traditionally, portfolios have been used in higher education and defined as: “…a purposeful collection of student work that exhibits the student's efforts, progress, and achievement in one or more areas. The collection must include student participation in selecting contents, the criteria for selection, the criteria for judging merit and evidence of self-reflection.”10

While relatively new to pharmacy education, the emerging application of portfolios for assessment purposes appears to focus on the more constructivist paradigm emphasizing self-reflection, which is prevalent in the nursing literature11:

“Reflective portfolios are a collection of evidence that through critical reflection on its contents demonstrate achievement as well as personal and professional development through a critical analysis and reflection of its contents.”12

Both definitions highlight the paradigmatic conflict between constructivist and positivist portfolios that was identified over a decade ago.13 Whereas a positivist portfolio assesses learning outcomes defined externally (eg, accreditation standards, institutional mission/goals) that are constant across users, contexts, and purposes, the constructivist portfolio is more a learning tool in which the student constructs meaning, and that will vary by individuals, time, and purpose. Thus, the difficulty is in differentiating and choosing between the constructivist student-composed and owned portfolio approach, which is supported by McMullan and colleagues11 and Plaza and colleagues,12 and a positivist portfolio approach used by faculty members as an assessment management system and receptacle for student work to document evidence of students’ progress toward meeting externally developed competency standards. The choice will result in the development of entirely different portfolio activities: the positivist approach places a premium on the selection of items that reflect external standards and interests, whereas the constructivist approach emphasizes the selection of items that the student believes reflect learning.13

Further confusing the issue is the current trend toward online assessment management systems (a positivist approach) that are being called “electronic portfolios.” New systems are continually developed and marketed to educational programs and frequently offer numerical scoring of artifacts against a rubric with statistical analyses for aggregating collected data. Consequently, such electronic assessment management systems may be changing the more traditional (albeit ambiguous at best) definition of student portfolios.

The literature lacks a consistent message for the use of student portfolios as a means of outcomes assessment. The purpose of this study was to identify the prevalence of student portfolio use in US doctor of pharmacy degree programs, common portfolio components, and advantages and limitations to portfolio use in pharmacy education.

METHODS

A cross-sectional survey using census sampling of all 109 US colleges and schools of pharmacy was conducted. The primary inclusion criterion was having at least ACPE candidate accreditation status as of January 2009.14 The 17-item survey instrument was constructed by the authors. Because the study purpose was to describe the prevalence and use of student portfolios nationally, validity was established by assessing content validity through appropriate domain sampling to select and construct the items. No numerical value exists for content validity; it is determined by thorough inspection of the items, with evidence accumulated through content analysis of relevant texts and literature combined with review of the instrument by “experts” in the field. Typically, a table of specifications and field-testing provide such evidence.15-19 A table of specifications (Table 1) was used to define the 4 survey domains and to guide the respective item-sampling process. The 4 domains were content (eg, holistic vs. behavioral competencies), methods (eg, rubric vs. reflective, student vs. faculty/preceptor), training and resource support, and benefits and challenges of use. The table and items were developed through a review of the education and pharmacy disciplinary literature and input from faculty members trained and experienced in educational evaluation within the schools of pharmacy and medicine at Creighton University.

Table 1.
Specifications for a Survey of Experiential Coordinators at US Colleges and Schools of Pharmacy Regarding Portfolio Use

The survey instrument was pilot tested with a purposive sample of experiential faculty members and preceptors in the Pharmacy Practice Department within the Creighton University School of Pharmacy and Health Professions. The subsequent revisions were coordinated through an item review and refinement process, which further contributed evidence supporting content validity.16,18 The survey instrument was electronically disseminated (SurveyMonkey, Palo Alto, CA) in the spring of 2009 to experiential administrators at all 109 colleges and schools of pharmacy using a modified Dillman Web survey methodology.20 Survey responses were transferred to an Excel spreadsheet before analysis via SPSS, version 17.0 (IBM SPSS, Chicago, IL). The study was approved by the Creighton University Institutional Review Board. (The survey instrument is available from the lead author upon request.)

Descriptive and nonparametric bivariate statistical analyses were conducted on the data generated by the survey instrument. We stratified programs and analyses by “new” vs. “established” portfolio use based on the number of years since implementation. Programs reporting 6 or more years of use were considered “established” portfolio users based on the a priori assumption that this timeframe allowed for a full cycle of 1 graduate class exposed to portfolio use and subsequent programmatic evaluative feedback. Bivariate analyses used the chi-square and Mann-Whitney U tests for categorical and continuous data, respectively. No adjustment for type I error due to multiple significance testing was made. Content mapping provided through the table of specifications guided all analyses.
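As a rough illustration of these bivariate tests (this is not the authors' code, and all counts and values below are hypothetical), the following pure-Python sketch computes a Pearson chi-square statistic for a 2 × 2 table of categorical data and a Mann-Whitney U statistic for a continuous outcome:

```python
def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i in range(2):
        for j in range(2):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (table[i][j] - expected) ** 2 / expected
    return stat

def mann_whitney_u(x, y):
    """U statistic for sample x: the count of (x_i, y_j) pairs with
    x_i > y_j, with ties counted as 0.5."""
    u = 0.0
    for xi in x:
        for yj in y:
            if xi > yj:
                u += 1.0
            elif xi == yj:
                u += 0.5
    return u

# Hypothetical 2x2 table: user group (rows) x paper-based format yes/no (columns)
table = [[10, 8],    # established users (>= 6 years)
         [15, 37]]   # new users (< 6 years)
chi2 = chi_square_2x2(table)
print(f"chi-square = {chi2:.2f}")  # df = 1; compare to 3.84 at alpha = .05

# Hypothetical continuous outcome: hours spent assessing one portfolio
established = [4.0, 5.0, 3.5, 6.0, 4.5]
new = [0.5, 1.0, 0.75, 0.5, 2.0]
print(f"U = {mann_whitney_u(established, new)}")  # maximum possible U = 5 * 5 = 25
```

With 1 degree of freedom, a chi-square statistic above the 3.84 critical value corresponds to p < .05; in practice, a statistical package such as SPSS (used in this study) reports exact p values.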

RESULTS

Eighty of the 109 pharmacy programs returned completed survey instruments, for an overall response rate of 73.4%. To assess the representativeness of the resultant sample of 80 programs relative to the population of 109 US pharmacy programs surveyed, we compared 5 primary demographic characteristics: type of institution14 (public vs. private), regional location,21 urban/rural location,22 class size,23 and institution tenure.14 Nonparametric goodness-of-fit tests included the binomial test and the goodness-of-fit chi-square test for binary and categorical demographics, respectively, and the one-sample t test was used for continuous demographic characteristics. All tests were 2-tailed with an alpha level of .05.
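To sketch how such a representativeness check works (using hypothetical counts, not the study's data), a goodness-of-fit chi-square statistic compares the sample's distribution of a categorical demographic, such as census region, against the known distribution in the population of all programs:

```python
def gof_chi_square(observed, population_props):
    """Goodness-of-fit chi-square statistic: observed sample counts vs
    expected counts derived from known population proportions."""
    n = sum(observed)
    stat = 0.0
    for obs, p in zip(observed, population_props):
        expected = n * p
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical: 80 responding programs by census region, compared with
# hypothetical population shares across all 109 programs
observed = [18, 22, 28, 12]              # Northeast, Midwest, South, West
population_props = [0.22, 0.28, 0.33, 0.17]
stat = gof_chi_square(observed, population_props)
# df = 4 - 1 = 3; critical value at alpha = .05 is 7.81
print(f"GOF chi-square = {stat:.2f}, representative = {stat < 7.81}")
```

A nonsignificant statistic (below the critical value for the appropriate degrees of freedom) supports the conclusion that the responding sample mirrors the surveyed population, which is the pattern the study reports.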

Each of the 5 demographic characteristics yielded nonsignificant results, indicating that the demographics of our study sample of pharmacy programs were no different from those of the population of 109 programs surveyed in 2009 (Table 2). Programs tended to be urban public institutions located in the southern region of the United States with an average class size of 117 and 16 years of experience as an accredited PharmD program.

Table 2.
Comparison of Demographics Between Population and Study Samplea

Student portfolios were used in 70 of the 80 responding institutions (88%). Of the 10 programs not using portfolios, only 1 was a new program with candidate status. Seven of the 10 were in the process of implementing some form of portfolio system within the next academic year, while the remaining 3 identified faculty/staff workload and implementation problems as impediments to portfolio use.

The majority of the 70 programs using portfolios (91%) provided an institution-specific purpose statement for portfolio use and assessment. The primary theme was that portfolios were used as a tool to compile performance examples documenting achievement and progression throughout the curriculum for assessment by faculty members. Most institutions (80%) believed portfolios should be used beyond the experiential portions of the curriculum (ie, introductory pharmacy practice experience and/or advanced pharmacy practice experience), while few (23%) stated that student self-reflections should be a primary focus of such a system.

While the majority of respondents implied a more behavioral, rubric-based approach to portfolio assessment, a few colleges and schools included a more reflective approach combined with documentation of achievement. The 70 programs reporting the use of portfolios had been doing so for a mean of 4.5 ± 4.4 years (median = 3.0 years). As described in the Methods, programs were dichotomized into established users (≥ 6 years of portfolio use) or new users (< 6 years). New users represented 52 programs with a mean of 2.4 ± 1.8 years of portfolio use (median = 2.0 years); the remaining 18 established users had a mean of 10.6 ± 4.2 years (median = 9.5 years). Years of portfolio use ranged from the first year of portfolio development to 21 years. Table 3 further summarizes this information in addition to the type of learning environment and portfolio format.

Table 3.
Portfolio Use, Environment, and Format at US Colleges and Schools of Pharmacy

While no significant differences were observed in the type of environment between new and established users, both groups reported predominant, and often exclusive, use of portfolios in the experiential components of the professional program (ie, introductory pharmacy practice experience and/or advanced pharmacy practice experience). Sixty-one percent of programs used portfolios exclusively in their experiential curriculum; stratified by subgroup, 67% of established users and 60% of new users did so.

Established users relied significantly more on paper-based portfolio systems than their counterparts who were in the early years of portfolio implementation. Established portfolio users also required significantly greater diversity of data typology (ie, multiple types of data) in their portfolios than their new portfolio user counterparts. Established portfolio users also reported significantly greater inclusion of samples of student assignments and samples of student experiential evaluations in portfolios. Table 4 summarizes the portfolio content information.

Table 4.
Portfolio Content Typology by Utilization Group

As summarized in Table 5, approximately one-third (23) of the 70 programs did not formally assess portfolios (5 established portfolio user programs and 18 new portfolio user programs). Of the 47 programs that formally assessed portfolios, 16 (34%) had a specific person assigned to assess portfolios using a standardized grading rubric. The majority of programs used either an open-ended feedback or committee/group consensus process (45%), or a combination of this non-graded approach and a graded process using a standardized rubric (21%).

Table 5.
Assessment of Portfolios, No. (%)a

Experiential faculty members (66%) were predominantly responsible for assessing portfolios, followed by program faculty members (28%). The frequency of portfolio assessment followed a formative approach, with most programs assessing portfolios at the end of each experience (23%), each semester (34%), or annually (17%). Only 1 program used student portfolios as a summative/capstone assessment of the entire professional curriculum. All but 4 programs (92%) provided feedback to students, with the majority of feedback consisting of verbal or written qualitative comments to address strengths and weaknesses and/or using a checklist indicating pass/fail or complete/incomplete based on portfolio content. Similar findings were observed across established and new portfolio users in terms of methods of assessment, frequency of assessment, feedback provided to students, and personnel involved in the assessment process. The median time to assess a portfolio was 2.5 hours and differed significantly between established (4.5 hours) and new portfolio users (45 minutes).

The most common benefit, cited by approximately one-fourth of respondents, was that the portfolio was a useful tool for student self-assessment and reflection, which would ideally foster students’ development as professionals and their skills for life-long learning. The next most often reported benefits were portfolios’ use as a data repository (18%) that included CV information as well as various achievements and evaluations, and as a resource for preceptors to learn where a student is in his or her professional progression, allowing the preceptor to provide a better and more individualized practice experience (16%). Approximately 10% of respondents indicated that the benefit of portfolios was as a quality assurance and/or assessment tool for their experiential activities and the curriculum. Many of the respondents who listed this benefit had not yet used portfolios in this manner or seen the results of doing so.

The most common challenge of portfolio systems, which was cited by one-third of respondents, was the workload and time associated with their implementation and maintenance. Twenty percent described their paper-based systems as cumbersome, while another 20% described the challenges of student buy-in and motivation to use the portfolios. Approximately 10% indicated that their challenges centered on faculty questions about how to use, where to use, and how to incorporate portfolios into their curriculum; how to interpret and assess portfolios; and whether they provide new and useful information that is valuable to their program. Another 10% listed the lack of functionality, adaptability, and expense of their electronic portfolio systems as their challenge.

DISCUSSION

Literature describing best practices for using student portfolios is inconsistent and lacks clear guidance for pharmacy education. Traditionally, portfolios have been used in higher education as a purposeful collection of student work that exhibits student progress and achievement of educational outcomes. However, the nursing and pharmacy literature emphasizes student critical self-reflection as a major component of student portfolios. ACPE Accreditation Standards now require pharmacy programs to use portfolios as a means of documenting students’ achievement of competencies as they progress through the classroom and experiential curriculum.

Because the literature includes various approaches to portfolio use rather than a consistent message, we chose to survey all US pharmacy colleges and schools to determine the prevalence of student portfolio use, common portfolio components used, and advantages and limitations of portfolio use. The majority of institutions reported using some type of portfolio system; however, there was variation among the programs in terms of years of use, portfolio format, content, and assessment. The majority of programs used student portfolios as a formative evaluation, grounded primarily in the experiential curriculum, with only one program reporting portfolio use as a summative evaluation of the entire professional curriculum (ie, educational outcomes).

There is an interesting incongruence with regard to where respondents believed portfolios should be used versus where they are actually being used. According to the portfolio purpose statements, 80% of institutions believed that portfolio use should include the entire professional curriculum, not just the experiential portion. However, the majority of programs (62%) reported using portfolios exclusively in the experiential curriculum. In other words, how colleges and schools of pharmacy believe that portfolios should be used has not translated into practice.

The benefits of their portfolio systems that respondents cited were similar to their definitions regarding purpose, but it is unclear from this study whether the respondents believed that the portfolio benefits outweighed the challenges, other than in meeting accreditation requirements. Interestingly, the most common benefit cited for portfolios (student self-assessment and reflection) was inconsistent with the most common purpose cited for portfolios (a tool to compile performance examples documenting student achievement and progression). While assessment of competency and programmatic outcomes is consistent with ACPE accreditation standards, few respondents were actually using portfolios for this purpose, although several indicated that they planned to do so.

Common challenges cited by colleges and schools using paper-based systems, such as their cumbersome nature, might only be replaced by new challenges if those programs switched to an electronic system, as colleges and schools using electronic systems listed the expense and the systems’ lack of flexibility and adaptability to college/school needs as drawbacks. The reliance of new portfolio users (ie, < 6 years of use) on electronic systems as a quick approach to implementing a data collection system for assessment is not surprising given the plethora of commercial electronic data management systems currently available.

Student portfolios can be a useful tool in assessing student progression and achievement of educational competencies. ACPE accreditation standards require pharmacy colleges and schools to use portfolios to document student progressive achievement of competencies throughout the curriculum and practice experiences. Articles within the literature recommend various approaches to using student portfolios, ranging from a data receptacle for externally driven competencies and an assessment management system to a student-driven perspective of learning and critical self-reflection. This study showed that the majority of pharmacy colleges and schools are using student portfolios within the curriculum. Most of the colleges and schools surveyed (62%) are using portfolios exclusively in experiential education. However, there is wide variation regarding portfolio content (sample assignments, 76%; student self-reflections, 76%; student resume, 70%; professional service activities, 64%), with a mean of 3.9 content typologies used in the portfolios. The majority of colleges and schools that use portfolios do formally assess them (67%), but there is wide variation as to how they are assessed, who is assessing them, and how frequently they are being assessed.

Schools are challenged to select, assess, and manage assessment data. Electronic assessment management systems contain information that is useful for formative and summative assessment and are typically institution- or program-centered based on external criteria. This contrasts with traditional student portfolios, which are student-constructed, student-centered, and reflective in nature. Issues and concerns about student portfolios have been raised and revisited for close to 2 decades. Remaining issues pertinent to pharmacy education include: Are students qualified to self-reflect and then use this information to determine whether learning outcomes are achieved, ie, do students know what they do not know? Does emphasis on a student’s “best” work misrepresent his or her “typical” work, compromising the true assessment of competency? A portfolio done well, and assessed with standardized rigor, is resource intensive: is it worth the effort? These questions represent only a few of the issues continually raised about using portfolios in outcomes assessment that need further discussion, clarification, and research.

Differentiating between an assessment management system and a portfolio system is important, not only in definition but in how differently each supports learning versus accountability. A consensus on the role of portfolios in teaching, learning, and/or assessment is imperative before pharmacy colleges and schools can appropriately incorporate them, and other tools, into the pedagogy of pharmacy education.

A limitation of the study may be that the respondents were the experiential directors representing the various colleges and schools rather than assessment directors/associate deans. Unfortunately, only 2 of the institutions surveyed indicated that they had a director of assessment. If portfolios are truly integrated into the assessment process throughout the curriculum, experiential directors most likely were aware of portfolio use and therefore were the appropriate group to survey. ACPE accreditation standards specifically state that attainment of desired competencies needs to be measured in a variety of healthcare settings, and the experiential curriculum is when students are most likely to be in healthcare settings. Another limitation common in survey research, acquiescence bias (respondents answering the way they think they should), may apply to our survey because of the political nature of the topic.

CONCLUSION

There is a difference between the way colleges and schools of pharmacy think portfolios should be used and the way they actually use them. Although respondents indicated that portfolios should be used for assessment of curricular outcomes and used throughout the entire curriculum, they reported that portfolios were most useful as a tool for student self-reflection and assessment and were used primarily in the experiential curriculum. Colleges and schools need to carefully examine the intended purpose of their portfolio system and follow through with implementation and maintenance of a system that meets their goals. Doctor of pharmacy degree programs considering implementing or revising student portfolios can use this information to determine how to construct a portfolio process and assess portfolio content at their college or school, as well as to evaluate all the challenges (time, workload, and resources) associated with implementation and maintenance of student portfolios.

REFERENCES

1. Ewell PT. An emerging scholarship: a brief history of assessment. In: Banta TW, editor. Building a Scholarship of Assessment. 1st ed. San Francisco: Jossey-Bass; 2002.
2. Ewell PT. Can assessment serve accountability? It depends on the question. In: Burke JC, editor. Achieving Accountability in Higher Education: Balancing Public, Academic, and Market Demands. San Francisco: Jossey-Bass; 2004.
3. Eaton JS. Is Accreditation Accountable? The Continuing Conversation Between Accreditation and the Federal Government. Washington, DC: Council for Higher Education Accreditation; 2003.
4. Ratcliff JL, Lubinescu ES, Gaffney MA. How Accreditation Influences Assessment. San Francisco: Jossey-Bass; 2001.
5. Accreditation Council for Pharmacy Education. Accreditation standards and guidelines for the professional program in pharmacy leading to the doctor of pharmacy degree. http://www.acpe-accredit.org/pdf/ACPE_Revised_PharmD_Standards_Adopted_Jan152006.pdf Accessed February 19, 2012.
6. Wilkerson JR, Lang WS. Portfolios, the Pied Piper of teacher certification assessments: legal and psychometric issues. Educ Policy Analysis Archives. 2003;11(145):1–30.
7. Arter JA, Spandel V. NCME instructional module: using portfolios of student work in instruction and assessment. Educ Measurement Issues Pract. 1992;11(1):36–44.
8. Carraccio C, Englander R. Evaluating competence using a portfolio: a literature review and Web-based application to the ACGME competencies. Teach Learn Med. 2004;16(4):381–387. [PubMed]
9. Lucas C. Introduction: writing portfolios: changes and challenges. In: Yancey KB, editor. Portfolios in the Writing Classroom: An Introduction. Urbana, Ill: National Council of Teachers of English; 1992.
10. Paulson FL, Paulson PR, Meyer CA. What makes a portfolio a portfolio? Educ Leadership. 1991;48(5):60–63.
11. McMullan M, Endacott R, Gray MA, et al. Portfolios and assessment of competence: a review of the literature. J Adv Nurs. 2003;41(3):283–294. [PubMed]
12. Plaza CM, Draugalis JR, Slack MK, Skrepnek GH, Sauer KA. Use of reflective portfolios in health sciences education. Am J Pharm Educ. 2007;71(2):Article 34. [PMC free article] [PubMed]
13. Paulson FL, Paulson PR. Assessing portfolios using the constructivist paradigm. In: Fogarty R, editor. Student Portfolios: A Collection of Articles. Palatine, Ill: IRI/Skylight Training and Pub; 1996.
14. Accreditation Council for Pharmacy Education. Accredited Professional Programs of Colleges and Schools of Pharmacy. http://www.acpe-accredit.org/shared_info/programsSecure.asp Accessed February 19, 2012.
15. Kane MT. Validation. In: Brennan RL, editor. Educational Measurement. 4th ed. Westport, CT: Praeger Publishers; 2006. pp. 17–64.
16. Nunnally JC, Bernstein IH. Psychometric Theory. 3rd ed. New York, NY: McGraw-Hill, Inc.; 1994.
17. Shepard LA. Evaluating test validity. Rev Res Educ. 1993;19(1):405–450.
18. Mehrens WA, Lehmann IJ. Measurement and Evaluation in Education and Psychology. 4th ed. Fort Worth, TX: Holt, Rinehart and Winston, Inc; 1991.
19. Messick S. Validity. In: Linn RL, editor. Educational Measurement. 3rd ed. New York, NY: Macmillan Publishing Co, American Council on Education; 1989. pp. 13–103.
20. Dillman DA, Smyth JD, Christian LM. Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method. 3rd ed. Hoboken, NJ: John Wiley & Sons, Inc; 2009.
21. US Census Bureau. Census Bureau Regions and Divisions with State FIPS Codes. http://www.census.gov/geo/www/us_regdiv.pdf. Accessed February 19, 2012.
22. US Census Bureau. Proposed Urban Area Criteria for the 2010 Census. http://www.federalregister.gov/articles/2010/08/24/2010-20808/proposed-urban-area-criteria-for-the-2010-census Published August 24, 2010. Accessed February 19, 2012.
23. National Association of Boards of Pharmacy. Statistical Analysis of NAPLEX Passing Rates for First-time Candidates per Pharmacy School from 2007 to 2011. Accessed March 20, 2012.

Articles from American Journal of Pharmaceutical Education are provided here courtesy of American Association of Colleges of Pharmacy