CBE Life Sci Educ. 2010 Winter; 9(4): 408–416.
PMCID: PMC2995757

A Model for Using a Concept Inventory as a Tool for Students' Assessment and Faculty Professional Development

Clarissa Dirks, Monitoring Editor


This essay describes how the use of a concept inventory has enhanced professional development and curriculum reform efforts of a faculty teaching community. The Host Pathogen Interactions (HPI) teaching team is composed of research and teaching faculty with expertise in HPI who share the goal of improving the learning experience of students in nine linked undergraduate microbiology courses. To support evidence-based curriculum reform, we administered our HPI Concept Inventory as a pre- and postsurvey to approximately 400 students each year since 2006. The resulting data include student scores as well as their open-ended explanations for distractor choices. The data have enabled us to address curriculum reform goals of 1) reconciling student learning with our expectations, 2) correlating student learning with background variables, 3) understanding student learning across institutions, 4) measuring the effect of teaching techniques on student learning, and 5) demonstrating how our courses collectively form a learning progression. The analysis of the concept inventory data has anchored and deepened the team's discussions of student learning. Reading and discussing students' responses revealed the gap between our understanding and the students' understanding. We provide evidence to support the use of the concept inventory both as a tool for assessing student understanding of HPI concepts and as a vehicle for faculty professional development.


As faculty members at a research university with expertise in Host Pathogen Interaction (HPI), we have established a faculty community to better understand student learning, facilitate faculty development, and effect evidence-based curriculum reform in a set of nine linked microbiology courses (Table 1). Our goal is to improve our students' learning experience. We want our students to learn science in a deep and meaningful manner, acquire conceptual understandings (Anderson and Schonborn, 2008; Schonborn and Anderson, 2008) of important HPI concepts at an appropriate level (introductory versus upper-level courses), and be able to apply HPI concepts within the context of their own motivations for taking the courses (e.g., program requirement or future professional pathways, such as graduate school, medical school, or nursing).

Table 1.
HPI undergraduate courses at University of Maryland

For several decades, the science education community has stressed the importance of teaching and learning in meaningful ways (e.g., Anderson and Schonborn, 2008; Ausubel, 1968; Mayer, 2002; Schonborn and Anderson, 2008). Although there are many definitions and interpretations of meaningful learning, we favor the definition in which meaningful learning results when students are first introduced to concepts in a simple, intuitive manner and then are challenged to connect learned concepts with newly presented information (Bruner, 1960). We believe students should be able to transfer understanding to a higher level in the form of a “learning progression” or spiral curriculum (Bruner, 1960; Mayer, 2002; National Assessment of Educational Progress [NAEP], 2006; National Research Council [NRC], 2006; Smith et al., 2006; Duschl et al., 2007; Anderson and Schonborn, 2008; Schonborn and Anderson, 2008).

As we move forward in improving our students' learning experiences, we recognize the need for reliable and easy-to-use assessment tools that measure student understanding and provide evidence for the need to change the curriculum. Concept inventories have been used by many science education researchers as pre- and postmeasures to gauge student learning of fundamental concepts in specific courses (Michael et al., 2008; Bioliteracy: Conceptual Assessment, Course & Curricular Design in the Biological Sciences). Several groups have been working to identify the fundamental concepts representative of various science disciplines and to design questions that target student understanding of these concepts (e.g., Hestenes and Wells, 1992; Hestenes et al., 1992; Odom and Barrow, 1995; Hake, 1998; Anderson et al., 2002; Mulford and Robinson, 2002; Khodor et al., 2004; Garvin-Doxas et al., 2007; Garvin-Doxas and Klymkowsky, 2008; Smith et al., 2008).

Here we describe the use of a concept inventory as a lever for evidence-based curriculum reform and faculty professional development. To guide our use of the HPI Concept Inventory (HPI-CI) (Marbach-Ad et al., 2009) we developed seven goals with related research questions (Table 2) and used multi-year HPI-CI data across all courses to address these questions. We reconciled student HPI-CI scores with our expectations of student learning, investigated the possible correlation between background variables (major, ethnicity, gender, year in school, previous course work) and student learning, compared student performance across institutions, assessed how changes in our teaching techniques affected student performance, and tracked HPI-CI data across all courses in our program. The HPI-CI was particularly useful in challenging beliefs that faculty had about student learning in their courses and spurring deep conversations about student learning.

Table 2.
List of goals and research questions addressed using the Concept Inventory

Using the HPI-CI for nine courses allowed us to look at student growth in understanding over time as students progressed through our program. Discussing student performance as a group allowed us to think seriously about outcomes-based assessment, about how we truly know what students learn in our courses, and about how effective our teaching is.

The communal analysis of the HPI-CI data contributed significantly to our professional development and has been the most significant motivator of faculty change since the establishment of our group in 2004. It has been suggested that “to transform a culture, the people affected by the change must be involved in creating that change” (Brigham, 1996, p. 28). Silverthorn et al. (2006) particularly recommended developing faculty learning communities along the model practiced by Cox (2004), where faculty learning communities provide regular opportunities for faculty to discuss questions that emerge from classroom teaching experiences, receive coaching and feedback on experiments with instructional strategies, address assessment issues, and interact within a supportive environment with like-minded colleagues. This is the practice that has developed within our community.


Our HPI teaching community was founded on shared research and teaching interests. It has been successful in part because it mirrors our research practice, in which classic research groups meet regularly to share ideas, review data, and discuss current findings (Marbach-Ad et al., 2007). The team has also been successful because we operate as a Community of Practice (COP; Wenger, 1998; Ash et al., 2009), in which each of us brings his or her particular expertise and experiences and we learn from one another. Our team includes 19 members representing all faculty ranks, including those who have primarily teaching responsibilities (lecturers and instructors) and tenured/tenure-track faculty with substantial, externally funded research programs in HPI areas. In addition, the group includes a science education expert and several graduate students with a strong interest in the joint missions of research and teaching. Originally the team was composed solely of University of Maryland (UM) faculty. In 2009, in the interest of broadening our discussion, one faculty member from Montgomery College (MC)—UM's most significant feeder community college—joined the team. (More details about the team members are available on our group's website.) We have responsibility for teaching nine HPI undergraduate courses (Table 1), in which General Microbiology serves as a prerequisite for all eight upper-level courses. These courses are all part of the UM microbiology major, which allows students to select various course options for completing their program requirements. There is no required progression through the upper-level courses; however, advisors suggest the progression listed in Table 1, with students often enrolling in more than one advanced course in each semester of their senior year.


The HPI-CI was developed collaboratively by the HPI teaching team (Marbach-Ad et al., 2009). Our previous papers have described the process through which we validated the tool and each question (Marbach-Ad et al., 2007, 2009). We have used the HPI-CI to assess our program of courses since fall 2006. We administered the HPI-CI as a pre- and postsurvey in four to six courses per year. The HPI-CI is a multiple-choice inventory consisting of 17 questions. Each question is designed to target one or more concepts from a list of 13 concepts previously identified by the group as being the most important for understanding how hosts and pathogens interact (Marbach-Ad et al., 2009). The distractors for each question target students' specific misconceptions of HPI concepts (see Table 5 for an illustrative question). Each multiple-choice question also asks students to explain their selected response, providing us with insight into the students' thought processes.

Table 5.
Example of a data table used by the group to stimulate analysis and discussion of each Concept Inventory question

To validate the questions we used an iterative approach that involved administering the multiple-choice questions and then asking students to explain their response to each question. We reviewed students' responses and used them to evaluate the HPI-CI distractors. As a team we also reviewed the students' explanations that supported each selected response. This process has occurred each semester in which we have administered the HPI-CI as a pre- and postsurvey in our courses. Our original intent was simply to validate the HPI-CI. However, during the process we realized there was value in the iterative review and group discussion of student responses beyond inventory development. We learned a great deal about our students as the process became a faculty development exercise. As we read student responses, we analyzed them in the context of the courses in which the students were enrolled. As we compared how students performed at the beginning and at the end of the various courses, this review process evolved into one for assessing student learning and supporting faculty professional development.

In this article we tell the story and present the data showing how the HPI-CI has influenced the work of our group and how it can serve as a potential model to other programs using a concept inventory to assess student learning and curricular change. We describe our findings according to our set of goals and Research Questions (RQ; Table 2).


Goal 1: To Investigate Whether or Not Students Are Making Significant Progress in Their Learning in Individual Courses, and to Check for Student Retention of Concept Knowledge

RQ 1a: Are Students Making Adequate Progress in Their Learning within Each Course? To examine whether or not students were making significant progress in their learning in individual courses, we reviewed the complete data set without controlling for students' backgrounds or how students responded to individual questions (those tests are described later). For each course we compared pre- and post-HPI-CI mean scores using t tests, with mean scores calculated on a scale of 0 to 100. The results from three years of data collection (2006–2009) are shown in Table 3.

Table 3.
Mean scores (out of a maximum of 100) for pre- and post-Concept Inventory scores for each course over a period of three academic years

We found significant gains in learning for General Microbiology, Pathogenic Microbiology, and Immunology Lecture (Spring 2007 only). For General Microbiology the magnitude of the gain was consistent over the three years, even though the course was taught by different instructors each year. General Microbiology is taught using an active-learning format (Smith et al., 2005). Immunology Lecture is generally the last course taken by students, in the spring of their senior year. The average HPI-CI postscore following the Immunology courses is around 60%. There is a significant improvement in students' scores from the General Microbiology (BSCI 223) presurvey to the Immunology (BSCI 422) postsurvey.
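The pre/post comparison described above amounts to a paired t test on the same students' scores. A minimal sketch follows; the scores are hypothetical illustrations, not actual HPI-CI data, and scipy is assumed to be available.

```python
# Illustrative sketch of a pre/post comparison; scores are hypothetical,
# not actual HPI-CI data.
from scipy import stats

pre = [30, 25, 40, 35, 28, 33, 45, 30]   # presurvey scores (0-100 scale)
post = [55, 50, 62, 58, 49, 60, 70, 52]  # postsurvey scores, same students

# Paired t test: did the same students improve from pre to post?
t_stat, p_value = stats.ttest_rel(post, pre)
mean_gain = sum(post) / len(post) - sum(pre) / len(pre)
print(f"mean gain = {mean_gain:.2f} points, p = {p_value:.4f}")
```

A paired test is the appropriate choice here because each student contributes both a pre- and a postscore; an unpaired test would discard that within-student information.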

RQ 1b. How Well Do Students Retain Concept Knowledge from Previous Courses as They Move to Subsequent Courses? Because there is no required order for taking the advanced courses, we were most interested to see whether the students retained the knowledge from the prerequisite course (General Microbiology) when they moved on to the advanced courses. To examine this we compared mean scores on the General Microbiology postsurvey to the mean presurvey scores of the advanced courses (Table 3). The presurvey scores for advanced courses were similar to (not significantly different from) or higher than the postsurvey scores for the prerequisite General Microbiology course.

Goal 2: To Identify in Which Courses Particular Concepts Are Expected to Be Taught, and at What Depth

RQ 2. Are All Important Concepts Expected to Be Covered Somewhere in the Curriculum, at Sufficient Depth, and in a Logical Order? In reviewing student performance on specific HPI-CI questions, we identified some questions for which students' pre- or postscores were lower or higher than the course instructor expected. As such, the implementation of the HPI-CI stimulated close examination of our curriculum. To address RQ2 we used a modification of Allen's (2003) Curriculum Alignment Matrix (Assessing Academic Programs in Higher Education). The Curriculum Alignment Matrix makes it possible to identify where within a curriculum learning objectives are addressed, and in what depth. We used the tool to investigate the alignment of HPI concepts within our curriculum. For each question, instructors reported 1) their assumptions about student prior knowledge (Yes or No) and 2) the level of topic coverage in their classes (0 = not at all; 1 = briefly; 2 = moderately; 3 = detailed). Table 4 shows our matrix for eight of our courses.

Table 4.
Example of a Curriculum Alignment Matrix for eight of our courses and 16 questions

Building the curriculum matrix according to the HPI-CI questions forced our group to consider the teaching of each HPI concept in every course. Overall, the matrix confirms that each of the concepts assessed by the HPI-CI questions is addressed in the learning goals of at least one of our HPI courses. Discussing the full table during a team meeting informed each instructor about ways that other instructors aim to teach each concept in their courses. For example, we expected that those of us teaching advanced courses would assume that students have prior knowledge of concepts addressed at the “3” level in BSCI 223 (General Microbiology). However, comments from the Microbial Genetics (BSCI 412) instructor indicated that he did not assume prior knowledge for most of the topics reported as taught in General Microbiology (see Table 4, Questions 4, 7, and 10). Table 4 also shows that, in some cases, instructors who teach the same course reported very different levels of coverage for the same topic (e.g., Pathogenic Microbiology, BSCI 424).
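The coverage check the matrix supports, confirming that every concept receives substantive coverage somewhere in the curriculum, can be sketched with a small data structure. The course codes, question labels, and coverage values below are illustrative, not the actual matrix entries.

```python
# Hypothetical fragment of a curriculum alignment matrix.
# Rows = courses, columns = CI questions; coverage levels follow the
# study's scale: 0 = not at all, 1 = briefly, 2 = moderately, 3 = detailed.
coverage = {
    "BSCI223": {"Q4": 3, "Q7": 3, "Q10": 2},
    "BSCI412": {"Q4": 1, "Q7": 0, "Q10": 1},
    "BSCI422": {"Q4": 2, "Q7": 3, "Q10": 3},
}

def uncovered(matrix, threshold=1):
    """Return questions whose coverage never exceeds `threshold` in any course."""
    questions = {q for levels in matrix.values() for q in levels}
    return sorted(
        q for q in questions
        if max(levels.get(q, 0) for levels in matrix.values()) <= threshold
    )

# Questions at risk of falling through the cracks across the whole program:
print(uncovered(coverage))
```

In this fragment every question reaches at least moderate coverage in some course, so the check returns an empty list; a nonempty result would flag a concept needing a home in the curriculum.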

The findings from the curriculum matrix review led us to discuss what we meant by “level of coverage.” We found that instructors have different definitions for “level” of coverage. For example, one instructor of the prerequisite course defined the level of coverage in relation to time spent on the topic in the class and whether or not the concept was included in an active-learning assignment. Another instructor, from one of the advanced courses, explained that “detailed coverage” means the level of complexity in which the topic is discussed, so that the students understand not only the phenomenon that is characterized by the concept but also the chemical reactions and the physical mechanisms behind the phenomenon. These discussions led us to better understand both how concepts are taught by various instructors and what level of knowledge the instructors expect students to bring to their classes.

Accordingly, analysis of Concept Inventory data can be considered a catalyst for in-depth curriculum analysis. By using the Curriculum Alignment Matrix, we were able to analyze our curriculum and the way that we teach particular concepts in both the prerequisite course and the advanced courses. Both this information and a shared, informed view among us as instructors are essential to our goal of creating a learning progression in which students achieve deep learning of HPI concepts.

Goal 3: To Compare the Instructors' Reported Curriculum Coverage with Students' Actual Understanding of Each Concept

RQ 3. Do Students Learn in a Course What We Intend for Them to Learn? We aligned our “assumption of prior knowledge” and “level of coverage” with student performance on each HPI-CI question (see the example from the Immunology lecture, BSCI 422, in Supplemental Material A). Close inspection of the alignments showed that for many questions our expectations matched student performance (Supplemental Material A, Questions 1, 3, 4, and 7–10). However, for other questions, our assumption (either of prior knowledge or of coverage) did not match student pre- and postscores. Again, the HPI-CI results fostered an important discussion. We considered the reasons for students' misunderstanding of concepts that we were confident had been addressed in our courses. We raised questions about whether we used the appropriate teaching method (e.g., demonstrations, case studies, problem solving) to teach the concept, or whether there was another explanation.

One interesting finding was that students' scores on particular postsurvey questions were higher after the introductory course than after an advanced course. For example, in the Microbial Pathogenesis (BSCI 417) course, the survey results for the question presented in Table 5 show that in the presurvey 94.1% of the students selected the correct response, while in the postsurvey only 41.7% of the same students chose the correct response. Through our conversations we realized that in introductory courses science is presented in terms of general rules with examples to support those rules. In advanced courses, those rules are challenged with presentation of exceptions and uncertainties. We suggest that some students may “lose their footing” in advanced courses and become confused by the multiple-choice distractors that target common misconceptions. Consider the question in Table 5. The question asked, “Two roommates fall ill: one has an ear infection and the other has pneumonia. Is it possible that the same causative agent is responsible for both types of disease?” The correct response is “Yes, because the same bacteria can adapt to different surroundings.” In the postsurvey of the Microbial Pathogenesis course, a large fraction of the students (33.3%) selected the distractor, “No, because one infection is in the lung while the other is in the ear.” When we read the students' explanations for the selection of this distractor, we found comments like: “No, same bacterium is not able to attach at both places. The adherence is tissue specific,” and “Tissue tropism.” The Microbial Pathogenesis instructor team explained that students learned about tissue tropism in BSCI 417. Tissue tropism is the idea that a specific pathogen infects a specific type of tissue due to tissue-specific receptors.
To understand the concept(s) targeted by this question students must realize that, although different bacterial pathogens are able to infect various tissues (tropism) leading to distinct diseases at those sites, one pathogen may elicit different diseases depending on its adaptation to distinct environments. The correct answer represents a complete understanding of the overall process of host-pathogen interaction. It seems that as students gain more detailed information on a topic that reveals more intricacies, they have difficulty in making generalizations.

In other advanced courses, we found similar types of advanced learning that seemed to confuse students. Such examples convinced us how important it is to explore the pattern of responses for each individual question rather than concentrating on total scores or on the difference between pre- and postscores.

Goal 4: To Investigate Possible Effects of Background Variables Such as Gender, Ethnicity, GPA, Age, and Previous Education on Student Learning

RQ 4. What Are the Effects of Background Variables Such as Gender, Ethnicity, and Prior Learning? Students come to HPI courses from a wide variety of backgrounds. Supplemental Material B shows the diversity of students in our courses in terms of gender, race, educational background, test scores, and success in college. The HPI-CI served as a tool to examine the effects of these variables and gather evidence that might guide our curriculum reform. For example, using t tests and regression analysis, we noticed that the HPI-CI total pre- and postscores (for all courses) appeared to be lower for females. Surprisingly, males significantly outperformed females on the presurveys (Female = 35.27 ± 17.44; Male = 42.56 ± 19.28) and on the postsurveys (Female = 49.06 ± 14.89; Male = 54.44 ± 16.22). However, further examination showed that the differences in scores between the genders might be linked to other background variables. Indeed, males in our sample were more likely (75.8% of the males) to be majoring in the College of Chemical and Life Sciences (CLFS), while only 55% of the females were CLFS majors. Comparing gender scores only within the cohort of CLFS majors, for example, showed no significant differences in performance on the pre- and postsurveys (postsurvey: Female = 52.94 ± 14.61; Male = 55.66 ± 15.28).
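The confound check described above can be sketched as two t tests: one that pools all students and one restricted to a single major. All numbers below are fabricated for illustration only; in this toy data, as in the study, most males are CLFS majors and most females are not, so the pooled comparison picks up the major effect while the within-major comparison does not.

```python
# Illustrative sketch (fabricated scores) of checking whether an apparent
# gender gap survives once major is held fixed. Not actual HPI-CI data.
from scipy import stats

# postsurvey scores split by (gender, major); group sizes mimic the confound
scores = {
    ("F", "CLFS"):  [52, 55, 50],
    ("M", "CLFS"):  [54, 52, 56, 51, 55, 53],
    ("F", "other"): [40, 38, 44, 37, 41, 39],
    ("M", "other"): [41, 43, 39],
}

# Naive comparison pools majors together and inherits the confound...
f_all = scores[("F", "CLFS")] + scores[("F", "other")]
m_all = scores[("M", "CLFS")] + scores[("M", "other")]
_, p_pooled = stats.ttest_ind(m_all, f_all)

# ...while comparing only within CLFS majors isolates the gender effect.
_, p_within = stats.ttest_ind(scores[("M", "CLFS")], scores[("F", "CLFS")])
print(f"pooled p = {p_pooled:.3f}, within-major p = {p_within:.3f}")
```

A regression with major as a covariate would make the same point more formally; the stratified t test is simply the most transparent version of the check.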

Once again the HPI-CI results stimulated a discussion and further investigations of student performance in our courses. We considered possible explanations for success of majors versus nonmajors. For example, nearly all majors had completed a sophomore Genetics course before taking the General Microbiology course. Review of the Genetics course content led us to assume that this course may have given CLFS majors an advantage. Whatever the reason for the differences between the performance of majors and nonmajors, these findings are consistent with our long-held view that students would be better served if we offered different versions of General Microbiology targeted to students' level of background, preparation, and interest. The findings explained here, based on the HPI-CI, allowed us to give evidence-based support for implementing long-discussed changes in our curriculum and proposing a new version of General Microbiology targeted to Biology majors.

Goal 5: To Compare Outcomes from Different Versions of a Course—For Example, from the Same Course as Offered at Two Different Institutions

RQ 5. Is the HPI-CI a Good Tool to Foster Cross-Institution Conversations on Student Learning? We have established a partnership with MC, one of the largest community colleges in our region, with an annual enrollment of approximately 25,000 students (Maryland Association of Community Colleges, 2010). We are investigating student learning of HPI concepts in the MC General Microbiology (GM) course as one of our HPI courses. MC GM students completed the HPI-CI in the spring and fall of 2009. Comparing UM GM student performance with that of MC GM students revealed that for some questions the percentage of correct answers was higher among MC students, while for other questions it was higher among UM students. This finding led to discussions of learning goals, teaching practices, and student populations at the two institutions. We have also compared the concept coverage goals for each of the courses through the curriculum matrix data (Table 4). It was clear that the instructors emphasized different concepts in their courses. Collaboration and agreement on learning goals between the MC and UM instructors is very important, because the MC GM course is accepted as a prerequisite to our advanced HPI courses. Supplemental Material B shows that at least 20% of the students in each of the UM HPI courses started their higher education studies in a community college. Most of these students come from MC.

Discussions of the HPI-CI data were an avenue toward open communication about learning goals. One outcome of these discussions is a joint curriculum development project. We found a common challenge in engaging students in learning concepts of bacterial growth. From this conversation, the GM instructors at both MC and UM are cooperating to develop student activities that will meet the needs of both student populations. The HPI-CI will be used as one of the assessment measures to determine the impact of this joint effort on student learning of GM concepts and success in upper-level courses. In the future we plan to collaborate with more colleges and universities by sharing the HPI-CI. These collaborations will include a broad discussion of the concepts intended to be covered in their programs as compared with UM's program. Collaboration will also focus on evaluating the level of concept coverage in the different programs.

Goal 6: To Explore the Impact of Number and Combinations of HPI Courses Taken by the Student

RQ 6. Can We Identify Combinations of HPI Courses That Lead to Better Success? The UM curriculum for a biological sciences degree with specialization in microbiology is anchored by General Microbiology as the prerequisite course and is followed by a variety of upper-level courses from which the students may choose. The sequence of upper-level courses is not mandated. We are using the HPI-CI data to identify the optimum path through the microbiology curriculum. HPI-CI data to date show that students who have completed four or more HPI courses have a better understanding of HPI concepts (Figure 1). Using the data set from all students enrolled in all courses from Spring 2008 to Fall 2009, we grouped students into three categories based on their HPI-CI postscores: Lower scores (below 35%), Midscores (35 to 70%), and Higher scores (above 70%). Encouragingly, we found that a higher percentage (41.4%) of students who had completed four or more HPI courses fell within the Higher-scores group, compared with students who had completed fewer than four HPI courses, of whom only 15.4% achieved scores that placed them in the Higher-scores group.

Figure 1.
Concept Inventory scores according to the number of HPI courses taken by the students. Based upon the Concept Inventory scores, students were divided into three groups: Lower scores (students who scored below 35% on the Concept Inventory), Midscores (scores from 35 to 70%), and Higher scores (scores above 70%).

As we move forward with curriculum design the HPI-CI will be useful to help us gauge course sequencing. We will follow students through their four years of undergraduate studies and analyze the impact of the course sequence on students' concept understanding. Based on our findings we will adjust the program curriculum and the recommended course sequence.
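The three-band grouping used in the Goal 6 analysis can be sketched in a few lines; the postscores listed are hypothetical examples, not real student data.

```python
# Minimal sketch of the study's three score bands (thresholds from the text:
# Lower < 35, Mid 35-70, Higher > 70). Scores are hypothetical.
from collections import Counter

def band(score):
    """Map a postsurvey score (0-100) to one of the three bands."""
    if score < 35:
        return "Lower"
    if score <= 70:
        return "Mid"
    return "Higher"

post_scores = [28, 42, 66, 71, 88, 34, 55, 90]  # hypothetical postscores
counts = Counter(band(s) for s in post_scores)
print(counts)  # tally of students per band
```

Tallying the bands separately for students above and below the four-course threshold reproduces the kind of comparison reported in Figure 1.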

Goal 7: To Compare Learning Outcomes for Different Teaching Methods—For Example, One Class Has Exercises That the Other Class Lacks

RQ 7. Do Innovative Teaching Techniques Enhance Concept Learning? As we change our curriculum we are also changing our teaching style. We are working to implement active-learning activities that engage students in research-oriented learning (e.g., Campbell, 2002; Handelsman et al., 2007; Jurkowski et al., 2007; Prince et al., 2007; Ebert-May and Hodder, 2008). Because all faculty members in the Department of Cell Biology and Molecular Genetics who have research expertise in HPI are involved in this project, we have a unique opportunity to infuse our HPI courses with current research problems and scientific approaches. We are planning to use the HPI-CI to assess student progress on specific learning goals targeted by the activities that we develop. We will look at student pre- and postscores and at student performance on particular questions that target the learning goals of each activity.


As a faculty teaching community working with large populations of students in a set of nine interlinked courses, we developed the HPI-CI as a pre- and postmeasure of student learning, with the goal of establishing evidence to drive our curriculum reform. The HPI-CI development process involved iterative review of questions, with detailed analysis of student responses and their explanations.

During the development phase we began to see the value of the HPI-CI in a broader way. With our overall goals of engaging students in deep learning of HPI concepts and developing a learning progression between our courses, the HPI-CI process was informing us in ways we had not expected. We used the HPI-CI qualitative and quantitative data to address seven research goals, and a variety of questions about student learning in HPI courses (Table 2).

From analysis of pre- and postscores (Table 3) in our introductory General Microbiology course, we found that students made significant gains in understanding HPI concepts, independent of course instructor. This level of understanding was maintained—students' postscores from General Microbiology were similar to prescores on advanced courses.

The students' overall gain in understanding, as indicated by comparing presurvey scores from General Microbiology (the prerequisite course) to postsurvey scores after completion of Immunology (the course generally taken last in the sequence), was significant (from ≈30% average score to ≈60% average score, p < 0.001). Although it is typical with concept inventories that average postsurvey scores do not reach 100%, the 60% average score motivated us to probe students' responses for each question. By mapping concepts to course learning goals in a curriculum matrix, we analyzed student performance course by course and question by question. We confronted our expectations with data on student performance. We found that it was very important not to interpret the Concept Inventory results in isolation. We looked at possible confounding variables such as gender and major. For example, student gains in understanding of HPI concepts in General Microbiology were correlated with prior completion of the UM sophomore-level Genetics course. This finding spurred us to discuss a view held by those of us who teach General Microbiology: students with a more sophisticated understanding of genetics are better prepared to learn in General Microbiology. This view has implications for our curriculum design, and the HPI-CI data have prompted us to revisit the prerequisite courses for General Microbiology.

The HPI-CI also proved useful for anchoring discussions of teaching across institutions. Using the HPI-CI as an objective tool, we found an avenue into discussions of learning goals and teaching approaches as we compared the performance of students at UM and at MC. These discussions led us to a common understanding of student needs and, as a result, stimulated a cross-institution curriculum design project.

Wenger (1998) describes the value of communities in addressing a particularly challenging endeavor. The "endeavor" serves as the catalyst for group work, and the solution emerges from the interplay of community members working collaboratively toward a joint solution, or "meaning making." Our endeavor is student learning, and as it turned out, the best catalyst to get us talking was the HPI-CI results. As a teaching community, we found that the HPI-CI anchored and deepened discussions of student learning. Reading students' responses allowed us to see the gap between our understanding and the students' understanding (Anderson and Schonborn, 2008; Schonborn and Anderson, 2008). Confronting our expectations of student learning (Table 4) with student responses challenged us to think and converse reflectively. Working together compounded the value of this process, as each member brought individual perspectives and arguments to the work. The use of the HPI-CI as a broad program assessment tool, and the approach we used to discuss our findings, should be transferable to other groups interested in communal work on curriculum reform.

Supplementary Material



Special thanks to Gustavo C. Cerqueira for support and guidance related to data management. This research was supported in part by a grant to the University of Maryland from the Howard Hughes Medical Institute through the Undergraduate Science Education Program, and by a grant from the National Science Foundation, Division of Undergraduate Education, DUE0837515 CCLI-type 1 (Exploratory). This work has been approved by the Institutional Review Board as IRB 060140.


Allen, 2003. Allen M. J. Assessing Academic Programs in Higher Education. New York: Jossey-Bass; 2003.
Anderson et al., 2002. Anderson D. L., Fisher K. M., Norman G. J. Development and evaluation of the conceptual inventory of natural selection. J. Res. Sci. Teach. 2002;39:952–978.
Anderson and Schonborn, 2008. Anderson T. R., Schonborn K. J. Bridging the educational research-teaching practice gap. Conceptual understanding, part 1: the multifaceted nature of expert knowledge. Biochem. Mol. Biol. Educ. 2008;36:309–315.
Ash et al., 2009. Ash D., Brown C., Kluger-Bell B., Hunter L. Creating hybrid communities using inquiry as professional development for college science faculty. J. Coll. Sci. Teach. 2009;38:68–74.
Ausubel, 1968. Ausubel D. P. Educational Psychology: A Cognitive View. New York: Holt, Rinehart and Winston; 1968.
Brigham, 1996. Brigham S. E. Large-scale events: new ways of working across the organization. Change. 1996;28:28–37.
Bruner, 1960. Bruner J. The Process of Education. Cambridge, MA: Harvard University Press; 1960.
Campbell, 2002. Campbell A. M. Meeting report: genomics in the undergraduate curriculum—rocket science or basic science? CBE Life Sci. Educ. 2002;1:70–72.
Cox, 2004. Cox M. D. Introduction to faculty learning communities. New Directions for Teaching and Learning. 2004;97:5–23.
Duschl et al., 2007. Duschl R. A., Schweingruber H. A., Shouse A. W., editors. Taking Science to School: Learning and Teaching Science in Grades K-8. Washington, DC: National Academies Press; 2007.
Ebert-May and Hodder, 2008. Ebert-May D., Hodder J., editors. Pathways to Scientific Teaching. Sunderland, MA: Sinauer Associates, Inc.; 2008.
Garvin-Doxas and Klymkowsky, 2008. Garvin-Doxas K., Klymkowsky M. W. Understanding randomness and its impact on student learning: lessons from the Biology Concept Inventory (BCI). CBE Life Sci. Educ. 2008;7:227–233.
Garvin-Doxas et al., 2007. Garvin-Doxas K., Klymkowsky K. M., Elrod S. Building, using, and maximizing the impact of concept inventories in the biological sciences: report on a National Science Foundation–sponsored conference on the construction of concept inventories in the biological sciences. CBE Life Sci. Educ. 2007;6:277–282.
Hake, 1998. Hake R. R. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am. J. Phys. 1998;66:64–74.
Handelsman et al., 2007. Handelsman J., Miller S., Pfund C. Scientific Teaching. New York: W. H. Freeman & Co., in collaboration with Roberts and Company Publishers; 2007.
Hestenes and Wells, 1992. Hestenes D., Wells M. A mechanics baseline test. Phys. Teach. 1992;30:159.
Hestenes et al., 1992. Hestenes D., Wells M., Swackhamer G. Force concept inventory. Phys. Teach. 1992;30:141–158.
Jurkowski et al., 2007. Jurkowski A., Reid A. H., Labov J. B. Metagenomics: a call for bringing a new science into the classroom (while it's still new). CBE Life Sci. Educ. 2007;6:260–265. [accessed 12 May 2008].
Khodor et al., 2004. Khodor J., Gould Halme D., Walker G. C. A hierarchical biology concept framework: a tool for course design. CBE Life Sci. Educ. 2004;3:111–121. [accessed 12 May 2008].
Marbach-Ad et al., 2007. Marbach-Ad G., et al. A faculty team works to create content linkages among various courses to increase meaningful learning of targeted concepts of microbiology. CBE Life Sci. Educ. 2007;6:155–162.
Marbach-Ad et al., 2009. Marbach-Ad G., et al. Assessing student understanding of host pathogen interactions using a concept inventory. J. Microbiol. Biol. Educ. 2009;10:43–50.
Maryland Association of Community Colleges, 2010. Maryland Association of Community Colleges. 2010. [accessed 31 April 2010].
Mayer, 2002. Mayer R. E. Rote versus meaningful learning. Theory into Practice. 2002;41:226–232.
Michael et al., 2008. Michael J., McFarland J., Wright A. The second Conceptual Assessment in the Biological Sciences workshop. Adv. Physiol. Educ. 2008;32:248–251.
Mulford and Robinson, 2002. Mulford D. R., Robinson W. R. An inventory for alternate conceptions among first-semester general chemistry students. J. Chem. Educ. 2002;79:739–774.
National Assessment of Educational Progress, 2006. National Assessment of Educational Progress. Science Assessment and Item Specifications for the 2009 National Assessment of Educational Progress. Washington, DC: National Assessment Governing Board; 2006.
National Research Council, 2006. National Research Council. Taking Science to School: Learning and Teaching Science in Grades K-8. Washington, DC: National Academies Press; 2006.
Odom and Barrow, 1995. Odom A. L., Barrow L. H. Development and application of a two-tier diagnostic test measuring college biology students' understanding of diffusion and osmosis after a course of instruction. J. Res. Sci. Teach. 1995;32:45–61.
Prince et al., 2007. Prince M. J., Felder R. M., Brent R. Does faculty research improve undergraduate teaching? An analysis of existing and potential synergies. J. Eng. Educ. 2007;96:283–294.
Schonborn and Anderson, 2008. Schonborn K. J., Anderson T. R. Bridging the educational research-teaching practice gap. Conceptual understanding, part 2: assessing and developing student knowledge. Biochem. Mol. Biol. Educ. 2008;36:372–379.
Silverthorn et al., 2006. Silverthorn D. U., Thorn P. M., Svinicki M. D. It's difficult to change the way we teach: lessons from the Integrative Themes in Physiology curriculum module project. Adv. Physiol. Educ. 2006;30:204–214.
Smith et al., 2005. Smith A. C., Stewart R., Shields P., Hayes-Klosteridis J., Robinson P., Yuan R. Introductory biology courses: a framework to support active learning in large enrollment introductory science courses. Cell Biol. Educ. 2005;4:143–156.
Smith et al., 2006. Smith C., Wiser M., Anderson C. W., Krajcik J. Implications of research on children's learning for assessment: matter and atomic molecular theory. Measurement (Mahwah, NJ). 2006;4:1–98.
Smith et al., 2008. Smith M. K., Wood W. B., Knight J. K. The genetics concept assessment: a new concept inventory for gauging student understanding of genetics. CBE Life Sci. Educ. 2008;7:422–430.
Wenger, 1998. Wenger E. Communities of Practice: Learning, Meaning and Identity. Cambridge, UK: Cambridge University Press; 1998.
