CBE Life Sci Educ. 2010 Winter; 9(4): 473–481.
PMCID: PMC2995765

Learn before Lecture: A Strategy That Improves Learning Outcomes in a Large Introductory Biology Class

Barbara Wakimoto, Monitoring Editor


Actively engaging students in lecture has been shown to increase learning gains. To create time for active learning without displacing content, we used two strategies for introducing material before class in a large introductory biology course. Four to five slides from 2007/8 were removed from each of three lectures in 2009, and the information was introduced in preclass worksheets or narrated PowerPoint videos. In class, time created by shifting lecture material to learn before lecture (LBL) assignments was used to engage students in application of their new knowledge. Learning was evaluated by comparing student performance in 2009 versus 2007/8 on LBL-related question pairs, matched by level and format. The percentage of students who correctly answered five of six LBL-related exam questions was significantly higher (p < 0.001) in 2009 versus 2007/8. The mean increase in performance was 21% across the six LBL-related questions compared with <3% on all non-LBL exam questions. The worksheet and video LBL formats were equally effective based on a cross-over experimental design. These results demonstrate that LBLs combined with interactive exercises can be implemented incrementally and result in significant increases in learning gains in large introductory biology classes.


The traditional teaching/learning pattern in a large lecture course begins with faculty introducing new material in each class and students reviewing the information at a later time, followed by a summative exam to assess student understanding. Higher education practices that promote learning through active engagement in class have been shown to improve student performance (Hake, 1998; Knight and Wood, 2005; Michael, 2006; Freeman et al., 2007; Chaplin, 2009). We found that creating a more learner-centered environment in our large introductory biology course by replacing a small amount of new information in each class period with active learning exercises increases student engagement, encourages critical thinking, and improves student attitudes (O'Dowd and Aguilar-Roca, 2009). However, typical of introductory biology courses, we cover a large amount of material and have conformed to the traditional pattern of introducing most of it in lecture, thus limiting the time available for active engagement exercises in class. It is clear that if more knowledge-level information could be moved out of lecture, there would be more time in class for active learning. The question is, do students learn material as effectively if the first exposure is moved out of the classroom and time in lecture is devoted to teaching higher-order thinking?

There is mounting evidence that a variety of preclass activities that introduce new material can increase student performance compared with traditional lectures. Preclass online quizzes that encourage students to complete preparatory reading have resulted in improved exam performance (Narloch et al., 2006; Dobson, 2008; Johnson and Kiviniemi, 2009). “Just-in-time teaching” (JiTT), a pedagogical strategy first developed for use in physics, uses preclass assignments to prompt thinking about the upcoming lecture topic (Novak and Patterson, 2000). Students submit answers to several questions before class, often addressing common misconceptions, and the instructor uses the students' prelecture quiz answers to make last-minute adjustments to the day's lecture material. The JiTT strategy has been successfully adopted for use in a broad range of disciplines (Novak et al., 1999, 2004). A study done in an economics course showed that completing JiTT exercises before class resulted in a small positive effect on student exam performance (Simkins and Maier, 2004). Results from a large nonmajors biology course suggest that students prefer JiTT to traditional lectures (Marrs and Novak, 2004). These studies are encouraging, but the JiTT strategy, as the name implies, involves substantial investment on the part of the faculty member to evaluate student responses and revise lecture material within hours of lecture. Faculty teaching large introductory classes, particularly at research universities, have little appetite for making major revisions to their traditional lecture classes in the face of limited time, resources, and reward (Justice et al., 2009).
A less comprehensive but easier to implement alternative is to move some knowledge-level material from lecture to preclass exposure so students arrive prepared to grapple with higher-level concepts during engagement exercises already scheduled into the class. Theoretically, this could be accomplished by students completing assigned readings before class but in our experience this rarely occurs, even when coupled with preclass quizzes worth a small number of points.

We therefore created preclass assignments designed to help students learn knowledge-level material in preparation for lecture. The assignments were presented in one of two formats: a narrated PowerPoint video with a notetaking sheet or a one-page worksheet. These had to be turned in but did not involve instructor review of student responses. The learn before lecture (LBL) assignments were coupled with in-class active learning exercises in which students were guided in applying their new knowledge to solve higher-level problems. To determine whether this strategy was effective in increasing learning, LBLs were included in a small number of lectures (three of 30 periods) in 2009 to replace material that was introduced as part of the lecture in 2007/2008. Performance of students on LBL-related exam questions, matched by level and format, in 2009 was compared with 2007/8. Performance on non-LBL exam questions was also compared.



The subjects for this study were students in introductory biology (Bio 93) taught at University of California, Irvine during fall quarter 2007 (Section A), 2008 (Sections A and B), and 2009 (Sections A and B). Students were informed that course material, including student responses on surveys, homework, and exams, would be collected as part of an ongoing science education study. The IRB-approved study information sheet was posted on the course website. Students were given the option of declining (anonymously) to have their data included in the education studies. Students with Family Educational Rights and Privacy Act holds and those under the age of 18 were excluded from the study. The demographic profiles and SAT scores of the 2007, 2008, and 2009 student cohorts were obtained from the registrar without student identifiers. Incoming knowledge level of students in 2008 was compared with 2009 by evaluating their performance on 17 questions administered online in a preclass quiz. The questions were from a multiple-choice Introductory Molecular and Cell Biology Concept Assessment developed at the University of Colorado (Jia Shi, personal communication).

Class Format

The 10-wk course was taught in three 50-min classes per week in a lecture hall (capacity 444) with fixed seating. Each class period was divided into approximately three 10-min PowerPoint-guided lecture segments that delivered content, separated by 5–7-min active-learning exercises. These included problem solving in small groups, clicker questions, class discussion, and/or interactive physical demonstrations. In each year two lecture sections, A and B, were taught on MWF, from 12:00 to 1:00 pm and 1:00 to 2:00 pm, respectively, in the same room, by the same instructor team (O'Dowd and Warrior). All students were required to attend a 50-min weekly discussion section led by a graduate teaching assistant. The graded elements in the class included two in-class quizzes, a midterm exam, and a final exam (worth 85% of the total points). The remaining 15% of the total points possible was associated with online quizzes, clicker and discussion section participation, and LBL activities (2009 only).

Learn before Lecture Procedure

For the 2009 class we used three LBLs to introduce material that was previously covered in one content section (≈10 min of PowerPoint guided lecture) in each of three lectures in 2007/8. The LBLs were available online for download 2 d before the related lecture. To encourage completion of the LBLs, students received credit worth 1% of the total grade for uploading the completed assignment and 0.25% for completing an online quiz covering the material, before the associated class. Credit for the LBL assignment was awarded if legible electronic files were uploaded on or before the deadline (3 or 4 h before class).

We compared the effectiveness of LBLs presented in two distinct formats, a worksheet and a narrated PowerPoint video. The worksheet directed students to complete a short reading assignment and to answer associated questions. Students downloaded the worksheet PDF file, handwrote their answers, and submitted their assignment digitally as a scanned image or digital photo. The PDF file was chosen as the distribution format to minimize “copy and paste” plagiarizing that occurs more easily with Word documents. The second format was a PowerPoint video consisting of the slides that were used in class the previous year, narrated by the same faculty. The videos were made using Camtasia Relay software (TechSmith Corporation), a screen/sound capture program. Students downloaded a PDF of related PowerPoint slides and wrote notes on the slide PDF, as they would for a typical lecture session. They digitally submitted their notes as scanned images or a digital photo. In 2009, students in Section A were assigned the worksheet format for the Organelle LBL (associated with lecture 5) and the video format for the Inheritance LBL (associated with lecture 18). Students in Section B were assigned the video format for the Organelle LBL and the worksheet format for the Inheritance LBL.

Anecdotal evidence from previous classes suggested that students who make a cut-out model of a biological topic perform better on a related exam question than those who work only with a drawing on the same topic. We tested this by making the Cloning LBL (associated with lecture 24) a worksheet activity for both sections, where one group interpreted a drawing and filled in a worksheet and the other cut out and constructed a plasmid. Half of the students in each section were randomly assigned to the drawing or the cut-out-and-construct worksheet LBLs. After each LBL activity, all students completed the same preclass online quiz.

Each LBL homework assignment was associated with an in-class active-learning exercise, where students were presented with an interactive demonstration, clicker questions, problem-based small group discussion, and/or class discussion. Novel research findings on the topic were also presented for one subject. Time for these exercises, ≈10 min, was made by moving the basic textbook level knowledge to the LBLs. The overall experimental design is summarized in Table 1.

Table 1.
Summary of LBL exercises and associated in-class activities

LBL Example: Organelle LBL5: Nucleus and Ribosomes

LBL learning goals are to be able to:

  • Describe the basic structure and function of the nucleus and nucleolus
  • Describe the role of nuclear pores in transport across nuclear membrane
  • Describe the relationship between free and bound ribosomes and proteins they synthesize

In-class Activity on Nuclear Pores (5 min)

  • Clicker questions on transport through nuclear pores requiring students to apply basic understanding of nuclear pore trafficking to consequences of disrupting pore function.
  • Tie this to data from a recent article in Cell (D'Angelo et al., 2009) reporting that as cells age, the lack of renewal of some nuclear pore components leads to loss of pore function. Nuclei in older cells become leaky and proteins normally excluded are found in the nucleus.

In-class Activity on Free versus Bound Ribosomes (5 min)

  • Class demonstration that includes physical representation of mRNAs, ribosomes in the cytosol, and ribosomes attached to the rER. Student volunteers are asked to illustrate how the ribosomes interact with mRNAs encoding secreted and cytosolic proteins.
  • Students answer clicker questions (old exam questions) about which ribosomes are synthesizing proteins that are either cytosolic or secreted/transmembrane.

Matched Question Pairs

The LBLs were spread throughout the quarter and addressed a span of biological topics (organelles, inheritance, and cloning). To determine the efficacy of LBLs in improving learning, we included questions on the final exam that specifically addressed material associated with the LBLs. The experiment included six question pairs: five multiple choice and one free response. One question pair was identical between years (restriction enzymes), but the rest were isomorphic variants between years, matched by topic and presentation (Smith et al., 2009). One question pair (blood type) was nearly identical in terms of the wording of the stem and the answer choices. The ribosome question pair had one variant in 2008 that was compared with two isomorphic variants in 2009, one very similar and the other with changes in the stem and answer choices. The remaining three question pairs (multiple alleles, plasmids, and nuclear transport) had changes in both the stem and the answer choices, but the thought process required to answer the question correctly was the same for each question in the pair. The question pairs were also matched by level on Bloom's scale [i.e., Knowledge (1), Comprehension (2), Application (3); Bloom et al., 1956; Crowe et al., 2008]. Bloom's level rankings were based on consensus among three individuals either teaching or associated with the course, in addition to the author of each question.

Isomorphic Ribosome Questions (see Supplemental Material Appendix A for Complete List of Question Pairs)

Fall 2008 Section A

A free ribosome that binds to an mRNA molecule coding for a lysosomal proton pump in the lysosome membrane will:

  1. cleave off the signal peptide region before starting protein synthesis
  2. bind to the ER and synthesize the protein directly into rER membrane
  3. bind to the ER and synthesize the protein into the rER lumen
  4. synthesize the protein in the cytosol and transport it to the lysosome
  5. synthesize the protein in cytosol and package it in vesicles for transport to lysosome

Fall 2009 Section A

A free ribosome that binds to an mRNA molecule coding for a potassium channel that is located in the membrane of an axon will:

  1. synthesize the protein in the cytosol and insert it directly into the axon membrane
  2. synthesize the protein in the cytosol, package it in a vesicle, and transport to axon
  3. cleave off the signal peptide region before starting protein synthesis
  4. bind to the ER and synthesize the protein into the rER lumen
  5. bind to the ER and synthesize the protein directly into the rER membrane

Fall 2009 Section B

A proton pump molecule that is synthesized for addition to a lysosome membrane will be made by:

  1. a free ribosome which will bind to rough ER and insert it into the rough ER membrane
  2. a free ribosome and inserted directly into a mitochondrion
  3. a free ribosome and released into the cytosol
  4. a bound ribosome on the rough ER and delivered for exocytosis by a kinesin
  5. a free ribosome which will then bind the rough ER and insert it into the lumen

The Point Biserial correlation (PBS) was calculated post hoc as a quantitative measure of the consistency of each LBL-related exam question with the whole test (Ding et al., 2006). The PBS can vary between −1 and 1. A positive PBS indicates that students with high total scores are more likely to answer the question correctly than students with low total scores. A negative PBS indicates that students with low overall scores are more likely to answer the question correctly, which implies potential problems with the wording of the question. In general, values above 0.2 are considered a good indication of a well-written exam question (Ding et al., 2006).

The PBS was calculated as:

PBS = [(X̄1 − X̄) / σx] × √[P / (1 − P)]

where X̄1 is the average total score for those students who answered correctly, X̄ is the average total score for all students, σx is the SD of the total score for all students, and P is the proportion of correct responses. For all topic-matched multiple-choice questions, the PBS values were above 0.3, indicating that the questions in each pair were well matched in their ability to discriminate stronger students from weaker students (Supplemental Material, Appendix A).
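As an illustration, the PBS can be computed directly from the raw item responses and total scores. The function below is a minimal sketch of the formula above; the example arrays are hypothetical, not the study's data:

```python
import numpy as np

def point_biserial(total_scores, answered_correctly):
    """Point-biserial correlation of one exam item with the total exam score.

    total_scores: each student's total score on the exam
    answered_correctly: True/False for each student on this one item
    """
    scores = np.asarray(total_scores, dtype=float)
    correct = np.asarray(answered_correctly, dtype=bool)
    p = correct.mean()            # P: proportion of correct responses
    x1 = scores[correct].mean()   # X̄1: mean total score of correct responders
    x_all = scores.mean()         # X̄: mean total score of all students
    sigma = scores.std()          # σx: SD of total scores (population SD)
    return (x1 - x_all) / sigma * np.sqrt(p / (1 - p))

# Hypothetical scores for four students: the two who answered the item
# correctly also scored higher overall, so the PBS is strongly positive.
pbs = point_biserial([10, 8, 6, 4], [True, True, False, False])
```

In this toy example the item discriminates perfectly by total score, so the PBS comes out well above the 0.2 threshold cited for a well-written question.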

Assessment of Learning Gains

The percentage of students who correctly answered each of the six matched exam question pairs was compared between 2009 (with LBLs) and 2007/8 (no LBLs) students. Students who did not complete an LBL assignment in 2009 were omitted from the analysis of performance on the associated exam questions. Average scores on all non-LBL exam questions were also compared between 2007/8 and 2009. The LBL-related exam questions were worth <8% and the non-LBL exam questions were worth ≥92% of the total exam points in each year.

Student Participation and Feedback

The percentage of students who submitted complete LBL assignments (and LBL-related quizzes) was tracked throughout the quarter. After the course was completed and all final grades were posted, students were asked to complete an anonymous online survey to share their opinion about various aspects of the course, including LBLs.


GraphPad InStat software (v3.1a) was used for statistical analysis. Specific tests applied for each comparison are described along with levels of significance in the text or figure legend.


To assess the effectiveness of moving several content segments covered in class in 2007/8 to preclass LBLs in 2009, we first evaluated the level of participation in the LBLs. Both the worksheet and video LBLs required completion and uploading of an assignment. We found that a small point reward (≈1% of the final grade) was sufficient motivation for at least 90% of the students to upload the assignments for each LBL (Table 2).

Table 2.
Percentage of students awarded points for completing LBL activities and online quizzes of a total of 771 students

Comparison of Student Learning with and without LBLs

Performance on Matched Exam Question Pairs

To determine whether LBL activities affected learning outcomes we included six questions on the 2009 final exams, two related to each of three LBL topics. Of these, five were multiple-choice questions: one that was identical and four that were isomorphic to questions from the 2007 or 2008 final exams. The percentage of students who correctly answered the multiple-choice questions for each topic was significantly higher for students in 2009 compared with 2007/8 (Fisher's exact test, p < 0.001; Figure 1). The performance differences were seen on question pairs at both the level of comprehension (multiple alleles and restriction enzymes) and application (ribosomes, blood type, nuclear transport; Figure 1).
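The year-to-year comparisons of correct-response percentages rest on Fisher's exact test applied to a 2 × 2 table of correct versus incorrect counts per cohort. A sketch of that calculation, using made-up counts rather than the study's actual tallies:

```python
from scipy.stats import fisher_exact

# Rows: cohort; columns: correct vs. incorrect on one matched question.
# These counts are hypothetical, for illustration only.
table = [[600, 109],   # 2009 cohort (with LBLs)
         [500, 354]]   # 2007/8 cohort (no LBLs)
odds_ratio, p_value = fisher_exact(table)
# A p_value below 0.001 would indicate a significant performance difference,
# as reported for five of the six LBL-related question pairs.
```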

Figure 1.
Increase in student performance associated with LBLs. Student performance in 2009 (with LBLs) was significantly higher than 2007/8 (no LBLs) on four isomorphic and one identical (restriction enzymes) multiple-choice exam question pairs. Multiple Alleles ...

The sixth isomorphic question pair was an application-level short answer (plasmids). This question was worth five points on the 2008 and 2009 final exams. There was no significant difference in performance between the two years, with mean scores of 4.0 ± 0.04 (n = 854; 2008) compared with 3.9 ± 0.05 (n = 709; 2009) (Mann–Whitney U, p = 0.7).
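Because the free-response scores are ordinal (0–5 points) rather than normally distributed, the comparison uses the nonparametric Mann–Whitney U test rather than a t test. A sketch with hypothetical per-student scores (not the study's raw data):

```python
from scipy.stats import mannwhitneyu

# Illustrative five-point free-response scores for two cohorts
# (hypothetical values chosen to have similar distributions).
scores_2008 = [5, 4, 4, 3, 5, 4, 2, 5, 3, 4]
scores_2009 = [4, 5, 3, 4, 5, 4, 3, 5, 4, 4]
stat, p_value = mannwhitneyu(scores_2008, scores_2009, alternative="two-sided")
# A large p_value (such as the study's p = 0.7) means no detectable
# difference between the two distributions of scores.
```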

The mean increase in percentage correct calculated for the six matched question pairs was 21.3 ± 7.5% (% correct 2009 − % correct 2007/8).

Overall Academic Performance

To explore the possibility that differences in composition, ability, and/or preparation of the 2009 compared with the 2007/8 students contributed to the significant differences in performance on LBL exam questions, we further characterized the three cohorts. The preclass academic indices of the cohorts were similar (Table 3). There was no difference in performance on the preclass concept assessment between 2008 and 2009 (Table 3; 2007 was not given this test). In addition, the SAT scores were not different across the groups, with the exception of verbal scores in 2007, which were 3% lower than in 2009. The demographics of the groups were also similar, with no significant difference in gender balance or ethnicity (Table 3). There were small variations in the percentage of freshmen and biology majors from year to year.

Table 3.
Academic and demographic data for 2007 (Section A), 2008 (Sections A and B), and 2009 (Sections A and B)

Because the LBL-related exam questions accounted for <8% of the total exam points, the majority of the exam points were related to material presented to students in the same manner (by the same faculty) in all three years. Therefore, we also compared the percentage of total non-LBL exam points earned to determine whether the 2009 students were consistently outperforming the 2007/8 students. This was not the case: the percentage of the non-LBL exam points earned in 2009 was 2.6% higher than in 2008 but 7.5% lower than in 2007 (Figure 2). This range is typical of the year-to-year variability due to slight differences in the difficulty of the non-LBL questions, which are changed each year.

Figure 2.
Performance of students in 2007, 2008, and 2009 on non-LBL exam questions. Distributions represent the percentage of students who earned the indicated percentage of non-LBL exam points in each year. Data were binned in 2% intervals. The mean in 2009 (61.3 ...

The large and significant increase in mean performance on the LBL-related matched question pairs (21%), in contrast to the <3% increase in exam performance on non-LBL questions, together with the similarity in preclass academic indices and composition of the 2007, 2008, and 2009 cohorts, indicates that the majority of the increase in performance is associated with LBL-related learning gains.

Comparison of Different LBL Formats on Exam Performance in 2009

We hypothesized that worksheet activities requiring students to read, answer questions, and/or draw diagrams would result in higher learning gains than watching and taking notes on a narrated PowerPoint video covering the same material. To test this, we used a cross-over design with the LBL on inheritance and the LBL on organelles. There was no difference in performance on the exam questions between students assigned the video version of an LBL and those assigned the worksheet version (Figure 3; Fisher's exact test, p > 0.1). This indicates that both methods of preclass exposure produce similar learning gains.

Figure 3.
Worksheet and video LBLs produced similar learning gains. Comparison of performances on identical 2009 exam questions between video and worksheet LBL versions. Percent correct represents the number of students who answered each question correctly of the ...

For the Cloning LBL, we compared the performance of students who completed the drawing worksheet to those who completed the cut-out worksheet, on a free response exam question that required understanding of restriction mapping (plasmids). Of five possible points, the mean scores for the drawing (3.9 ± 0.07, n = 368) and the cut-out groups (3.9 ± 0.08, n = 341) were not significantly different (Mann-Whitney U, p = 0.69), indicating that both methods are equally effective for student learning.

Student Perception and Behavior

Students were asked to fill out a postclass anonymous survey to gather information on a variety of issues related to the class including their experience with the LBLs. Although there was no reward, 56% of the 2009 students (446/795) completed the survey.

The majority (80%) of students who responded indicated LBLs were helpful in learning the course material (Figure 4). In addition, 73% of the respondents indicated they viewed the LBL material one or more times after the assignment had been turned in. Interestingly, when asked which LBL was more effective in helping them learn the information, 50% (220/446) indicated the worksheets and 50% (218/446) indicated the video format. Students were also given an opportunity to provide additional comments about the LBLs. Of those who added comments (82/446), some were negative (13%), most commonly expressing frustration with the logistics of LBL submission. However, the majority (74%) of comments were positive and pointed out the usefulness of LBLs in preparing for lecture (e.g., “I thought it really gets your mind warmed up before you have a lecture about it”). Other comments indicated students felt the LBLs helped them keep up with, review, and understand the class material.

Figure 4.
Students indicate LBL assignments were helpful in learning course material. Graph shows percentage of students (of a total of 446 respondents) who selected each of the indicated response categories on an anonymous postclass survey.


Our data demonstrate that LBLs, which shifted presentation of new material from lecture to preclass assignments without requiring instructor review, coupled with related participatory exercises in class, resulted in significant increases in student learning gains in a large introductory biology course. Our findings also indicate that the learning gains associated with LBLs did not come at the expense of reduced learning outcomes on non-LBL material. The combination of increased exam performance on LBL-related topics and the high level of student satisfaction encourages us to continue their use in our class.

Assessment Using Matched Pairs of Final-Exam Questions

The LBLs in 2009 were used for only a small number of topics, while the rest were taught with the same strategies used in the previous years. This allowed us to examine the effectiveness of LBLs embedded in the same overall class environment. While it is common to assess learning gains by comparing student performance on identical questions before and after a new teaching strategy, these are most often administered as ungraded surveys at the beginning and the end of a class (Metz, 2008; Perry et al., 2008; Scholer and Hatton, 2008). This was not possible because our study was conducted over three years with different classes. In addition, we wanted to evaluate both LBL and non-LBL course-related learning outcomes under conditions where the performance expectations and motivation were consistent, so comparisons were all based on performance on exam questions. Our standard practice of posting answer keys to all exams precluded extensive use of identical questions from year to year. Therefore only one question pair was identical, and the other five were isomorphic, closely matched by topic, presentation, and Bloom's level (Smith et al., 2009). A post hoc analysis also showed that all questions in each pair were well matched in terms of their ability to discriminate stronger from weaker students, based on PBS values of 0.3 or higher (Ding et al., 2006). This strongly suggests that the large mean increase in performance on the matched final exam question pairs that we report in this study reflects increases in student learning associated with the LBLs. Further support for this conclusion is that there was little or no increase in performance on non-LBL exam questions between 2009 and the previous years, and the cohorts in all years were similar in terms of their demographics and preclass academic indicators.

Choosing Material to Convert to an LBL

Our introductory biology course covers material from DNA to organisms in 10 wk. The goal of moving material out of lecture was not to increase the amount of material covered, as the content is already broad and challenging. Instead, we selected topics where the basic knowledge was explained well in the textbook and developed preclass assignments that guided students through the reading and interpretation of diagrams/figures. The LBL-related questions associated with the largest performance increases were multiple alleles (comprehension level) and nuclear transport (application level). Historically these have been difficult topics for students, and modifications in presentation of the material in lecture over multiple years had resulted in little change in understanding. The only LBL-related question (plasmids) that did not show a significant increase in performance was one that the majority (80%) of students answered correctly before LBLs were introduced, consistent with the interpretation that presentation of this material in lecture was already quite effective. Our data thus indicate that the LBL strategy is most effective for teaching core biological concepts that are challenging for students to learn in a traditional lecture format. The increase in performance on both comprehension- and application-level questions suggests that exposure to new material before class, along with revisiting this material during the more in-depth exploration of the core concepts in class, contributes to the learning gains.

Worksheet and Video Format LBLs

Previous studies have used homework assignments similar to our worksheets and online video lectures to introduce students to new material before class (Walvoord and Anderson, 1998; Collard et al., 2002; Klionsky, 2002; Keefe, 2003; Allen and Tanner, 2005; Day and Foley, 2006). However, there is little information regarding the comparative effectiveness of these methods on student learning. In our study, we found no difference in exam performance between students who completed a worksheet LBL and those who viewed a narrated video LBL on the same topic. This was somewhat surprising because we predicted the worksheet assignments would be more effective as they required more active participation including writing answers to questions and/or making drawings. We also saw no difference in learning between those who completed the cut-out and drawing versions of the third LBL assignment, again contrary to our prediction that actually making a model would be more effective in reinforcing learning than analysis of a drawing.

The data demonstrate that both methods of preclass exposure to the material are equally effective in producing learning gains when coupled with the same in-class activities. However, we were unable to determine what percentage of the learning gains were associated with the preclass exposure as opposed to the in-class activity, as the combination was part of the experimental design. It would be interesting to explore the relative contributions in our setting because a previous study showed that active-learning exercises coupled with online activities were more effective than active exercises alone (Riffell and Sibley, 2005).

Online Submission of LBL Assignments

In large classes like ours, collection and assessment of assignments on paper is not feasible with limited administrative support. Therefore, we chose to have all LBL assignments distributed and submitted electronically. Online assignments can be submitted in multiple formats, including Word documents or online forms (Schaeffer et al., 2001; Allain and Williams, 2006). Our submission procedure required uploading a digital image to our class website drop box, so we were able to assess whether students completed drawings and/or made cut-out models. More than 90% of the students were able to accomplish this task by scanning or photographing their completed worksheets. A small percentage of students uploaded Word documents. In a postclass survey, approximately 5% of the students reported dissatisfaction with the submission procedure, including difficulty in accessing a digital camera or scanner. Virtually all of our students have cell phones with digital cameras, and a number of students used the more recent models to produce adequate images of their assignments. In coming years, high-resolution cell phone cameras will provide most students with a relatively simple way to create digital images of their completed assignments. It is also worth noting that web-based reading quizzes are relatively simple to set up and administer through readily available online course management systems. Although we hypothesize that students learn more when they have to write something out by hand, it would be interesting to test the relative efficacy of LBLs that use completely electronic reading quizzes.


Many faculty resist integration of active learning into their large courses because restructuring can be time-consuming, implementation can be administratively challenging, and there is often little support or reward for these activities, particularly at large research universities (Allen and Tanner, 2005; Justice et al., 2009). Adoption of incremental changes, rather than an abrupt course overhaul, has been advocated as a more feasible strategy for moving toward learner-centered environments in large classes (Bonwell and Eison, 1991; Allen and Tanner, 2005). Our study included only a small number of LBLs, to evaluate the effectiveness of this specific tool as opposed to the effects of a global change in learning environment. The data show that this strategy can be implemented incrementally and still result in significant learning gains. Additional studies will be needed to determine whether increasing the number of LBLs results in similar learning gains for each topic. It is possible that making LBLs a routine part of each lecture would reduce their novelty and potentially their effectiveness. We have found that including a wide variety of teaching strategies helps keep the diverse range of students that populate our introductory classes engaged throughout the quarter.

For implementation, we recommend starting with a small number of topics that students have historically had difficulty grasping from traditional lectures. In our experience, transforming small lecture segments into LBL worksheet activities takes less time than making narrated videos, and because both formats are equally effective, instructors should choose the format that is easiest for them. Once the activities have been developed and implemented, they can be reused and/or revised in the same way faculty would update their established lectures. Administrative burden can be kept to a minimum when assignment distribution, submission, and point allocation are handled through standard electronic course management systems. In class, we do not reteach the material in the LBL but instead focus on exploring the concepts in more depth through active engagement exercises, such as clicker questions and/or interactive demonstrations, that involve the students (O'Dowd and Aguilar-Roca, 2009). Faculty teaching introductory classes who are involved in the generation of primary research data can also provide large numbers of students with meaningful access to the prime commodity of a research university: new knowledge not yet widely available. This may be especially important in the sciences, where early exposure to basic research can lead to improved student success (Seymour et al., 2007; Wischusen and Wischusen, 2007). This approach, which involves a relatively small initial investment, can help busy faculty transform large lecture halls into more active learning environments that support increased student learning gains.

Supplementary Material

[Supplemental Material]


Acknowledgments

We thank Dr. Rahul Warrior for participating in creating and implementing LBLs and all of our students in Bio 93 2007–2009. This research was funded by a grant from the Howard Hughes Medical Institute (HHMI) Professor Program (to D.K.O.).


References

Allain R., Williams T. The effectiveness of online homework in an introductory science class. J. Coll. Sci. Teach. 2006;35:28–30.
Allen D., Tanner K. Infusing active learning into the large-enrollment biology class: seven strategies, from the simple to complex. Cell Biol. Educ. 2005;4:262–268.
Bloom B. S., Krathwohl D. R., Masia B. B. Taxonomy of Educational Objectives: The Classification of Educational Goals. 1st ed. New York: David McKay Company, Inc.; 1956.
Bonwell C. C., Eison J. A. Active Learning: Creating Excitement in the Classroom. ASHE-ERIC Higher Education Report No. 1. Washington, DC: George Washington University, School of Education and Human Development; 1991.
Chaplin S. Assessment of the impact of case studies on student learning gains in an introductory biology course. J. Coll. Sci. Teach. 2009;39:72–79.
Collard D. M., Girardot S. P., Deutsch H. M. From the textbook to the lecture: improving prelecture preparation in organic chemistry. J. Chem. Educ. 2002;79:520–523.
Crowe A., Dirks C., Wenderoth M. P. Biology in Bloom: implementing Bloom's taxonomy to enhance student learning in biology. CBE Life Sci. Educ. 2008;7:368–381.
D'Angelo M. A., Raices M., Panowski S. H., Hetzer M. W. Age-dependent deterioration of nuclear pore complexes causes a loss of nuclear integrity in postmitotic cells. Cell. 2009;136:284–295.
Day J. A., Foley J. D. Evaluating a web lecture intervention in a human–computer interaction course. IEEE Trans. Educ. 2006;49:420–431.
Ding L., Chabay R., Sherwood B., Beichner R. Evaluating an electricity and magnetism assessment tool: Brief Electricity and Magnetism Assessment. Phys. Rev. ST Phys. Educ. Res. 2006;2:1–7.
Dobson J. L. The use of formative online quizzes to enhance class preparation and scores on summative exams. Adv. Physiol. Educ. 2008;32:297.
Freeman S., O'Connor E., Parks J. W., Cunningham M., Hurley D., Haak D., Dirks C., Wenderoth M. P. Prescribed active learning increases performance in introductory biology. CBE Life Sci. Educ. 2007;6:132–139.
Hake R. R. Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am. J. Phys. 1998;66:64–74.
Johnson B. C., Kiviniemi M. T. The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teach. Psychol. 2009;36:33–37.
Justice C., Rice J., Roy D., Hudspith B., Jenkins H. Inquiry-based learning in higher education: administrators' perspectives on integrating inquiry pedagogy into the curriculum. High. Educ. 2009;58:841–855.
Keefe T. Enhancing a Face-to-Face Course with Online Lectures: Instructional and Pedagogical Issues. Proceedings of the Annual Mid-South Instructional Technology Conference; Murfreesboro, TN: Middle Tennessee State University; 2003.
Klionsky D. J. Constructing knowledge in the lecture hall. J. Coll. Sci. Teach. 2002;31:246–251.
Knight J. K., Wood W. B. Teaching more by lecturing less. Cell Biol. Educ. 2005;4:298–310.
Marrs K. A., Novak G. Just-in-time teaching in biology: creating an active learner classroom using the internet. CBE Life Sci. Educ. 2004;3:49.
Metz A. M. Teaching statistics in biology: using inquiry-based learning to strengthen understanding of statistical analysis in biology laboratory courses. CBE Life Sci. Educ. 2008;7:317–326.
Michael J. Where's the evidence that active learning works? Adv. Physiol. Educ. 2006;30:159–167.
Narloch R., Garbin C. P., Turnage K. D. Benefits of prelecture quizzes. Teach. Psychol. 2006;33:109–112.
Novak G. M., Gavrin A., Patterson E. T. Just-in-Time Teaching. 2004 (last modified 2006; accessed August 2010).
Novak G. M., Gavrin A., Wolfgang C. Just-in-Time Teaching: Blending Active Learning with Web Technology. Upper Saddle River, NJ: Prentice Hall; 1999.
Novak G. M., Patterson E. T. The Best of Both Worlds: WWW Enhanced In-Class Instruction. Proceedings of the IASTED International Conference on Computers and Advanced Technology in Education; Calgary, AB: ACTA Press; 2000.
O'Dowd D. K., Aguilar-Roca N. Garage demos: using physical models to illustrate dynamic aspects of microscopic biological processes. CBE Life Sci. Educ. 2009;8:118–122.
Perry J., Meir E., Herron J. C., Maruca S., Stal D. Evaluating two approaches to helping college students understand evolutionary trees through diagramming tasks. CBE Life Sci. Educ. 2008;7:193–201.
Riffell S., Sibley D. Using web-based instruction to improve large undergraduate biology courses: an evaluation of a hybrid course format. Comput. Educ. 2005;44:217–235.
Schaeffer E., Bhargava T., Nash J., Kerns C., Stocker S. Innovation from within the Box: Evaluation of Online Problem Sets in a Series of Large Lecture Undergraduate Science Courses. Meeting of the American Educational Research Association; 2001.
Scholer A. M., Hatton M. An evaluation of the efficacy of a laboratory exercise on cellular respiration. J. Coll. Sci. Teach. 2008;38:40–45.
Seymour E., Hunter A., Laursen S. L., DeAntoni T. Establishing the benefits of research experiences for undergraduates in the sciences: first findings from a three-year study. Sci. Educ. 2007;88:493–534.
Simkins S., Maier M. Using just-in-time teaching techniques in the principles of economics course. Soc. Sci. Comp. Rev. 2004;22:444.
Smith M. K., Wood W. B., Adams W. K., Wieman C., Knight J. K., Guild N., Su T. T. Why peer discussion improves student performance on in-class concept questions. Science. 2009;323:122.
Walvoord B. E., Anderson V. J. Effective Grading: A Tool for Learning and Assessment. 1st ed. San Francisco, CA: Jossey-Bass Publishers; 1998.
Wischusen S. M., Wischusen E. W. Biology intensive orientation for students (BIOS): a biology "boot camp." CBE Life Sci. Educ. 2007;6:172–178.
