J Chiropr Educ. 2010 Fall; 24(2): 165–174.
PMCID: PMC2967341

Effect of Clinician Feedback Versus Video Self-Assessment in 5th-Year Chiropractic Students on an End-of-Year Communication Skills Examination

Mark D. Hecimovich, MSc, DC, ATC, Jo-Anne Maire, DC, MPhil, MTrainDev, and Barrett Losco, MChiro

Abstract

Purpose:

To compare the effect of two learning opportunities, clinician feedback and video self-assessment, on 5th-year chiropractic students' patient communication skills, specifically those required for history taking.

Methods:

A cohort of 51 final-year students was divided into two groups. The first group received immediate feedback from a clinical supervisor following a history-taking encounter with a patient. The second group performed self-assessments of their videotaped history-taking encounter. An end-of-year Viva Voce examination was used to measure the effectiveness of the students' history-taking skills, using two subscores, one for behavior and another for content, as well as an overall total score. An unpaired t-test was performed to determine whether any significant difference occurred between the two groups of students. Each group was then subdivided into two subgroups based on gender, and a two-way analysis of variance was performed to determine whether the type of feedback or the students' gender had any significant effect on the outcome of the Viva Voce.

Results:

There were no significant differences between the two groups of students in their final Viva Voce scores. After each group was divided into gender subgroups and the results further analyzed, neither the mode of feedback nor the students' gender had any significant effect on the outcome of the Viva Voce.

Conclusion:

This study suggests that, for a mixed cohort, video self-assessment and clinician feedback are equivalent in their ability to enhance students' communication skills relating to history taking.

Key Indexing Terms: Chiropractic, Clinical Competence, Communication, Education

Introduction

Patient communication is a vital clinical skill that has a significant impact on patient care.1 Empirical evidence supports the correlation between effective patient communication and improved health outcomes and health care quality.2, 3 In health education, the development of effective learning opportunities that build proficiency in patient communication skills is an ongoing process. Diverse learning opportunities, such as lectures and small group discussions,4–9 computer-aided learning,10 self-assessment,11 virtual patients, narrative case summaries,12 and various methods of feedback, have been implemented in health care education programs with the aim of building students' patient communication skills. Among these, feedback in its various forms, such as clinician, peer, video, clinical encounter cards, patient, and academic/teacher feedback, is the most widely documented and advocated learning opportunity for building patient communication skills.

Feedback provides students with information intended to narrow the gap between actual and desired performance13, 14 and has been shown to be effective in building communication skills such as patient interviewing and history taking.15, 16 Feedback encourages students to think about their performance and how they might improve17, 18 and, without it, mistakes may go uncorrected, good performance may not be reinforced, and clinical competence is achieved empirically or not at all.19 However, the lack of adequate, effective feedback in the clinical setting, such as the student-clinic internship, has been identified as a major ongoing problem in health care education.19–23 For example, teaching in the clinical setting often occurs at a rapid pace with multiple demands placed on the clinical supervisor; is variable in teaching and learning opportunities because cases vary unpredictably in number, type, and complexity; and has a relative lack of continuity, all of which limit the time available for effective feedback.24 This matters because limited feedback presumably hampers the building of patient communication skills, which in turn may have a profound impact on the quality of patient care. Therefore, it is essential that educators implement effective learning opportunities that can be used in addition to feedback methods in order to enhance students' patient communication skills.

It has been suggested that self-assessment of communication skills using videotape may assist the building of patient communication skills because it provides an opportunity for students to assess themselves and to identify what they did well and which aspects of their communication they could improve.11 Dawson et al25 documented this when they observed students entering their 6th year of medical school using video self-assessment during history taking, patient interviewing, and physical examination procedures. Their results provided evidence that students strongly believed video self-assessment improved their history-taking and interviewing skills. However, there is strong support for the view that student self-assessment does not necessarily correlate with external independent ratings of their communication skills.26–28 Arguably, though, this support comes from studies that utilized closed-ended approaches, which gauge the accuracy of students' self-assessment against the assessment of peers, standardized patients, or expert observers,29–32 but fail to take into account the value of open-ended formats, which encourage deeper reflection and active learning. The open-ended format is a nonthreatening approach because it eliminates concern about the judgment of others. For example, when Zick et al33 assessed the development of communication skills with 1st-year medical students, they were concerned more with the formative value of self-assessment than with the precision of student responses during assessment tasks. In their study they used an open-ended approach that encouraged students to reflect on their behaviors in a way that was meaningful to them. This approach yielded more salient data on students' strengths and weaknesses than data pertaining to a set of specific behaviors on a checklist, and could be used to complement other learning opportunities, such as standardized patient exams, direct observations of encounters with real patients, and collection of feedback from patients.34, 35 The authors concluded that self-assessment using videotaping was a feasible, practical, and informative approach to developing effective patient communication skills because it provided students with an opportunity to review their own behavior and make specific comments supported by tangible examples. It could be assumed, then, that self-assessment with an open-ended approach, coupled with feedback, may assist the building of patient communication skills and therefore provide a way of dealing with the lack of immediate feedback in the academic or clinical setting.

In sum, if effective patient communication is an important element for a health care provider to possess, it is imperative to determine how the diverse learning opportunities offered to students during their education may contribute to its development. This is important because feedback, the learning opportunity most often used to build this skill, often cannot be provided adequately for a variety of reasons. Therefore, it is essential that other learning opportunities be evaluated for their effectiveness in developing patient communication skills in health care students. The aim of this study was to assess two learning opportunities, clinician feedback and video self-assessment, and their effect on students' patient communication skills during their clinical experience. The impetus for this study stemmed from a review of literature supporting the view that there is a lack of adequate feedback during the clinical experience, together with first-hand perceptions of this in our program. If video self-assessment is as effective as clinician feedback for developing students' communication skills, this may imply that it is a valid tool for inclusion in the chiropractic clinical environment.

Methods

Background

Murdoch University School of Chiropractic and Sports Science offers an accredited, 5-year, double bachelor degree in chiropractic, which, on successful completion, allows the student to register and practice as a chiropractor in Australia. Additionally, graduates from the program are eligible to sit registration and/or licensing board examinations globally. Starting in year 4, students commence their clinical experience at the school's on-campus outpatient clinic by observing 5th-year students during all aspects of patient care. They gradually take on more responsibilities so that, by the end of year 4, students take over the caseload of the 5th-year students they have been observing. Once in year 5, students assume responsibility for patient care under the direct supervision of registered chiropractic clinicians (supervising clinicians). Supervising clinicians are an essential constituent of the clinic experience; their responsibilities are to manage patient care and to help the students learn patient care skills by involving them in all phases of care whenever possible. It is also the clinician's responsibility to determine the readiness of students to proceed with the various phases of patient care and to assess and verify students' competence. Clinician feedback is provided after history taking, physical examination procedures, presentation of the management plan, consent procedures, and treatment of the patient; however, it is often limited due to time constraints. To address the issue of limited feedback opportunities, videotaping has also become an essential constituent of the student's clinic experience. Students are required to electronically record (via web-cam and with patient approval) history taking and physical examination and to carry out self-assessments, allowing them to document their views and perceptions of the patient encounters.

Design

This study used a quantitative design to determine the effectiveness of clinician feedback and video self-assessment on the history-taking skills of 5th-year chiropractic students. The 2008 year 5 cohort (N = 51) was randomly divided into two groups: one that received clinician feedback immediately after taking patient histories and one that completed video self-assessments after taking patient histories. The clinician feedback group (n = 26) received feedback after each of three separate patient encounters, while the self-assessment group (n = 25) performed self-assessments of videotaped encounters after each of three separate patient histories. These interventions occurred once each trimester (12-week trimesters) for both groups, from January to September 2008. Following this, each student was required to participate in an assessed Viva Voce examination that included a history-taking component. The content and behavior subscores and the total score for this component were matched against the learning opportunity provided to each student.

Interventions and Instrument of Measurement

This study used two interventions and one instrument of measurement. An end-of-year Viva Voce was used to measure the effectiveness of students' history-taking skills, while the two interventions were clinician feedback and video self-assessment. Each is outlined in the next section.

Clinician Feedback

Three times between January and September, a supervising clinician observed and then provided verbal feedback to a student after a new patient history encounter, using the Patient History Assessment form (Fig. 1). The feedback took place immediately after the encounter. This form is a revised and abbreviated version of the History Taking Checklist used for the History Taking Objective Structured Teaching Experience (OSTE) developed by the Clinical Skills Training and Assessment Program, Office of Medical Education, Indiana University School of Medicine.36 It was used only as an aide-mémoire for the supervising clinician because no actual mark was given.

Figure 1
Patient history assessment form.

Videotape Self-Assessment

Three times between January and September the students videotaped themselves while taking a new patient history and, within a nominated period (maximum 1 week), viewed the recording and used a specific performance evaluation template (Fig. 2) that was designed to guide students through an evaluation of their history-taking performance.

Figure 2
Performance evaluation template.

Viva Voce

In October, at completion of their clinical experience, all students were assessed with a Viva Voce examination. The history-taking component, which aims to ascertain whether students are competent in history taking, was utilized as the instrument of measurement. Two subscores were used: content and behavior. The content subscore assessed students' ability to elicit relevant information from a standardized patient, while the behavior subscore assessed skills such as active listening, appropriate maintenance of eye contact, recognition of verbal and nonverbal cues and body language, appropriate use of open- and closed-ended questions, and ability to clarify information gathered from the patient. An overall mark, the sum of the two subscores, was given for the history-taking component. The overall history-taking mark and each of the two subscores were used for comparison between the two groups.

Sampling and Recruitment

After receiving ethical approval from Murdoch University's Human Research Ethics Committee for this study, all 5th-year chiropractic students enrolled at Murdoch University School of Chiropractic and Sports Science in the academic session 2008 were invited to participate. All of the 51 students (21 male and 30 female) agreed to participate in this study and were randomly placed into either the clinician feedback group (group A) or the video self-assessment group (group B). Group A had 26 participants (12 male, 14 female) and group B had 25 participants (9 male, 16 female). Age and educational background were not analyzed for this study.

Patients who agreed to have their encounters video recorded for this study signed a patient consent form authorizing the recording. All patients who agreed to be recorded were independent and aged 18 or older.

Data Analysis

Data were analyzed using the GraphPad Prism 4.1 statistical analysis program (GraphPad Software, La Jolla, California). An unpaired t-test was utilized to determine whether any significant difference in the students' overall results occurred between the two groups as a whole. During the course of the project, a further question arose as to whether the students' gender, in conjunction with the mode of feedback, might have an effect on the final result of the Viva Voce. As a result, each group was divided into two subgroups according to student gender. A two-way analysis of variance (ANOVA) was utilized, with significance set at p < .05, as there were two independent variables (mode of feedback and gender) and one dependent variable (final scores). The two-way ANOVA was conducted to determine (1) whether the mode of feedback had the same effect at all values of gender, (2) whether the mode of feedback had an effect on the final result, and (3) whether the student's gender had an effect on the final result.
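
For illustration only, the following sketch shows how the analyses described above (an unpaired t-test comparing groups A and B, and a two-way ANOVA with mode of feedback and gender as factors) might be reproduced in Python with SciPy and statsmodels rather than GraphPad Prism. The scores, group labels, and gender codes shown are hypothetical placeholders, not data from this study.

import pandas as pd
from scipy import stats
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical Viva Voce total scores (placeholders, not study data)
data = pd.DataFrame({
    "score":  [72, 65, 80, 68, 74, 70, 77, 66, 71, 69, 75, 73],
    "group":  ["A"] * 6 + ["B"] * 6,   # A = clinician feedback, B = video self-assessment
    "gender": ["M", "F", "M", "F", "F", "M"] * 2,
})

# Unpaired (independent-samples) t-test on total scores, group A vs group B
a = data.loc[data["group"] == "A", "score"]
b = data.loc[data["group"] == "B", "score"]
t_stat, p_val = stats.ttest_ind(a, b)
print(f"t = {t_stat:.3f}, p = {p_val:.3f}")

# Two-way ANOVA: mode of feedback and gender as factors, including their interaction
model = ols("score ~ C(group) * C(gender)", data=data).fit()
print(sm.stats.anova_lm(model, typ=2))

A p-value above .05 for any of these terms would correspond to the "no significant effect" findings reported in the Results.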

Results

Group A (clinician feedback) comprised 26 students (12 male, 14 female) and group B (video self-assessment) comprised 25 students (9 male, 16 female). Three scores achieved during the Viva Voce were analyzed, namely total scores, content scores, and behavior scores (Table 1).

Table 1
Descriptive statistics for groups A and B

The results of the unpaired t-test comparing the total scores achieved by group A and group B failed to demonstrate any significant difference between the two groups with a p-value of .4048. A similar result occurred when analyzing the content scores for group A and group B, and when analyzing the behavior scores of group A and group B, where the t-test failed to demonstrate any significant difference between the two groups with p-values of .35 and .48, respectively.

Analysis of the total scores using the two-way ANOVA revealed that the interaction between mode of feedback and gender had no significant effect on the total scores, with a p-value of .44. Similarly, the two-way ANOVA failed to demonstrate any significant effect of feedback mode or gender on the total scores, with p-values of .40 and .66, respectively, suggesting that neither variable had any significant effect on the total scores achieved by the students in the Viva Voce.

Analysis of the content scores using the two-way ANOVA failed to demonstrate any significant interaction between mode of feedback and gender, with a p-value of .42. Further analysis did not demonstrate any significant effect of either mode of feedback or gender on the content scores, with p-values of .61 and .70, respectively, once again suggesting that neither variable had any effect on the content scores achieved during the Viva Voce.

Analysis of the behavior scores failed to demonstrate any significant effect of the interaction between mode of feedback and gender, of mode of feedback alone, or of gender alone, with p-values of .24, .24, and .36, respectively. This suggests that neither the mode of feedback, the gender of the student, nor their interaction had an effect on the overall behavior scores.

Discussion

This study assessed two forms of learning opportunities and their effect on chiropractic students' communication skills: traditional clinician feedback and videotape self-assessment. Video self-assessment without clinician feedback has grown in popularity due to cost and convenience factors; however, its effectiveness in improving skills is questionable because, without clinician feedback, students may not notice behaviors that need improvement.37 Conversely, clinician feedback is strongly theory based and has been shown to reinforce or modify behaviors38 but can also cause demotivation and deterioration in performance if not carefully managed.39 Therefore, either opportunity may be supportive of, or an impediment to, success.

The results from this study suggest, based solely on a single measurement tool, that clinician feedback and video self-assessment were equally effective in the development of overall communication skills. However, a breakdown of the results did demonstrate some interesting findings. For example, although not statistically significant, males who utilized the video self-assessment (group B) had higher behavior scores (mean 67.72) than those who received clinician feedback (group A, mean 45.86). This could be due, in part, to how men respond to negative feedback. According to Roberts and Nolen-Hoeksema40 and Roberts,41 men are influenced more by positive feedback and appear to ignore negative feedback by assuming a self-protective posture and discounting the value of others' evaluation. It may be assumed, then, that the clinician feedback provided to the males in this study was perceived negatively by them and thus not utilized to incorporate change. This would lead to the expectation that the video self-assessment group should have scored better on the Viva Voce assessment than their clinician feedback counterparts. However, psychology and health education literature42, 43 supports the concept that men tend to overestimate their abilities when self-assessing, suggesting instead that the self-assessment group should have documented lower Viva Voce communication scores. This contrast might suggest that clinician feedback is more influential than video self-assessment, especially in the development of communication skills; this would need to be confirmed by future studies.

Another interesting result was the lack of significant difference in total score and its two subcomponents (behavior and content scores) between the men and women who received clinician feedback (group A). This is in contrast to empirical evidence, notably social role theory,41 which posits that women tend to treat feedback as an opportunity to learn more about themselves and are more willing to internalize and apply feedback to future decisions. It might then be expected that, among those who received clinician feedback, the female scores would have been greater than those of the males. Unfortunately, this study did not categorize the type of feedback, positive or negative, that each student received.

Being a single-cohort study, it was underpowered to demonstrate significant differences, which was a notable limitation. Another limitation was its use of a single measurement tool (the Viva Voce) to assess communication skills, an issue that needs to be addressed in future studies. This may be difficult, however, because there is substantial evidence that health care students can be taught communication skills16, 44 but limited empirical evidence supporting effective assessment of these skills.11 The Objective Structured Clinical Examination (OSCE) is commonly used in most health education programs and includes assessment of communication skills. However, Rees et al11 noted that medical students do not value OSCEs and feel they are artificial and contain discrepancies between the assessment and the teaching of communication skills. Additionally, they noted that even when marking sheets were available to students before an assessment, students felt that their patient interactions were being reduced to a series of boxes to be ticked. This may hold true for the Viva Voce as well. A suggestion would be to include some form of formal assessment (OSCE, Viva Voce) but also to rely on clinician and patient assessment of the students' communication skills as they progress through their clinical internship.

The lack of a control group may also be viewed as a limitation; however, this poses a dilemma in clinical education studies. Feedback, or a similar learning opportunity, is typically an integral part of the clinical internship, and removing it in order to create a control group would be questionable: the control group participants would, in essence, be left out of clinical educational opportunities. Another issue that was not included in this study and would need to be addressed is clinician demographics. This could be important because students may respond differently to clinicians of a certain age, clinical experience, gender, or nationality.

In sum, future studies would need to address in greater detail, or expand on, the main components assessed in this study: the clinicians, the video self-assessment, the assessment of communication, and student demographics. First, clinician feedback would need to be documented as either critical or supportive, independently by both the student and the clinician. The demographics of the clinician should be noted in order to document whether they affect how particular students respond to a specific clinician; qualitative data may shed light on this issue. Second, the video self-assessment may also need to be evaluated by the students to see whether they feel comfortable with this type of self-assessment and whether it needs to be formulated differently. If students perceive that it is not a supportive educational opportunity, they may discount its aim of providing a useful means of self-assessment. Additionally, clinician and peer video assessment may benefit students by providing additional feedback and can be incorporated into a busy and often understaffed student clinic. Third, the way in which students' patient communication skills are assessed needs to be expanded. For example, periodic formative evaluations from clinicians, and possibly patients, before a formal assessment such as an OSCE or Viva Voce would lend supportive evidence in accurately documenting change in patient communication skills. Last, students' age, educational background, and other specifics may also be useful to record.

Conclusion

It has been documented that effective patient communication is an essential element for health care providers to possess. Equally, it is essential that academics understand which learning opportunities contribute to its development during a provider's formal training. The results of this study suggest that two clinical learning opportunities aimed at developing patient communication skills, one of which is regularly implemented in clinical education curricula, are essentially equal in effect. Of the two, clinician feedback and student video self-assessment, it is the former that is utilized to a greater extent in clinical education. However, there is often a lack of adequate clinician feedback during the clinical experience, which may have a profound effect on the development of clinical skills such as patient communication. Therefore, the other learning opportunity, video self-assessment, which is easier to implement and contributes equally to the development of patient communication skills, may be confidently added to the clinic experience.

Conflict of Interest

The authors have no conflicts of interest to declare.

Contributor Information

Mark D. Hecimovich, Murdoch University.

Jo-Anne Maire, Murdoch University.

Barrett Losco, Murdoch University.

References

1. Lipkin M. Sisyphus or Pegasus? The physician interviewer in the era of corporatization of care. Ann Intern Med. 1996;124:511–13. [PubMed]
2. Stewart MA. Effective physician-patient communication and health outcomes: a review. Can Med Assoc J. 1995;152:1423–33. [PMC free article] [PubMed]
3. Stewart MA, Brown JB, Boon H, Galadja J, Meredith L, Sangster M. Evidence on patient-doctor communication. Cancer Prev Control. 1999;2:25–30. [PubMed]
4. Wolf F, Wooliscroft J, Calhoun J, Boxer G. A controlled experiment in teaching students to respond to patients' emotional concerns. J Med Educ. 1987;62:25–34. [PubMed]
5. Evans B, Stanley R, Burrows G, Sweet B. Lectures and skills workshops as teaching formats in a history-taking skills course for medical students. Med Educ. 1989;23:364–70. [PubMed]
6. Marteau TM, Humphrey C, Matoon G, Kidd J, Lloyd M, Horder J. Factors influencing the communication skills of first-year medical students. Med Educ. 1991;25(2):127–34. [PubMed]
7. Evans B, Stanley R, Mestrovic R, Rose L. Effects of communication skills training on students' diagnostic efficiency. Med Educ. 1991;25:517–26. [PubMed]
8. Evans B, Stanley R, Burrows G. Measuring medical students' empathy skills. Br J Med Psychol. 1993;66:121–33. [PubMed]
9. Campbell E, Weeks C, Walsh R, Sanson-Fisher R. Training medical students in HIV/AIDS test counselling: results of a randomized trial. Med Educ. 1996;30:134–41. [PubMed]
10. Fleetwood J, Vaught W, Feldman D, Gracely E, Kassutto Z, Novack D. MedEthEx online: a computer-based learning program in medical ethics and communication skills. Teach Learn Med. 2000;12:96–104. [PubMed]
11. Rees C, Sheard C, McPherson A. Communication skills assessment: the perception of medical students at the University of Nottingham. Med Educ. 2002;36:868–78. [PubMed]
12. Bearman M, Cesnik B, Liddell M. Random comparison of ‘virtual patients’ models in the context of teaching clinical communication skills. Med Educ. 2001;35:824–32. [PubMed]
13. Ramaprasad A. On the definition of feedback. Behav Sci. 1983;28(4):4–13.
14. Taras M. Summative and formative assessment – some theoretical reflections. Br J Educ Stud. 2005;53:466–78.
15. Yedidia MJ, Gillespie CC, Kachur E. Effect of communications training on medical student performance. JAMA. 2003;290:1157–65. [PubMed]
16. Aspegren K. Teaching and learning communication skills in medicine: a review with quality grading of articles. Med Teach. 1999;21:563–70. [PubMed]
17. Hesketh EA, Laidlaw JM. Developing the teaching instinct: 1: Feedback. Med Teach. 2002;24:245–8. [PubMed]
18. Liberman AS, Liberman M, Steinert Y, McLeod P, Meterissian S. Surgery residents and attending surgeons have different perceptions of feedback. Med Teach. 2005;27:470–7. [PubMed]
19. Ende J. Feedback in clinical medical education. JAMA. 1983;250(6):777–81. [PubMed]
20. Gill DM, Heins M, Jones PB. Perceptions of medical school faculty members and students on clinical clerkship feedback. J Med Educ. 1984;259:856–64. [PubMed]
21. Irby DM. What clinical teachers in medicine need to know. Acad Med. 1994;69:333–42. [PubMed]
22. Smith CS, Irby DM. The role of experience and reflection in ambulatory care education. Acad Med. 1997;72:32–35. [PubMed]
23. Ferenchick G, Simpson D, Blackman J, et al. Strategies for efficient and effective teaching in the ambulatory care setting. Acad Med. 1997;72:277–80. [PubMed]
24. Irby DM. Teaching and learning in ambulatory care settings: a thematic review of the literature. Acad Med. 1995;70:898–931. [PubMed]
25. Dawson PS, Lanphear JH, Cheema MY. Video recording feedback: a feasible and effective approach to teaching history-taking and physical examination skills in undergraduate paediatric medicine. Med Educ. 1998;32:332–36. [PubMed]
26. Tamburrino MD, Lynch DJ, Nagel R, Mangen M. Evaluating empathy in interviewing: comparing self-report with actual behaviour. Teach Learn Med. 1993;5:217–20.
27. Marteau TM, Humphrey C, Matoon G, Kidd J, Lloyd M, Horder J. Factors influencing the communication skills of first-year clinical medical students. Med Educ. 1991;25:127–34. [PubMed]
28. Farnhill D, Hayes SC, Todisco J. Interviewing skills: self-evaluation by medical students. Med Educ. 1997;31:122–7. [PubMed]
29. Gruppen LD, Garcia J, Grum CM, et al. Medical students’ self-assessment accuracy in communication skills. Acad Med. 1997;72:S57–59. [PubMed]
30. Martin D, Regehr G, Hodges B, McNaughton N. Using videotaped benchmarks to improve the self-assessment ability of family practice residents. Acad Med. 1998;73:1201–6. [PubMed]
31. Rudy DW, Fejfar MC, Griffith CH, Wilson JF. Self- and peer assessment in a first-year communication and interviewing course. Eval Health Prof. 2001;24:436–45. [PubMed]
32. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094–102. [PubMed]
33. Zick A, Granieri M, Makoul G. First-year medical students' assessment of their own communication skills: a video-based, open-ended approach. Patient Educ Couns. 2007;68:161–6. [PubMed]
34. Makoul G. Communication skills: how simulation training supplements experiential and humanist learning. Acad Med. 2006;81:271–4. [PubMed]
35. Duffy FD, Gordon GH, Whelan G, et al. Assessing competence in communication and interpersonal skills: the Kalamazoo II report. Acad Med. 2004;79:495–507. [PubMed]
36. Indiana University School of Medicine, Office of Medical Education. Clinical Skills Training and Assessment Program. Indianapolis. Available at: http://meca.iusm.iu.edu. Accessed Jan. 10, 2008.
37. Medina MS. Providing feedback to enhance pharmacy students' performance. Am J Health-Syst Pharm. 2007;64:2543–5. [PubMed]
38. Thorndike EL. Human learning. New York: Century; 1931.
39. Kluger AN, DeNisi A. The effects of feedback interventions on performance: a historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychol Bull. 1996;119:254–84.
40. Roberts TA, Nolen-Hoeksema S. Gender comparisons in responsiveness to the evaluations in achievement settings. Psychol Women Q. 1994;18(2):221–40.
41. Roberts TA. Gender and the influence of evaluations on self-assessment in achievement settings. Psychol Bull. 1991;109(2):297–308. [PubMed]
42. Beyer S, Bowden EH. Gender differences in self-perceptions: convergent evidence from three measures of accuracy and bias. Pers Soc Psychol Bull. 1997;23:157–72.
43. Lind DS, Rekkas S, Bui V, Lam T, Beierle E, Copeland EM. Competency-based student self-assessment on a surgery rotation. J Surg Res. 2002;105:31–34. [PubMed]
44. Maguire P, Pitceathly C. Key communication skills and how to acquire them. Br Med J. 2002;325:697–700. [PMC free article] [PubMed]
