Addict Behav. Author manuscript; available in PMC 2010 September 1.
PMCID: PMC2697274
NIHMSID: NIHMS113244

Psychometric Properties of the Peer Proficiency Assessment (PEPA): A Tool for Evaluation of Undergraduate Peer Counselors' Motivational Interviewing Fidelity

Abstract

Despite the expanding use of undergraduate student peer counseling interventions aimed at reducing college student drinking, few programs evaluate peer counselors' competency to conduct these interventions. The present research describes the development and psychometric assessment of the Peer Proficiency Assessment (PEPA), a new tool for examining Motivational Interviewing adherence in interventions delivered by undergraduate student peers. Twenty peer-delivered alcohol intervention sessions were evaluated by master and undergraduate student coders using a cross-validation design. Assessments revealed high inter-rater reliability between student and master coders and good correlations with previously established fidelity tools. Findings lend support for the use of the PEPA to examine peer counselor competency. The PEPA, training for its use, inter-rater reliability, construct and predictive validity, and tool usefulness are described.

Keywords: Peer Counseling, Motivational Interviewing, Alcohol, College Students

1. Introduction

Alcohol use is routinely cited by researchers as a concern at U.S. colleges, leading university administrators to invest time and money toward managing and solving problems associated with drinking (Faden & Baskin, 2002; Johnston, O'Malley, Bachman, & Schulenberg, 2005). One prevention approach consists of using alcohol intervention programs (e.g., Brief Alcohol Screening and Intervention for College Students; BASICS) which incorporate Motivational Interviewing (MI) skills developed by Miller and Rollnick (1991, 2002) to facilitate change and reduce college student drinking behaviors. Initial interventions trained professional counselors in MI to meet individually with undergraduate college students with the aim of reducing alcohol use and associated negative consequences, but programs have since expanded to include interventions delivered by undergraduate student peer counselors (Borsari & Carey, 2000; Borsari & Carey, 2005; Larimer, Turner, Anderson, Fader, Kilmer, Palmer et al., 2001; Marlatt, Baer, Kivlahan, Dimeff, Larimer, Quigley et al., 1998; Turrisi, Larimer, Mallett, Kilmer, Ray, Mastroleo et al., under review). These empirically supported treatments offer tremendous promise for reducing college student alcohol use and associated negative consequences (Larimer & Cronce, 2002, 2007). However, the cost of using professional staff to conduct these interventions may limit implementing such programs on a wide scale.

More recently, there has been a growing body of research examining the efficacy of undergraduate peer-delivered MI feedback sessions for students mandated for alcohol counseling (see Larimer et al., 2001; Larimer & Cronce, 2002, 2007). Past empirical studies have evaluated undergraduate student counselors' ability to conduct MI interventions with adherence through tools such as the Motivational Interviewing Skills Code (MISC; Miller, 2000), the Motivational Interviewing Treatment Integrity code (MITI; Moyers, Martin, Manuel, & Miller, 2003), and the Motivational Interviewing Supervision and Training Scale (MISTS; Madson, Campbell, Barrett, Brondino, & Melchert, 2005). In contrast, when this intervention approach has been implemented in traditional practice settings across college campuses, a lack of consistent peer training and adherence protocols has been observed (Mastroleo, Mallett, Turrisi, & Ray, 2008). Research evaluating undergraduate peer counselor implementation practices has shown that no minimum level of standardized counselor competency has traditionally been employed (Mastroleo et al., 2008).

Although use of these counselor evaluation tools in empirical studies allows researchers to confirm that an appropriate intervention has been conducted, they may be less suitable for undergraduate peer counselors. First, there are distinguishable clinical skill differences between highly trained professional counselors and undergraduate student counselors, yet the standards for competency are not weighted by counselor type. Second, although each tool offers important components of evaluation for MI adherence or skill acquisition, they have been validated with interventions delivered by counselors with master's or higher training (Tappin, McKay, McIntyre, Gilmour, Cowan, Crawford et al., 2000) and not with undergraduate student counselors. Third, training for use of the MI evaluation tools varies from five hours to three days, while session reviews for intervention integrity range from 20 to 50 minutes. Student affairs professionals responsible for evaluating undergraduate peer-delivered MI fidelity generally have limited time, limited training, varied professional backgrounds, and high staff turnover (Mastroleo et al., 2006). Given these empirical and practical constraints, a tool that is simple to learn, simple to use, and time efficient is needed to evaluate undergraduate peer interventions.

The Peer Proficiency Assessment (PEPA) was developed with these issues in mind and requires only two hours of training time and 15 minutes of session review to effectively examine undergraduate peer MI skill use. Specifically, the design of the tool and the training required to use it effectively demand minimal time and minimal prior experience with MI or with coding peer counseling sessions. To establish this, we used both individuals highly trained in MI and undergraduate students with moderate or limited previous exposure to MI principles to test the ability of newly trained coders to successfully identify MI-consistent and MI-inconsistent behaviors. The focus of this paper is to document the development and validation of the PEPA for evaluating peer counselor MI adherence and skills in both research and practice settings.

2. Method

2.1. Instrument Development

The PEPA was developed on the basis of behavior counts consistent with MI-adherent intervention approaches as defined by Miller and Rollnick (1991, 2002). First, appropriate intervention content and communication skills were identified through a review of literature examining essential components of MI-adherent interventions (Madson et al., 2005; Miller, 2000; Miller & Rollnick, 1991, 2002; Moyers et al., 2003). Specifically, we evaluated current components of the MITI (Moyers et al., 2005), an empirically supported MI fidelity tool previously used in peer-led brief alcohol intervention efficacy trials (e.g., Tollison et al., 2008; Turrisi et al., in press). One component of MITI scoring is the examination of counseling microskills behavior counts to assess MI adherence, in addition to subjective evaluation of counselor empathy and MI spirit.

Based upon the components of the MITI, BASICS session behavior counts (open- and closed-ended questions, simple and complex reflections) were correlated with the global MI scores used for evaluating MI adherence on the MITI (i.e., empathy and spirit; Moyers et al., 2003). Results identified high, positive relationships (all rs > .7) for use of open questions and complex reflections, such that when peer counselors used more of each skill, more adherent scores were recorded. Therefore, use of open- and closed-ended questions, simple reflections (e.g., repeat, rephrase), and complex reflections (e.g., paraphrase, metaphor, double-sided) is identified through analysis of audio-recorded peer counseling sessions. Further, the decision to exclusively examine behavior counts of MI skills was made to capture the specific areas of skill most often displayed in brief, individual feedback interventions. Although past research has examined both behavior counts of MI skills and global scores of empathy and spirit, this was done with professional counselors who completed specific training in MI, followed by evaluation of MI counseling sessions devoid of personalized graphic feedback (Moyers et al., 2005). The differences in training between professional and peer counselors, combined with peer counselors' use of graphic feedback to complete interventions, suggest that subjective global score ratings may offer little added benefit when examining peer counseling motivational enhancement sessions. The PEPA was designed to allow a systematic assessment of fidelity using an objective measure, which may benefit not only less clinically experienced peer counselors but also supervisors who are familiar with MI yet have limited training and experience with the skills necessary to evaluate the fidelity of a session. By evaluating adherence through MI microskills behavior counts, appropriate use of MI skills relevant to brief motivational enhancement interventions can be assured. Definitions of skill behaviors and distinctions between specific types of communication skills are identified in Table 1.

Table 1
PEPA Coding Definitions

Finally, we examined differences across time slices using the MITI. Our initial analysis suggested that the first 15 minutes was comparable to other session segments in which fidelity was assured. In doing so, we evaluated coded behavior counts relative to the timed segment chosen for MITI evaluation and identified that a higher percentage of open questions and complex reflections was used during earlier timed segments (encompassing the first 20-25 minutes of the BASICS intervention). This was likely due to the nature of the BASICS intervention, in which the first 15-20 minutes are used for client exploration of drinking behaviors, beliefs about drinking, and rapport building prior to discussing the participant's personalized graphic feedback. Based upon this information and the nature of the BASICS intervention (motivational enhancement using personalized, graphic feedback), the initial 15 minutes of each session was chosen for session evaluation. As noted, the BASICS intervention is a structured interview in which a personalized feedback sheet is used to guide the intervention. Miller and Rollnick (2002) have defined this type of intervention as motivational enhancement rather than true motivational interviewing due to the structured nature of the conversation. Therefore, when examining a full 50-minute BASICS session, only the first 15-20 minutes most closely employ true MI, identifying this as the most appropriate segment for examining MI adherence. Additionally, all major components of MI are employed during this initial 15 minutes of a BASICS intervention, allowing the fidelity check to capture more MI behaviors rather than components of a feedback intervention.

2.2. Use as a Competency Tool

Miller and Rollnick (2002) identify the use of higher-level counseling skills (i.e., open-ended questions, complex reflections) as facilitating change talk, which in turn moves drinking behaviors toward less hazardous outcomes. In contrast, the use of closed-ended questions and simple reflections has been associated with limited to no change in drinking behaviors (Miller & Rollnick, 2002; Tollison, Lee, Neighbors, Neill, Olson, & Larimer, 2008). The PEPA focuses on the examination of MI skill components, with emphasis on evaluating higher-level counseling skills for intervention fidelity and adherence. Thus, competency to conduct an efficacious peer counseling session was operationally defined as the interaction between peer counselor and client meeting a 1:1 ratio of open to closed questions, a 1:1 ratio of complex to simple reflections, and a 2:1 ratio of reflections to questions. For examples of the constructs scored in the PEPA see Table 1.
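To make this operational definition concrete, the following is a minimal sketch in Python of how the three ratio criteria could be checked against a session's behavior counts; the function and variable names are illustrative only and are not part of the published PEPA materials, and the "at least" reading of each ratio is our assumption.

# Illustrative sketch of the PEPA competency rule described above.
# Names and the >= interpretation of the ratios are assumptions, not the published tool.

def meets_pepa_competency(open_q, closed_q, complex_r, simple_r):
    """Return True if a session meets the operational competency criteria:
    open:closed questions >= 1:1, complex:simple reflections >= 1:1,
    and total reflections : total questions >= 2:1."""
    questions = open_q + closed_q
    reflections = simple_r + complex_r
    return (
        open_q >= closed_q                 # 1:1 open-to-closed questions
        and complex_r >= simple_r          # 1:1 complex-to-simple reflections
        and reflections >= 2 * questions   # 2:1 reflections-to-questions
    )

# Example: 8 open / 5 closed questions, 14 complex / 12 simple reflections
print(meets_pepa_competency(8, 5, 14, 12))  # True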

2.3. Sample Intervention Sessions and Peer Counselors

A random sample of 20 sessions was drawn from 96 audio-recorded peer counseling sessions submitted for intervention fidelity review as part of The GOALS Project (Goals and Options for Alcohol in Life and School), completed at a large northeastern university. The sessions were 50-minute BASICS (Dimeff, Baer, Kivlahan, & Marlatt, 1999) interventions completed over a 4-month period. Peer counselors conducting the interventions were undergraduate students trained through a 15-week course covering MI skills, alcohol information, and college student drinking. MI training was conducted by three members of the GOALS research team. Each trainer had extensive experience both in conducting MI interventions and in training undergraduate peer counselors in MI skills and intervention approaches. Participants (i.e., clients) in the GOALS research study were first-semester undergraduate students randomly selected from the overall university population.

3. Procedures

Tapes were cross-coded using the PEPA and the MITI (Moyers et al., 2003) to evaluate the psychometric properties of the PEPA. Three groups of individuals coded the intervention sessions with the PEPA (master coders, undergraduate peer counselors from GOALS, and undergraduate peer counselors from a northeastern university's alcohol intervention program). Twelve trained coders completed MITI coding, examining random 20-minute segments of every session using the MITI 2.0 (Moyers et al., 2003) coding system. The MITI is used for assessing beginning proficiency in MI via a 7-point Likert-type scale on global measures of empathy and MI spirit, as well as specific in-session behavior counts. Although undergraduate peer counselors were initially trained to beginning proficiency criteria (5.0) on global scores, the mean score across all sessions was 4.57 for facilitator empathy and 4.49 for MI spirit. Facilitators exceeded beginning proficiency criteria on all behavior count ratios. All coders received university training and certification for conducting research with human participants. The coding process, background of coders, and results are discussed in turn.

3.1. PEPA Coding Process

Each coder evaluated the initial 15 minutes of 20 GOALS audio-recorded sessions over a two-week period. Behavior counts of open- and closed-ended questions and simple and complex reflections were recorded on individual PEPA recording sheets (see Table 1 for coding definitions). Tallies were summed, and overall scores for the number of open- and closed-ended questions asked were recorded. Reflections were broken down into categories based on the type of reflection used. Tallies of the various types of simple and complex reflections were first identified and then summed to create overall scores for simple and complex reflections.
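As an illustration only, the sketch below shows how utterance-level tallies from a PEPA recording sheet might be collapsed into the four summary scores. The category groupings follow Table 1, but the code labels, function name, and example data are hypothetical.

from collections import Counter

# Hypothetical utterance codes grouped per Table 1; not the actual PEPA sheet format.
SIMPLE = {"repeat", "rephrase"}
COMPLEX = {"paraphrase", "metaphor", "double_sided"}

def summarize_segment(codes):
    """Collapse utterance-level codes from a 15-minute segment into PEPA summary scores."""
    tally = Counter(codes)
    return {
        "open_questions": tally["open_question"],
        "closed_questions": tally["closed_question"],
        "simple_reflections": sum(tally[c] for c in SIMPLE),
        "complex_reflections": sum(tally[c] for c in COMPLEX),
    }

codes = ["open_question", "paraphrase", "closed_question",
         "repeat", "double_sided", "open_question"]
print(summarize_segment(codes))
# {'open_questions': 2, 'closed_questions': 1, 'simple_reflections': 1, 'complex_reflections': 2}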

3.2.1. Master Coders

Two master coders were used to establish the initial inter-rater reliability of the PEPA for examining peer counseling sessions. These individuals were selected in part because their background and training are consistent with those of individuals typically involved with similar programs (e.g., Larimer et al., 2001; Mastroleo et al., 2006) and also because they were the developers of the PEPA instrument. The first was a Ph.D.-level clinical psychologist at the Prevention Research Center at Penn State University; the second was an A.B.D. doctoral student in Counselor Education and Supervision at Penn State with a master's degree in counseling, credentialed as a Nationally Certified Counselor by the National Board for Certified Counselors. Both had received extensive training in MI skills, had conducted counseling sessions similar to BASICS with college students, had conducted MI trainings with professional and undergraduate (peer) counselors, and had been extensively involved in empirical studies examining the efficacy of undergraduate peer-delivered MI interventions (e.g., Mallett et al., 2007; Mallett et al., 2008; Mastroleo, 2008; Mastroleo & Mallett, 2006).

3.2.2. GOALS coders

GOALS coders (three female, one male) completed extensive training in MI consistent with past research studies examining the effectiveness of BASICS (Dimeff et al., 1999) delivered by college student peer counselors (e.g., Larimer et al., 2001). These individuals had previously completed a 15-week course covering MI skills, alcohol information, and the peer counseling skills needed to conduct a BASICS intervention as part of The GOALS Project. In addition, they received weekly individual and group supervision during implementation of the GOALS intervention. Coders ranged in educational level (one sophomore, two juniors, one senior) and academic major (three psychology, one biobehavioral health).

3.2.3. Alcohol intervention program (AIP) coders

The AIP coders (two females) had previously completed a two-day training on MI skills, alcohol and other drug information, and peer counseling skills to conduct BASICS interventions as part of the Alcohol Intervention Program at a northeastern university, offered through its office of health promotion and education. The training was consistent with typical university programs in which peer-delivered alcohol interventions are used to reduce underage drinking (e.g., Mastroleo et al., 2008). Unlike the GOALS coders, they received no individual or group supervision of their implementation of the MI interventions. Both coders were senior students majoring in psychology and biobehavioral health.

3.2.4. PEPA training and coding procedures

PEPA coder training was conducted by the lead author. The two hours of training included exercises identifying open and closed questions and distinguishing types of reflective responses. Coder trainees completed one supervised coding session in which the training group worked together to identify behavior counts while listening to a sample audio-recorded session. Next, trainees worked independently coding an additional 15-minute segment to solidify their understanding of the coding system and definitions. This second practice coding session was evaluated for consistency with the master coder prior to study initiation. Following training, all GOALS and AIP coders worked independently to code the 20 selected audio-recorded intervention sessions. Coding was completed over a three-week period with no consultation or supervision during coding.

4. Results

4.1. Master Coder Results

The correlations in Tables 2 through 5 reveal strong relationships between master coders' behavior counts for all identified variables using the PEPA. For the components identified as essential to facilitating change talk with clients (i.e., open-ended questions and complex reflections), master coders' PEPA scores were highly correlated (r = .973 and r = .889, respectively).

Table 2
Correlations of Open-Ended Questions
Table 5
Correlations of Closed-Ended Questions
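Each entry in Tables 2 through 5 is a Pearson correlation between two coders' per-session behavior counts across the 20 sessions. A minimal sketch of that computation, using made-up counts rather than the study data, is shown below.

import numpy as np

# Made-up per-session open-question counts for two coders (illustrative only).
coder_a = np.array([8, 5, 11, 7, 9, 6])
coder_b = np.array([7, 5, 12, 7, 8, 6])

# Pearson r across sessions serves as the inter-rater reliability index.
r = np.corrcoef(coder_a, coder_b)[0, 1]
print(f"inter-rater r = {r:.3f}")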

4.2. GOALS and AIP Coder Results

Correlations between master and GOALS coders were analyzed and revealed significant correlations across all variables (Tables 2-5). For the components identified as essential to creating change talk with clients, GOALS and master coder scores for open-ended questions and complex reflections were high and statistically significant. Similar positive results, albeit with slightly lower correlations, were observed between AIP coders and master coders.

4.3. Construct Validity

To examine construct validity, correlational analyses were first conducted between MI adherence scores (as scored on the MITI 2.0; Moyers et al., 2005) and the closed-ended question and simple reflection counts derived from PEPA scoring. Second, correlations were computed between the behavior counts of open- and closed-ended questions derived from PEPA scoring.

First, MI adherence scores from the MITI were correlated with behavior counts on the PEPA. Specifically, the numbers of closed-ended questions and simple reflections were examined, as past research has associated use of these skills with limited reductions in drinking behaviors (Miller & Rollnick, 2002; Tollison et al., 2008). It was expected that the relationship between closed questions and simple reflections on the PEPA and MITI adherence would be low and potentially negative, as use of these skills is inconsistent with high MI fidelity. Correlation scores were averaged and, as expected, ranged from r = -.409 to r = -.031 for closed questions and from r = -.444 to r = -.063 for simple reflections (see Table 6). Overall, master and undergraduate coders demonstrated good construct validity and the ability to evaluate peer counseling sessions for intervention fidelity.

Table 6
Correlations of PEPA Closed-Ended Questions and Simple Reflections with MITI adherence scores

Second, interventions carried out with higher fidelity generally use more open-ended questions and fewer closed-ended questions. Individual correlations for each coder between open- and closed-ended questions across the 20 sessions were first transformed to Fisher's zs, averaged within a coder category (e.g., master coders), and then transformed back to an averaged correlation for that category. These analyses identified non-significant averaged correlations for master coders (r = .25), GOALS coders (r = -.09), and AIP coders (r = .188). The consistent pattern of non-significant correlations across coder groups supports each group's ability to identify that there was no relationship between the number of open and closed questions asked by the MI interventionists during the sessions. This pattern of results would reflect a lack of MI adherence for the sessions, which was in fact the case in the present study: sessions were selected randomly from the larger pool of those available, and only four of the 20 sessions were coded as adherent using the MITI 2.0.
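The averaging step maps each correlation to Fisher's z (z = arctanh r), averages the zs within a coder category, and transforms the mean back with tanh. A small sketch of this step, assuming the per-coder correlations have already been computed (the values shown are illustrative, not the study's), follows.

import math

def average_correlations(rs):
    """Average correlations via Fisher's z: z = atanh(r), take the mean, then tanh back."""
    zs = [math.atanh(r) for r in rs]
    return math.tanh(sum(zs) / len(zs))

# e.g., averaging two coders' open- vs. closed-question correlations (illustrative values)
print(round(average_correlations([0.31, 0.19]), 3))  # -> 0.251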

Taken together, these findings are suggestive of construct validity and the ability to evaluate peer counseling sessions for intervention fidelity using the PEPA.

4.4. Predictive Validity

To assess the ability of PEPA-adherent sessions to predict drinking reductions, we examined the correlations between PEPA scores and client drinking outcomes post-intervention (approximately 3 months). PEPA adherence scores were correlated with changes in drinking behavior as measured by changes in the total reported number of drinks per week. Results indicated a significant correlation (r = .872): as PEPA scores indicated greater MI adherence, drinking at follow-up decreased, supporting the predictive validity of the PEPA. It is important to note that this finding may be somewhat compromised by the nature of the research study from which the audio-recorded BASICS sessions were collected, which tested two interventions: an individual BASICS session and a combined parent intervention and BASICS session.

5. Discussion

With the wide range of peer counseling interventions being conducted on college campuses across the U.S., the ability to evaluate intervention fidelity and peer counselors' ability to conduct appropriate interventions is vital to the continued usefulness of such programs. The PEPA allows examination of peer counseling skill level and the ability to deliver MI interventions with fidelity while identifying a peer counselor's level of competency. Currently, the majority of peer-delivered alcohol interventions use limited or no competency evaluation prior to and throughout intervention implementation (Mastroleo et al., 2008). University administrators are consistently faced with the challenge of identifying effective ways to reduce the harm associated with student drinking, yet few tools allow evaluation of established approaches. The inability to standardize training across campuses and intervention sites underscores the importance of evaluating peer counselor competency to ensure appropriate delivery of interventions. The PEPA provides effective evaluation of students delivering MI-based interventions while examining intervention quality. It offers trainers a quick, easy way to address concerns about peer counselors' ability to conduct interventions with fidelity, and it can help administrators decide which peer counselors best serve their program's needs and expectations.

The validation process of the PEPA is important because, without reliability and validity data, the tool would be of little use. Therefore, three steps were conducted: coding by master coders, by highly trained MI peer counselors (GOALS coders), and by minimally trained MI peer counselors (AIP coders). The authors identified the importance of all three levels of validation due to the wide range of MI training conducted across peer counseling interventions and the variability in experience of potential peer counselors and coders. This variety of coder levels is typical of peer counseling interventions conducted in the U.S. (e.g., Larimer et al., 2001; Mastroleo et al., 2008). It was expected that master coders' scores would be highly correlated due to their extensive experience with MI. High inter-rater reliability was also expected for GOALS coders due to their intense training protocol and the high level of MI skill each had achieved. What was unclear was whether AIP coders would reach the level necessary to establish the PEPA as a useful tool for minimally trained peer counselors with limited training on instrument use. The validation process established that individuals at all levels of MI training are able to use the PEPA. Benefits of the PEPA include its minimal training requirements and the ability to effectively evaluate a peer counseling session in 15 minutes, which offers important utility in establishing competency to deliver appropriate interventions. In sum, the PEPA shows promise as an effective tool for peer counselor competency evaluation with the potential to be used in both research and practice settings.

Despite the noted benefits of the PEPA, it is important to identify limitations of this research and to recognize that the PEPA should not replace other established instruments designed to evaluate MI adherence, such as the MISC or MITI (Miller, 2000; Moyers et al., 2003). The PEPA differs from these instruments in that global MI spirit scores are not evaluated, due to the subjective nature of such evaluations. In addition, the instrument has not been tested at other sites or with a variety of coding teams. This, however, is not a limiting factor to using the tool, as most universities supporting peer counseling programs currently do not use competency tools. The ease of training and use of the PEPA, along with the initial reliability and validity of the tool, offer a simple way to make immediate improvements to peer counseling intervention programs. Since neither the MITI nor the MISC was developed to examine peer counselors, the PEPA fills a gap in the field while working to ensure the continued delivery of MI-adherent interventions.

Although examination of coder correlations identified a small number of outliers, we do not believe this limits the utility of the tool, as most scores were above .75, a level previously identified by Cichetti (1994) as “excellent.” The few outliers observed may have resulted from subjective evaluation of voice intonation in peer counselors' utterances. Although PEPA training was meant to reduce this occurrence, eliminating coder interpretation entirely is not possible. Instead, the authors suggest continued supervision of coders once they have been trained in order to enhance coder reliability. Additionally, past research has identified the use of two primary independent coders with high correlations as sufficient for establishing rating consistency for MI interventions (Bennett, Roberts, Vaughan, Gibbins, & Rouse, 2007). The use of two additional groups of peer counselors was intended to demonstrate the usefulness of the tool with individuals less experienced in MI and to establish the ease of training and use for further competence evaluation.

An additional limitation is that the PEPA examines only the initial 15 minutes of peer counseling sessions; it would therefore be important to examine other session segments to compare adherence over time. Future studies comparing a segment from the initial 15 minutes with an additional 15 minutes from the feedback portion, to ensure MI skills are being used throughout the interview, may enhance confidence in individual peer counselors' MI skills. However, as noted, the initial 15 minutes of a BASICS motivational enhancement intervention most closely mirrors the spirit of MI, making it the most appropriate segment for examining MI fidelity. A 15-minute examination of MI skills also does not ensure that all aspects of the intervention protocol are fulfilled; coding for correct dissemination of alcohol content must also be reviewed, which requires examination of the entire intervention session. This necessity again does not limit the utility of the PEPA but rather identifies its primary role as examining MI adherence in brief interventions. Finally, although scores on the PEPA demonstrated good predictive validity, the nature of the research study from which the BASICS sessions were acquired (the inclusion of a combined parent intervention + BASICS condition) limits the findings. It would be important to replicate these findings with peer counseling BASICS sessions in which the BASICS intervention was evaluated independently of additive interventions.

Overall, the PEPA is a valuable tool for examining peer counselor competency to facilitate an MI session with fidelity. Training is completed in two hours, and session evaluation requires only 15 minutes of session review. Through use of the PEPA, it is believed more efficacious interventions will be administered on college campuses, facilitating a stronger impact on reducing college student drinking concerns. Although the MITI and MISC are incredibly useful tools for examining both intervention fidelity and process research questions, the PEPA fills an area of need for the field of college student drinking research. Use of the PEPA will allow advancement in the field of college student alcohol reduction due to its quick and economical utility while obtaining roughly equivalent adherence information to previously established tools. The PEPA can be easily used in practice settings, which would likely result in university-based interventions more closely mapping onto evidence-based approaches to alcohol use reduction.

Acknowledgements

This research was supported by grant F31 AA 017012 awarded to Nadine Mastroleo and grant R01 AA 12529 awarded to Rob Turrisi, both from the National Institute on Alcohol Abuse and Alcoholism. The authors would like to thank Rachel Bachrach and Katherine Peters for assistance with development of the manuscript.

Table 3
Correlations of Simple Reflections
Table 4
Correlations of Complex Reflections


References

  • Bennett GA, Roberts HA, Vaughan TE, Gibbins JA, Rouse L. Evaluating a method of assessing competence in Motivational Interviewing: A study using simulated patients in the United Kingdom. Addictive Behaviors. 2007;32:69–79. [PubMed]
  • Borsari B, Carey KB. Effects of a brief motivational intervention with college student drinkers. Journal of Consulting and Clinical Psychology. 2000;68(4):728–733. [PubMed]
  • Borsari B, Carey KB. Two brief alcohol interventions for mandated college students. Psychology of Addictive Behaviors. 2005;19(3):296–302. [PMC free article] [PubMed]
  • Cichetti DV. Guidelines, criteria and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment. 1994;6:284–290.
  • Dimeff LA, Baer JS, Kivlahan DR, Marlatt GA. Brief alcohol screening and intervention for college students (BASICS): A harm reduction approach. New York: Guilford Press; 1999.
  • Faden VB, Baskin ML. An evaluation of college online alcohol-policy information. Journal of American College Health. 2002;51:101–107. [PubMed]
  • Johnston LD, O'Malley PM, Bachman JG, Schulenberg JE. Monitoring the Future national results on adolescent drug use: Overview of key findings. National Institute on Drug Abuse; 2005. pp. 1–66.
  • Larimer ME, Cronce JM. Identification, prevention, and treatment: A review of individual-focused strategies to reduce problematic alcohol consumption by college students. Journal of Studies on Alcohol. 2002;63(Suppl 14):148–163. [PubMed]
  • Larimer ME, Cronce JM. Identification, prevention, and treatment revisited: Individual-focused college drinking prevention strategies 1999-2006. Addictive Behaviors. 2007;32:2439–2468. [PubMed]
  • Larimer ME, Turner AP, Anderson BK, Fader JS, Kilmer JR, Palmer RS, Cronce JM. Evaluating a brief alcohol intervention with fraternities. Journal of Studies on Alcohol. 2001;62:370–380. [PubMed]
  • Madson MB, Campbell TC, Barrett DE, Brondino MJ, Melchert TP. Development of the motivational interviewing supervision and training scale. Psychology of Addictive Behaviors. 2005;19(3):303–310. [PubMed]
  • Mallett KA, Mastroleo NR, Turrisi R, Bachrach R. Motivational interviewing in health care settings: Foundations and skills. An invited presentation delivered to medical staff at University Health Services. Penn State University; State College, PA: 2008. May,
  • Mallett KA, Turrisi R, Larimer ME, Mastroleo NR, Ray AE, Geisner IM, Grossbard J, Kilmer J. Examination of the efficacy of a combined peer delivered MI and parent intervention to reduce college student drinking. Paper symposium presented at the Research Society on Alcoholism; Chicago, IL. 2007.
  • Marlatt GA, Baer JS, Kivlahan DR, Dimeff LA, Larimer ME, Quigley LA, et al. Screening and brief intervention for high-risk college student drinkers: Results from a two-year follow-up assessment. Journal of Consulting and Clinical Psychology. 1998;66(4):604–615. [PubMed]
  • Mastroleo NR. Motivational interviewing and evidence based practices. Invited presentation delivered to therapeutic staff at Center for Counseling and Psychological Services. Penn State University; State College, PA: 2007. Dec,
  • Mastroleo NR, Mallett KA. Motivational interviewing: Foundations and skills. Workshop presented at the Pennsylvania Counseling Association Annual Conference; State College, PA. 2006. Oct,
  • Mastroleo NR, Mallett KA, Ray AE, Turrisi R. The process of delivering peer-based alcohol intervention programs in college settings. Journal of College Student Development. 2008;49:255–259. [PMC free article] [PubMed]
  • Mastroleo NR, Ray AE, Turrisi R. Analysis of peer-based alcohol intervention programs in college settings. Poster presented at the Research Society on Alcoholism Scientific Meeting; Baltimore, MD. 2006. Jun,
  • Miller WR. University of New Mexico; 2000. Motivational interviewing skill code (MISC): Coder's manual. Unpublished manual. Available at: http://www.motivationalinterviewing.org/
  • Miller WR, Rollnick S. Motivational interviewing: Preparing people to change addictive behavior. New York: Guilford Press; 1991.
  • Miller WR, Rollnick S. Motivational interviewing: Preparing people to change addictive behavior. New York: Guilford Press; 2002.
  • Moyers TB, Martin T, Manuel JK, Miller WR. The motivational interviewing treatment integrity (MITI) code: Version 2.0. 2003. Available at http://www.casaa.unm.edu/download/miti.pdf. Retrieved June 25, 2006.
  • Tappin DM, McKay C, McIntyre D, Gilmour WH, Cowan S, Crawford F, et al. A practical instrument to document the process of motivational interviewing. Behavioral and Cognitive Psychotherapy. 2000;28:17–32.
  • Tollison SJ, Lee CM, Neighbors C, Neill TA, Olson ND, Larimer ME. Questions and reflections: The use of motivational interviewing microskills in a peer led brief alcohol intervention for college students. Behavior Therapy. 2008;39:183–194. [PubMed]
  • Turrisi R, Larimer ME, Mallett KA, Kilmer J, Ray AE, Mastroleo NR, et al. It takes two: A randomized clinical trial evaluating a combined peer-delivered BASICS and parent-based intervention to reduce drinking in a high-risk sample of college students. Journal of Studies on Alcohol and Drugs, in press.