2.1 Setting and Population
The UCSF Breast Care Center (BCC) is a multidisciplinary clinic in a university medical center. In 2005, the BCC saw 599 breast cancer patients new to the clinic who consulted specialists about treatment decisions over the course of 843 visits; 44% of those visits were to 5 surgeons and 56% to 9 oncologists. The average patient age in 2005 was 57 years. The distribution of diagnosis by stage was 94 new patients with ductal or lobular carcinoma in situ (16%); 414 patients with stage 1–3 breast cancer (69%); 19 patients with metastatic disease (3%); and 72 patients with unknown/unrecorded stage (12%). The distribution of first treatment course was 32% lumpectomy, 21% mastectomy, and 20% chemotherapy. The ethnic/racial distribution in 2005 was 64.5% White, 18.5% Asian, 7.5% Hispanic, 6.5% African American, and 3.0% Unknown/Other. Most patients had private (67%) or public (29.5%) insurance, with 3.5% uninsured.
2.2 Study design, sample, and participants
The study consisted of prospectively planned field observations in which we qualitatively and quantitatively monitored the process and impact of the CPRS service in the UCSF BCC through record reviews, surveys, and interviews. Patients were eligible for CPRS if they had a confirmed diagnosis of breast cancer, and were scheduled for an appointment at the UCSF BCC regarding a treatment decision between March 1, 2005 and December 31, 2006, during one of the two daily times when a CPRS staff member was available. We approached patients to respond to efficacy surveys if they spoke and wrote English, and if our staff could approach them to obtain informed consent and administer surveys without interrupting other visit activities such as rooming, vital signs, patient history questionnaires, history-taking by residents, or discussions with nurses and attending physicians.
We also studied physician, scheduler, and CPRS staff reactions to the implementation. We included, over the study period, the five surgeons and nine oncologists who saw breast cancer patients in our clinic, the schedulers who acted as new patient schedulers for those physicians, and the 14 CPRS staff members who performed CPRS for their patients. The CPRS staff consisted of BCC employees participating in a premedical internship program after graduating from college and before attending medical school. BCC faculty hire these interns as research study coordinators and assign them to the CPRS team for one day per week as a means of enriching their internship experience with additional patient contact. These staffing policies meant that one day per week per CPRS staff member translated into a capacity of two CPRS service slots per day, one in the morning and one in the afternoon.
We obtained ethics approval from the UCSF Committee on Human Research and the Human Subjects Research Review Board of the Department of Defense, and obtained informed consent from patients for study surveys.
2.3 Intervention and Procedures
Members of our study team had developed [4], evaluated [9], and begun to implement [11] CPRS between 1994 and 2004. In 2005, we obtained a grant to better integrate CPRS, along with other decision support programs, into the routine clinical care workflow at the UCSF BCC.
After a needs assessment and program redesign phase featuring input from clinic physicians, nurses, and schedulers, as well as our institution’s risk management department, the Director of the BCC decided that new patient schedulers would henceforth offer CPRS (along with other supportive services) to breast cancer patients at the time they were scheduling their appointments with physicians. We therefore amended the clinic scheduler job descriptions, and established monthly meetings between the schedulers and the Decision Services team to monitor the scheduling of CPRS.
Schedulers used a script we provided to describe CPRS to patients as a supportive service in which a CPRS staff member would meet the patient at the clinic 60–90 minutes in advance of their medical appointment, help the patient brainstorm and write down a list of questions for the doctor, and then accompany the patient, audio-recording the medical appointment and taking notes.
Physicians at the BCC were already accustomed to CPRS due to the prior period of development, piloting, and ad-hoc (i.e. non-integrated) implementation [11]. During the study period there was a maximum of 8 and a total of 14 CPRS staff, since the study period spanned two fiscal years during which some interns went on to medical school and new interns arrived. We offered the CPRS training at the start of each fiscal year in a 3-day workshop led by the Director of Decision Services (JB) and featuring assistance from second-year interns (including JC, DC, ML). The CPRS trainees learned, through lectures and role-plays, to administer a Consultation Planning prompt sheet, and to paraphrase and summarize patient questions into Consultation Plans and physician responses into Consultation Summaries (see the example figures). Trainees learned to word-process these documents on portable computers, print copies for patients, family members, and attending physicians, and store copies on a secure clinic file server. Trainees also learned to audio-record the physician visits using digital audio recorders, burn compact discs for patients after their visit, and store electronic copies on the secure clinic file server as required by the UCSF Risk Management department. We enjoined the trainees from providing medical advice or information as part of the CPRS encounter, and monitored their compliance with these and other CPRS procedures and policies, documented in a 247-page reference guide, during weekly staff meetings [14].
Consultation Planning Prompt Sheet
Example Consultation Plan [names and identifying details redacted]
Example Consultation Summary [names and identifying details redacted]
2.4 Outcomes, measures, and instruments
For study question 1, as a measure of the degree to which we were able to deliver CPRS, we counted how many patients received the service and compared this count to our capacity.
For study question 2, to measure changes in decision self-efficacy and decisional conflict, we administered the Decision Self Efficacy (DSE) and Decisional Conflict (DCS) scales to patients before, during, and after their visits (see the flow chart figure). The DSE and DCS are Likert scales with acceptable levels of validity and reliability documented in similar studies and settings [15]. The DSE measures patients' confidence in their ability to participate in decision making [17]. The DCS measures the level of decisional conflict perceived by the patient [18].
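Instruments like the DSE and DCS are typically scored by averaging Likert item responses and rescaling to a 0–100 range. The sketch below illustrates that common convention only; the item count and responses are invented, and actual scoring should follow the scales' published manuals.

```python
def likert_to_score(responses):
    """Convert 0-4 Likert item responses into a 0-100 scale score.

    Illustrative convention only (mean of items times 25); consult the
    published DSE/DCS scoring manuals for the authoritative procedure.
    """
    if not responses:
        raise ValueError("no responses")
    return sum(responses) / len(responses) * 25

# Hypothetical 16-item response set, each item scored 0-4.
example = [2, 3, 1, 2, 2, 3, 2, 1, 2, 2, 3, 2, 1, 2, 2, 2]
score = likert_to_score(example)  # higher = more decisional conflict
```

With this convention, all-zero responses map to 0 and all-maximum responses to 100, making pre/post changes directly comparable across patients.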
Flow chart showing sequence of interventions, including surveys for a convenience sample of patients. CPRS denotes Consultation Planning, Recording, and Summarizing.
For study question 3, we were interested, over the study period, in which program design elements physicians, schedulers, and CPRS staff felt were acceptable and should be continued; which were unacceptable and should be discontinued; and which required improvement and should be modified, as a means of improving the sustainability of CPRS. The instrument for measuring this outcome consisted of semi-structured interview guides based on the Critical Incident Technique [19], including probes regarding the program infrastructure, policies, processes, techniques, and tools.
2.5 Data Collection, Management, and Analysis
Study Question 1
Schedulers used the clinic scheduling system to schedule CPRS sessions with patients. We compared the total number of completed CPRS sessions to the number of available CPRS slots based on a capacity of two per day, one in the morning and one in the afternoon.
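The capacity comparison can be sketched as follows. This is a minimal illustration, assuming two slots per weekday over the March 1, 2005 to December 31, 2006 window; the weekday-only assumption and the calendar arithmetic are ours, not taken from the study's actual counting procedure.

```python
from datetime import date, timedelta

def weekday_count(start: date, end: date) -> int:
    """Count Monday-Friday days in the inclusive range [start, end]."""
    days = 0
    d = start
    while d <= end:
        if d.weekday() < 5:  # 0=Mon .. 4=Fri
            days += 1
        d += timedelta(days=1)
    return days

# Assumption (ours): capacity is two CPRS slots per clinic weekday,
# one morning and one afternoon, across the study period.
SLOTS_PER_DAY = 2
capacity = SLOTS_PER_DAY * weekday_count(date(2005, 3, 1), date(2006, 12, 31))

def utilization(completed_sessions: int) -> float:
    """Fraction of available CPRS slots that resulted in completed sessions."""
    return completed_sessions / capacity
```

In practice the denominator would exclude clinic holidays and days without CPRS staff coverage, which this sketch does not model.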
Study Question 2
Physicians requested that we avoid disrupting the flow of patients through the clinic, so we administered our study surveys only when it was convenient, i.e. when patients were not needed by clinic staff. To test the null hypothesis of no change in DCS or DSE score against the alternative of a significant change, we used a two-sided matched-pair t-test at the 0.05 significance level.
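The matched-pair test above can be sketched with a stdlib-only computation of the paired t statistic. The pre/post scores below are invented for illustration and are not the study's data; the critical value is the standard two-sided 0.05 cutoff for 7 degrees of freedom.

```python
import math
import statistics

# Invented pre- and post-visit DCS scores for 8 hypothetical patients
# (illustration only, not study data).
pre  = [55.0, 62.5, 48.0, 70.0, 58.5, 65.0, 52.0, 60.0]
post = [40.0, 50.0, 45.5, 55.0, 44.0, 52.5, 46.0, 49.0]

diffs = [b - a for a, b in zip(pre, post)]  # post minus pre
n = len(diffs)
mean_d = statistics.mean(diffs)
sd_d = statistics.stdev(diffs)              # sample SD of the differences
t_stat = mean_d / (sd_d / math.sqrt(n))     # paired t statistic

# Two-sided critical t value at alpha = 0.05 with n - 1 = 7 df.
T_CRIT = 2.365
reject_null = abs(t_stat) > T_CRIT
```

A negative t statistic here corresponds to a drop in decisional conflict after the visit; the null hypothesis of no change is rejected when |t| exceeds the critical value.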
Study Question 3
To collect interview data, during regular clinic meetings we prompted physicians, schedulers, and CPRS staff to provide specific best and worst cases of how CPRS had performed with respect to the program goals, and to identify which aspects of the service to continue, discontinue, or change. We stated the program goals as helping patients list their questions and retain records of the answers provided by their physicians. We obtained input from physicians at their weekly multidisciplinary review meeting, and monthly and weekly input from the schedulers and CPRS staff, respectively, during their regular meetings with the Director of Decision Services (JB). The study team reviewed notes from these meetings and discussed program changes during weekly Decision Services management meetings. The Director of Decision Services made the final decisions on program design and documented them in the Decision Services Reference Guide, a 247-page manuscript that summarizes the policies and procedures for the program [14].