The Centers for Medicare and Medicaid Services will begin public reporting of patient survey results in 2008. The Consumer Assessment of Health Care Providers and Systems (CAHPS®) Hospital Survey contains 18 questions about hospital care. Internal consistency reliability of the discharge information scale is relatively low, and some important domains of care are not represented.
To determine whether adding questions increases the reliability and validity of the survey.
Surveys of patients at 181 hospitals participating in the California Hospitals Assessment and Reporting Taskforce (CHART), an initiative for voluntary public reporting of hospital performance in California.
CHART added nine questions to the CAHPS Hospital Survey: two to improve reliability of the discharge information domain, five to create a coordination of care domain, and two relating to interpreter services.
Surveys were sent to randomly selected patients from each CHART hospital.
A total of 40,172 surveys were included. Adding the new discharge information questions improved the internal consistency reliability from 0.45 to 0.72 and the hospital-level reliability from 0.75 to 0.81. New coordination of care composites had acceptable internal consistency reliabilities ranging from 0.58 to 0.70 and strong hospital-level reliabilities ranging from 0.84 to 0.87. The new coordination of care composites were more closely correlated with overall hospital ratings and willingness to recommend than six of the seven original domains.
The additional discharge information questions and the new coordination of care questions significantly improved the psychometric properties of the CAHPS Hospital Survey.
Patient perceptions about care received during the hospital stay have been a major focus of hospital management (Bell, Krivich, and Boyd 1997), and interest in patients' perceptions has burgeoned along with implementation of performance-reporting systems and payment mechanisms based on performance (Hibbard, Stockard, and Tusler 2005; Scanlon et al. 2005). Patient experience data provide an important insight into care received from the patient's perspective and have been used by hospitals internally for many years for quality improvement. These data have not been available universally for statewide or national cross-hospital comparisons or public reporting (PR) initiatives because of the lack of a standardized survey instrument.
As part of the Consumer Assessment of Health Care Providers and Systems (CAHPS®) project, a standardized patient experience survey was developed to assess care provided to adult hospitalized patients (Crofton et al. 2005; Darby, Hays, and Kletke 2005). The CAHPS Hospital Survey has been endorsed by the National Quality Forum and will be included in the reporting requirements and pay-for-performance programs for the Centers for Medicare and Medicaid Services and the Hospital Quality Alliance in 2008 (AHRQ 2006). The CAHPS Hospital Survey includes 16 questions assessing specific aspects of care and two hospital rating questions (plus demographic and screener questions) (Darby, Hays, and Kletke 2005). This survey is a major step in producing a standardized core set of items, and it addresses key aspects of care such as nurse communication, doctor communication, nursing services, physical environment, pain control, communication about medicines, and discharge information. However, the survey also has some identified weaknesses. The two-item discharge information composite has relatively low internal consistency reliability (α=0.45) (Keller et al. 2005). In addition, the survey does not assess some important aspects of the patient experience. For example, coordination of care is a domain identified by Gerteis, Edgman-Levitan, and Daley (1993) and highlighted by the Institute of Medicine (IOM) (Institute of Medicine 2001) in Crossing the Quality Chasm as a key aspect of patient care, but it is not included in the CAHPS Hospital Survey. In this paper, we describe a statewide project conducted in California with the goal of improving the reliability and expanding the scope of the CAHPS Hospital Survey.
In 2004, the California Hospitals Assessment and Reporting Taskforce (CHART) began meeting with the goal of establishing a voluntary PR initiative in California (California Health Care Foundation 2005). The consensus measure set that evolved included a variety of clinical measures along with a patient experience survey. Although standardization with national measures was an important goal in CHART, the CHART group wanted to expand on the CAHPS Hospital Survey to improve its reliability and address an additional Institute of Medicine (IOM) domain.
CHART hospitals agreed to include additional questions in their patient surveys based on an analysis of psychometric data from the CAHPS Hospital Survey three-state pilot study. These data showed a benefit of increased reliability by adding additional questions used in California through the Patient's Evaluation of Performance in California (PEP-C) project in the areas of discharge information and coordination of care. We defined coordination of care (based on the definition developed by the IOM) as:
To establish and support a continuous healing relationship, enabled by an integrated clinical environment and characterized by the proactive delivery of evidence-based care and follow-up. Clinical integration is further defined as the extent to which patient care services are coordinated across people, functions, activities, and sites over time so as to maximize the value of services delivered to patients.
The CHART project then asked all the major vendors of patient experience surveys to submit questions and supporting psychometric data in these areas. Based on preliminary estimates of the psychometric properties provided by the vendors (these data have not been published), the group decided to test the performance of discharge and coordination of care questions submitted by NRC+Picker Inc. Two items were added to improve the reliability of the discharge information composite ("Did someone on the hospital staff explain the purpose of the medicines you were to take at home in a way you could understand?" and "Did they tell you what danger signals about your illness or operation to watch for after you went home?"). Five questions were added to allow consideration of alternative approaches to the new coordination of care composite ("How organized was the admission process?"; "If you had to wait to go to your room, did someone from the hospital explain the reason for the delay?"; "Were your scheduled tests and procedures performed on time?"; "Staff checked ID band before giving meds/treatment/tests"; and "Sometimes in the hospital, one doctor or nurse will say one thing and another will say something quite different. Did this happen to you?"). In addition, the group decided that interpreter availability was an important aspect of care not addressed in the CAHPS Hospital Survey. Two questions (one a screener) that had been used in California through the PEP-C project were added to assess the need for and provision of interpreter services, an important issue in California ("[Screener question] An interpreter is someone who repeats or signs what one person says in a language used by another person. Did you need an interpreter to help you speak with doctors or other health providers?" and "When you needed an interpreter to help you speak with doctors or other health providers, how often did you get one?").
A total of 41,701 surveys were obtained from patients discharged from 186 hospitals between December 2005 and February 2006. After deleting 91 surveys with <50 percent of items completed, 19 surveys from respondents under 18 years of age, and 1,419 surveys from five hospitals that administered only the CAHPS Hospital Survey items without the additional CHART questions, 40,172 surveys from 181 hospitals were available for the analyses (an average of 222 per hospital). The survey was available in three languages (English, Spanish, and Chinese) and administered using a two-wave mail-only method. Ninety-five percent of respondents completed the survey in English and 5 percent in Spanish. The lag time between discharge and survey completion ranged from 10 to 210 days.
Fifty-one percent of the participating hospitals had bed sizes ranging from 100 to 299, 24 percent from 300 to 499, 19 percent <100, and 6 percent over 500. The number of patients per hospital ranged from 30 to 523. Forty-nine percent of hospitals had sample sizes of between 201 and 300 patients and 28 percent between 101 and 200 patients.
Just under half (49 percent) of the 40,172 respondents were 65 or older; 17 percent were 18–34 years old (Table 1). The majority of the sample was female (63 percent). More than 12 years of education was reported by 61 percent of the sample. Sixty-five percent of respondents were white, 19 percent Hispanic, 10 percent Asian/Pacific Islander, and 5 percent black. Sixty-one percent of respondents were admitted to the hospital through physician/clinic/health maintenance organization referral, 34 percent through the emergency room, 3 percent were transfers from another hospital, and 1 percent were transfers from skilled nursing or other facilities. Ninety percent were discharged to home with self-care or home health services, 8 percent to other intermediate or nursing facilities, 1 percent to another short-term general hospital, and 1 percent left against medical advice. Medical admissions accounted for 49 percent of the sample, surgical admissions for 33 percent, and maternity care for 17 percent (Table 1).
We estimated item descriptive statistics for the 18 CAHPS Hospital Survey report and rating items plus the nine additional questions added for the CHART project. Next, we estimated product–moment correlations of the CAHPS Hospital Survey items and additional CHART items with the seven CAHPS Hospital Survey composites (scales). Item–scale correlations were corrected for overlap when necessary. Then, we estimated internal consistency reliability for multi-item scales, hospital-level reliability for items and scales, and the correlations of these items and scales with two "bottom-line" indicators of hospital care: the global rating of the hospital and the willingness to recommend the hospital to friends and family. We also evaluated the CAHPS Hospital Survey discharge information composite, augmented by the two CHART discharge items, and the new hypothesized coordination of care composite created in three versions, using three, four, or five new questions, respectively.
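The two respondent-level statistics described above can be sketched in code. The following is a minimal illustration, not the authors' actual analysis code: Cronbach's alpha for internal consistency and item–scale correlations corrected for overlap (each item correlated with the sum of the remaining items), assuming a complete respondents-by-items score matrix with no missing data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency reliability of a multi-item scale.

    items: (n_respondents, n_items) matrix of item scores.
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the sum of the remaining items,
    i.e., the item-scale correlation corrected for overlap."""
    _, k = items.shape
    total = items.sum(axis=1)
    out = np.empty(k)
    for j in range(k):
        rest = total - items[:, j]                 # scale score without item j
        out[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return out
```

Adding items with similar inter-item correlations raises alpha mechanically (the Spearman–Brown effect), which is why the two added discharge items could lift alpha from 0.45 to 0.72.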
We regressed the global rating of the hospital and willingness to recommend items on the CAHPS Hospital Survey and CHART hospital items to identify the total and unique variance (the increase in R2 obtained by adding CHART items) accounted for by the report items, adjusting for age, gender, education, ethnicity, admission source, reason for admission, self-rated health, and discharge status. We corrected standard errors in the regression models for clustering within hospital (White 1980). Finally, we computed rank–order correlations of hospital-level scores between the CAHPS Hospital Survey discharge information and the augmented discharge information composite as well as between an overall summary score based on the CAHPS Hospital Survey items and an overall summary score based on the CAHPS® items plus the CHART items.
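The unique-variance calculation above, the increase in R² obtained by adding the CHART items to a model already containing the CAHPS items, can be illustrated as follows. This is a simplified sketch of hierarchical OLS; it omits the case-mix adjusters and the cluster-robust standard errors used in the paper.

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """OLS R^2, with an intercept column added automatically."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

def unique_variance(X_base: np.ndarray, X_added: np.ndarray, y: np.ndarray) -> float:
    """Increase in R^2 from adding a second block of predictors
    (e.g., CHART composites) to a base model (e.g., CAHPS composites)."""
    r2_base = r_squared(X_base, y)
    r2_full = r_squared(np.column_stack([X_base, X_added]), y)
    return r2_full - r2_base
```

On the study data this increment was about 2 percent, i.e., the CHART items explained little rating variance beyond the CAHPS items, even though the new composites were individually well correlated with the ratings.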
The percentage of completes for applicable items ranged from 85 percent for availability of interpreter to 99 percent for nurses' response. Missing values were primarily due to structured item nonresponse (not applicable).
Table 2 shows correlations of items with the seven CAHPS Hospital Survey composites, an augmented discharge information composite that included two of the CHART items, and coordination of care composites based on three, four, or all five CHART items (see online appendix). The item–scale correlations for hypothesized scales were generally supportive of item convergence and discrimination of the CAHPS survey items, but the physical environment items (Q8, Q9) and one nursing service item (Q4) correlated more highly with nurse communication than with their hypothesized composites.
The two CHART discharge items ([Q28]. Did someone on the hospital staff explain the purpose of the medicines you were to take at home in a way you could understand? and [Q29]. Did they tell you what danger signals about your illness or operation to watch for after you went home?) had noteworthy correlations with the CAHPS Hospital Survey discharge information composite. Another CHART item ([Q27]. Received different information from doctors and nurses?) correlated with the nurse communication (0.33) and doctor communication (0.30) composites as well as with the CHART coordination of care composites (0.29, 0.26, and 0.28). The interpreter availability item (Q31) had a high rate of nonapplicable responses (87 percent) and was not correlated with any of the scales; hence, we did not consider the item or the screener question any further.
Table 3 provides internal consistency and hospital-level reliability estimates for the CAHPS Hospital Survey composites and composites formed from the CHART items as well as individual CAHPS and CHART items. Internal consistency reliability for all CAHPS Hospital Survey composites except for physical environment and discharge information exceeded the 0.70 threshold for group comparisons (Nunnally and Bernstein 1994). The reliability of the discharge information composite was enhanced by adding the two CHART items: internal consistency reliability increased from 0.45 to 0.72 and hospital-level reliability from 0.75 to 0.81. While the CAHPS Hospital Survey physical environment composite had a low level of internal consistency reliability, it had the highest hospital-level reliability. Communication about medicine was the only CAHPS Hospital Survey composite that did not have a hospital-level reliability >0.70. The three-, four-, and five-question coordination of care composites had adequate internal consistency (0.58, 0.66, and 0.70, respectively) and strong hospital-level reliability (0.84, 0.84, and 0.87, respectively). Hospital-level reliability was very high for two of the CHART questions (Q23—Organized admission process=0.84 and Q26—Check ID band=0.84), surpassed only by two CAHPS items (Q4—Got help when pressed call button=0.85 and Q9—Room quiet at night=0.88). Two of the CHART items (Q24—Explained reason for room delay and Q27—Received different information from doctors and nurses) and three CAHPS items (Q13—Pain well controlled, Q16—Staff told what new medicine for, and Q17—Staff described side effects of new medicine) had hospital-level reliabilities lower than 0.70.
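Hospital-level reliability of this kind is commonly estimated from a one-way ANOVA of patient scores on hospital, as (MS_between − MS_within) / MS_between, which gives the reliability of hospital means at the observed sample sizes. The paper does not spell out its estimator, so the following is an illustrative sketch of that common approach, not the authors' code.

```python
import numpy as np

def hospital_level_reliability(scores, hospital_ids) -> float:
    """Reliability of hospital mean scores via one-way ANOVA:
    (MS_between - MS_within) / MS_between.

    scores: per-patient item or composite scores.
    hospital_ids: hospital identifier for each patient.
    """
    scores = np.asarray(scores, dtype=float)
    hospital_ids = np.asarray(hospital_ids)
    grand_mean = scores.mean()
    hospitals = np.unique(hospital_ids)
    ss_between = ss_within = 0.0
    for h in hospitals:
        grp = scores[hospital_ids == h]
        ss_between += len(grp) * (grp.mean() - grand_mean) ** 2
        ss_within += ((grp - grp.mean()) ** 2).sum()
    ms_between = ss_between / (len(hospitals) - 1)
    ms_within = ss_within / (len(scores) - len(hospitals))
    return (ms_between - ms_within) / ms_between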
Correlations among the multi-item composites and with the global rating of the hospital and willingness to recommend items are provided in Tables 4 and 5, respectively. The strongest correlations with the global rating of the hospital and the willingness to recommend items were found for nurse communication (r's=0.62 and 0.60, respectively) and the CHART coordination of care composites (three-item correlations=0.54 and 0.53, four-item correlations=0.58 and 0.56, and five-item correlations=0.60 and 0.59, respectively). The nursing service and pain control composites also showed high correlations with the global rating of the hospital and the willingness to recommend the hospital to friends and family. The CAHPS Hospital Survey discharge information composite did not correlate very highly with these bottom-line indicators (r's of 0.27 and 0.26), the lowest correlations of any composite. These correlations were substantially improved by the additional CHART questions (r's of 0.44 and 0.41); even so, the augmented discharge information composite still had the lowest correlations of any composite with overall hospital rating and willingness to recommend.
Addition of the CHART items increased the variance in global ratings explained by responses to the survey questions. The seven CHART items accounted for only about 2 percent of unique variance in the global rating and the willingness to recommend.
Table 6 shows regression coefficients of the composites and patients' characteristics on the global rating and the willingness to recommend, which were estimated in the regression models with CAHPS composites and the models with CAHPS and CHART composites. The CAHPS composites and patient characteristics accounted for 47 percent of variance in the global rating of the hospital, and the variance explained increased to 49 percent when the discharge information composite was replaced by the CHART-augmented discharge information composite and any of the CHART coordination of care composites were added to the model. In the regression model for the willingness to recommend the hospital, the percent of variance explained by the model increased from 42 to 44 percent by including the augmented CAHPS discharge composites and any of the CHART coordination of care composites.
Rank-order correlation at the hospital level (n=181) between the CAHPS discharge information composite and the augmented CHART discharge information composite was 0.91; correlation of the sum of the CAHPS items with the sum of CAHPS and CHART items was 0.95.
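A rank-order (Spearman) correlation of hospital-level scores, as used above, is simply the Pearson correlation of the ranks. Below is a minimal sketch; note that it ignores tied ranks, which the standard estimator averages, so it is only exact when all hospital scores are distinct.

```python
import numpy as np

def spearman(x, y) -> float:
    """Rank-order correlation: Pearson correlation of the ranks.
    Ties are not averaged in this minimal sketch."""
    rx = np.argsort(np.argsort(x))   # rank of each hospital on measure x
    ry = np.argsort(np.argsort(y))   # rank of each hospital on measure y
    return np.corrcoef(rx, ry)[0, 1]
```

Rank-order correlations of 0.91 and 0.95 imply that the augmented composites would reorder very few hospitals relative to the original CAHPS scores.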
The CAHPS Hospital Survey is an assessment tool soon to be in use across the country that was designed to be supplemented with additional questions to reflect user-specific needs. This study was undertaken to assess the psychometric properties of the survey and to evaluate benefits of the additional survey items used in the CHART project.
The study findings are generally consistent with previous evaluations of the psychometric properties of the instrument (Keller et al. 2005; Sofaer et al. 2005; Arah et al. 2006). However, the original two-item CAHPS discharge information and physical environment composites had relatively low internal consistency reliabilities. The augmentation of the discharge information composite by two CHART items improved the psychometric properties of the composite as well as the associations with the global rating and the willingness to recommend the hospital. The results of this study also indicated a need for improvement in the two-item CAHPS communication about medicine composite: the hospital-level reliability of both items and of the composite was relatively low.
Consistent with findings in other research, the nurse communication composite within the CAHPS Hospital Survey had the strongest association with the ratings and willingness to recommend (Jenkinson et al. 2002; Sofaer et al. 2005; Arah et al. 2006). For example, the regression coefficient for explaining variation in the willingness to recommend the hospital was 0.013 for nurse communication, compared with 0.005 for the physical environment—the composite with the second strongest association. Hence, nurse communication is the major driver of bottom-line perceptions of hospital care. When the CHART coordination of care composites were considered along with the CAHPS Hospital Survey, all three had stronger associations with the hospital ratings and willingness to recommend than the physical environment composite, making them the composites with the second strongest associations with overall hospital rating and willingness to recommend.
The strong performance of the CHART coordination of care composites supports reconsidering the inclusion of a coordination of care domain in the CAHPS Hospital Survey. Although there has been significant discussion regarding the inclusion of a coordination of care domain within the CAHPS Hospital Survey, the discussion has been based on a broad concept of coordination of care, much of which may not be visible or detectable by the patient, and about which the respondent therefore would not be expected to be a knowledgeable informant. It has also been suggested that coordination of care may only be recognized in its absence (Levine, Fowler, and Brown 2005). Questions that were considered and deleted during the original CAHPS Hospital Survey development asked, for example, "did staff members who cared for you know about your condition without having to ask you?"
The questions used in the CHART coordination of care domain are more specifically focused on elements of coordination that may be directly experienced and understood by the patient (e.g., “scheduled tests were performed on time”). In this study, we chose to examine three different composites in coordination of care: a three-item composite that included three questions that had been clearly mapped to coordination of care in pilot studies, a four-item composite that included the “delay going to room explained” question that mapped well in this analysis, and a five-item composite that included an additional question—“ID band checked”—that also mapped to this domain. Although all of these composites performed well in this analysis, the five-item composite is slightly superior psychometrically.
There is always tension surrounding the inclusion of additional items within any patient survey—the benefit of increased reliability of the survey as questions are added must be balanced against the additional response burden on patients. Thus, the decision to add questions to improve the discharge information domain or add a coordination of care domain must be made in the context of overall patient survey strategies. For instance, some hospitals might already be asking questions beyond the CAHPS Hospital Survey that address areas of their own choosing, and using an expanded CAHPS Hospital Survey may reduce their capacity to address these local issues. However, it will increase the amount of standardized information available if many hospitals agree to use a specific expanded version of the CAHPS Hospital Survey, as has occurred in CHART.
There is also an issue of using proprietary versus public domain questions in a patient survey. The CAHPS Hospital Survey questions are now in the public domain, but the process for moving questions from proprietary to public domain status has not yet been worked out. Furthermore, we cannot state whether the questions we used in this study are superior to similar questions from other sources. For instance, the three-item Care Transitions Measures developed by Coleman, Mahoney, and Parry (2005) might function as well as the four-item discharge information domain presented here. Further investigation of alternatives to improving the discharge information domain (and the physical environment domain) of the CAHPS Hospital Survey is needed, as is research on how best to measure coordination of care. However, our results suggest that these improvements in patient surveys are feasible and can generate important information.
In summary, the findings of this study provide further support for the reliability and validity of the CAHPS Hospital Survey. Because of its brevity, it is feasible to add items to the CAHPS Hospital Survey to provide a more psychometrically sound assessment of the current domains and to assess domains not represented. The findings of this study illustrate how such additional items can be assessed and that additional items can improve our measurement of patients' hospital experiences.
Joint Acknowledgment/Disclosure Statement: We would like to acknowledge the support of the California Health Care Foundation, which funded the CHART project within which the original data were generated. Dr. Dudley's work on this project was partially funded by a Robert Wood Johnson Foundation Investigator Award in Health Policy. Data analyses were supported by cooperative agreements from the Agency for Healthcare Research and Quality with Dr. Hays. Dr. Hays was supported by grants from the National Institute on Aging and the UCLA Center for Health Improvement in Minority Elderly/Resource Centers for Minority Aging Research. We would also like to acknowledge the work of NRC+Picker in providing proprietary questions and data support to develop the CHART survey. We also would like to thank the California Hospital Assessment and Reporting Taskforce and all participating California CHART hospitals that submitted data through their vendors. In addition, we would like to thank the staff at the National CAHPS Benchmarking Database (NCBD), who developed a portal for data submission and carried out initial cleaning and analysis of the CAHPS data from hospitals. Beth Thew and Edie Wade at UCSF also contributed their editing skills, reading and rereading many versions. And, most importantly, we acknowledge the thousands of patients who took the time to complete the surveys and share their experiences with the public. Many thanks to all.
The following supplementary material for this article is available online:
CAHPS® Hospital Survey Items.
This material is available as part of the online article from: http://www.blackwell-synergy.com/doi/abs/10.1111/j.1475-6773.2008.00867.x (this link will take you to the article abstract).