J Gen Intern Med. 2009 February; 24(2): 162–169.
Published online 2008 December 3. doi: 10.1007/s11606-008-0856-x
PMCID: PMC2629002

Readiness for the Patient-Centered Medical Home: Structural Capabilities of Massachusetts Primary Care Practices

Mark W. Friedberg, M.D., M.P.P.,1,2 Dana G. Safran, Sc.D.,3 Kathryn L. Coltin, M.P.H.,4 Marguerite Dresser, M.S.,5 and Eric C. Schneider, M.D., M.Sc. (corresponding author)1,2



Background

The Patient-Centered Medical Home (PCMH), a popular model for primary care reorganization, includes several structural capabilities intended to enhance quality of care. The extent to which different types of primary care practices have adopted these capabilities has not been previously studied.


Objective

To measure the prevalence of recommended structural capabilities among primary care practices and to determine whether prevalence varies among practices of different size (number of physicians) and administrative affiliation with networks of practices.


Design

Cross-sectional analysis.


Participants

One physician chosen at random from each of 412 primary care practices in Massachusetts was surveyed about practice capabilities during 2007. Practice size and network affiliation were obtained from an existing database.


Measurements

Presence of 13 structural capabilities representing 4 domains relevant to quality: patient assistance and reminders, culture of quality, enhanced access, and electronic health records (EHRs).

Main Results

Three hundred eight (75%) physicians responded, representing practices with a median size of 4 physicians (range 2–74). Among these practices, 64% were affiliated with 1 of 9 networks. The prevalence of surveyed capabilities ranged from 24% to 88%. Larger practice size was associated with higher prevalence for 9 of the 13 capabilities spanning all 4 domains (P < 0.05). Network affiliation was associated with higher prevalence of 5 capabilities (P < 0.05) in 3 domains. Associations were not substantively altered by statistical adjustment for other practice characteristics.


Conclusions

Larger and network-affiliated primary care practices are more likely than smaller, non-affiliated practices to have adopted several recommended capabilities. In order to achieve PCMH designation, smaller non-affiliated practices may require the greatest investments.

Electronic supplementary material

The online version of this article (doi:10.1007/s11606-008-0856-x) contains supplementary material, which is available to authorized users.

KEY WORDS: primary care, quality improvement, health policy, patient-centered care


Introduction

Over the past decade, many commentators have called for innovations to improve the quality of outpatient care.1–6 Primary care practices have been encouraged to implement several structural capabilities felt likely to enhance quality. These include aiding patient self-management,7 using reminder systems for evidence-based care,1 engaging in performance measurement and feedback,8,9 providing enhanced access to care,10,11 and implementing electronic health records (EHRs).1,12,13 This has occurred at a time when primary care is facing a number of challenges,14 and for practices that do not already possess these capabilities, their implementation may require substantial financial investment.

Recently, leading primary care professional societies have jointly issued a set of principles for a “Patient-Centered Medical Home” (PCMH).15,16 These principles include greater coordination of care, continuous improvement of quality and safety, and enhanced access.15 The National Committee for Quality Assurance (NCQA) has created standards for the PCMH that identify several structural capabilities linked to the principles, such as use of evidence-based guidelines and information technology.17 These and other proposals would include incrementally higher payments to practices that adopt capabilities consistent with the PCMH.15,16,18–21

Despite this interest in restructuring care delivery, the readiness of primary care practices to implement these reforms may be uneven. Primary care practices can vary in their number of physicians (“practice size”) and degree of affiliation with networks of other practices (“network affiliation”),22,23 characteristics that may influence practices’ readiness to adopt the structural capabilities of the PCMH. Larger practices and network-affiliated practices might already possess these capabilities due to economies of scale and highly developed management.24 Smaller practices may be less likely to engage in quality improvement activities.25

To assess the current prevalence of recommended structural capabilities among primary care practices and evaluate their relationship to practice size and network affiliation, we conducted a statewide survey of over 400 primary care practice sites in Massachusetts. Based on previous research, we hypothesized that recommended capabilities would be more prevalent among larger practices than among smaller practices, and that network affiliation might enable adoption of these capabilities by smaller practices.26



Methods

Massachusetts Health Quality Partners (MHQP), a statewide quality improvement collaborative, maintains a comprehensive statewide physician directory that includes physicians’ organizational affiliations.26,27 This directory identifies the physicians at each practice and, if applicable, each practice’s affiliation with 1 of the 9 large physician networks present in the state. Because the MHQP directory is used to guide a statewide performance reporting system, physician group and network managers review these lists regularly, updating them as organizational relationships change. Each practice can affiliate with 1 network or remain non-affiliated.

For the purposes of our study, a practice was defined as a set of at least 2 physicians as well as nurses and other staff practicing at a single address. Practice size was represented by the number of physicians at each site who contributed Health Plan Employer Data and Information Set (HEDIS) measure observations on adult processes of care to the 2005 MHQP clinical performance database (an aggregation of HEDIS data on managed care enrollees in 5 participating health plans). The MHQP database did not distinguish physicians by specialty training. Each physician contributing HEDIS measure observations to the database was designated as a primary care physician by at least 1 of the 5 health plans, regardless of specialty training.

Instrument Development

Our goal was to query practicing physicians about the presence of quality enhancing structural capabilities in their practices. We selected practicing physicians as survey respondents because they may be in the best position to report on the use (in addition to the presence) of structural capabilities. Prior research also suggests that this approach can be used to measure important aspects of physician practices.28

Based on published literature and previously developed surveys on physician group characteristics,25,29–34 we defined 13 key capabilities that could be assessed by a survey of practicing physicians and classified them into 4 domains: patient assistance and reminders (assistance of patient self-management, system for contacting patients for preventive services, clinical reminder systems); culture of quality (feedback to physicians on quality and patient experience, new initiatives on quality and patient experience, frequent meetings on quality performance, presence of a leader for quality improvement); enhanced access (language interpreters, providers’ spoken languages, regular appointment hours on weekends); and electronic health records (frequently-used, multi-functional EHR) [Box]. Since the sophistication of EHRs may vary, the instrument assessed 6 specific EHR functionalities in 4 of the core areas identified by the Institute of Medicine: results management (lab and radiology results), communication and connectivity (notes from consultants), health information and data (problem and medication lists), and decision support (electronic reminders).32 We also asked respondents to report on the presence of trainees (medical students, residents, or fellows) and specialist physicians (other than primary care) at their practices.


To improve its validity, the survey instrument underwent iterative revision based on cognitive testing with a small number of local physician volunteers. We found that knowledge about some of the questions (e.g., detailed practice staff composition) was limited unless the physician was also the practice manager. These questions were revised to be more appropriate to the sampled physicians or eliminated from the survey. Survey items contributing data to the current analysis are available in the Appendix (available online).

Survey Sample

We selected adult primary care practices from the 2004–2005 MHQP physician directory, which included 729 practices of ≥2 physicians contributing ≥1 HEDIS measure observation in the MHQP clinical performance database. We excluded 220 practices with too few HEDIS measure observations to be certain that they were predominantly primary care practices [Fig. 1]. We excluded 19 practices that had data on only the Chlamydia screening HEDIS measure because these practices delivered predominantly pediatric care (as confirmed by direct contact).

Figure 1
Derivation of study sample.

To ensure that the sampled respondent would report features pertaining only to the practice of interest, we excluded multiple-practice physicians (6.6% of those remaining) who delivered care at 2 or more practices. This resulted in exclusion of 13 practices containing only multiple-practice physicians. To identify part-time physicians, we first calculated the total number of HEDIS observations (across all measures) contributed by each physician in 2004 and 2005. Then, for a convenience subset of physicians known to be part-time or full-time practitioners, we identified the threshold number of HEDIS observations that adequately discriminated between part-time and full-time practitioners. We excluded 62 practices containing only physicians identified as part-time practitioners using this threshold.
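The threshold-selection step described above can be sketched as a search over candidate cutoffs, keeping the one that best separates the labeled part-time and full-time physicians. This is a hypothetical illustration; the paper does not specify the exact discrimination criterion used, so a simple accuracy-maximizing rule is assumed here.

```python
def best_threshold(labeled):
    """Pick the HEDIS observation-count cutoff that best discriminates
    part-time from full-time physicians in a labeled subset.

    labeled: list of (hedis_observation_count, is_full_time) pairs.
    Returns the cutoff t such that classifying count >= t as full-time
    maximizes accuracy on the labeled subset (assumed criterion).
    """
    counts = sorted({c for c, _ in labeled})
    best_t, best_acc = None, -1.0
    for t in counts:
        # Accuracy of the rule "count >= t means full-time"
        correct = sum((c >= t) == full for c, full in labeled)
        acc = correct / len(labeled)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Hypothetical labeled subset: (HEDIS observations, known full-time?)
sample = [(5, False), (12, False), (40, True), (65, True), (8, False), (55, True)]
print(best_threshold(sample))  # → 40
```

Physicians at or above the chosen cutoff would then be treated as full-time when applying the exclusion to the unlabeled practices.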

To obtain the survey sample, we sampled one physician at random from each of the remaining 415 practices and confirmed each sampled physician’s practice address against data from the Massachusetts Board of Registration in Medicine. Discrepancies between the MHQP physician directory and Board of Registration addresses were resolved by telephoning physicians’ offices. Thirteen sampled physicians were no longer at the practice of interest. Ten of these were replaced by an alternate physician selected at random from rosters of the same practice. There was no eligible alternate physician at 3 practices, and these were excluded from the study. The final study sample consisted of 412 practices, each containing 1 randomly sampled physician whose ongoing clinical presence was confirmed via Board of Registration address match or telephone call.

Survey Administration

We conducted the survey during May–October of 2007. The first wave was followed by up to four waves of reminders, including letters and telephone calls, to prompt non-respondents to return the survey.


Statistical Analysis

In general, survey items identified the presence or absence of a quality enhancing capability. Item non-response rates were <5% for each survey question included in the analysis. For the purposes of the current analysis, we considered “no” and “don’t know” responses to be equivalent. For each item that featured more than 2 possible responses (e.g., 5-point Likert scales), we created a dichotomous variable by establishing a threshold near the midpoint of the item response distribution. For EHRs we created a composite of the 6 functionalities assessed by the survey, classifying a practice’s EHR as a “frequently-used, multi-functional EHR” only if it contained all 6 functionalities and was reported as being used “usually” or “always” during a typical day in clinic. We designated practices affiliating with any of the 9 networks as network-affiliated and all practices not affiliating with a network as non-affiliated. For ease of presentation, we grouped practice sizes into 4 categories that approximated quartiles: 2, 3–4, 5–8, and ≥9 physicians.
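The composite EHR classification above can be sketched as follows. The function and field names are illustrative, not taken from the survey instrument; only the logic (all 6 functionalities present, and usage “usually” or “always”) comes from the text.

```python
# The 6 surveyed EHR functionalities (illustrative identifiers)
EHR_FUNCTIONS = [
    "lab_results", "radiology_results", "consultant_notes",
    "problem_list", "medication_list", "electronic_reminders",
]

def multifunctional_ehr(response):
    """Classify a practice as having a 'frequently-used, multi-functional
    EHR': all 6 functionalities present AND usage reported as 'usually'
    or 'always' during a typical day in clinic.

    response: dict with one bool per functionality plus a 'usage' string.
    """
    has_all = all(response.get(f, False) for f in EHR_FUNCTIONS)
    frequent = response.get("usage") in ("usually", "always")
    return has_all and frequent

r = {f: True for f in EHR_FUNCTIONS}
r["usage"] = "usually"
print(multifunctional_ehr(r))  # → True

r["electronic_reminders"] = False  # one functionality missing
print(multifunctional_ehr(r))  # → False
```

Note that under this rule a practice with a heavily used but 5-function EHR, or a 6-function EHR used only “sometimes,” is classified the same as a practice with no EHR.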

To assess the impact of survey non-response, we compared the size and network affiliation of practices with and without a response. We assessed the bivariate relationships between surveyed capabilities and our major predictors—size category and network affiliation status—using linear probability regression models predicting presence of each capability as a function of each major predictor. We then constructed multivariable regression models predicting the presence of each capability as a function of practice size category and network affiliation as well as the following potential confounders: presence of trainees, and presence of multiple specialties. Analyses based on alternative groupings of practice size (quintiles, deciles, and continuous physician count) produced similar results in all models. We also introduced interaction terms between network affiliation and size category, but these terms were not statistically significant and were therefore not retained. For capabilities with significant relationships to network affiliation, we introduced fixed effects representing each of the 9 networks and tested for differences between the networks.
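In a linear probability model with a single binary predictor, the fitted slope is simply the difference in capability prevalence between the two groups. The sketch below illustrates only this bivariate step with made-up data; it omits the multivariable adjustment and the clustering-robust variance estimation the paper describes.

```python
def lpm_slope(outcomes, exposed):
    """Bivariate linear probability model: regress a 0/1 capability
    indicator on a 0/1 predictor (e.g., network affiliation).
    The OLS slope equals prevalence(exposed) - prevalence(unexposed).
    """
    y1 = [y for y, x in zip(outcomes, exposed) if x]
    y0 = [y for y, x in zip(outcomes, exposed) if not x]
    return sum(y1) / len(y1) - sum(y0) / len(y0)

# Hypothetical data: capability present (1/0), network-affiliated (1/0)
y = [1, 1, 0, 1, 0, 0, 1, 0]
x = [1, 1, 1, 1, 0, 0, 0, 0]
print(lpm_slope(y, x))  # 0.75 - 0.25 → 0.5
```

This equivalence is why the linear probability form makes adjusted prevalences easy to read off: coefficients are absolute differences in percentage points rather than odds ratios.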

All statistical analyses were performed with SAS software, version 9.1.3 (SAS Institute, Inc., Cary, North Carolina) using generalized estimating equations with exchangeable working correlation structures and empirical standard errors to adjust estimated variances for clustering of observations within networks.35,36 P values <0.05 were considered statistically significant for all comparisons. Linear probability models were chosen for ease of generating adjusted capability prevalences (by holding covariates at their mean values), with observed probabilities being within a reasonable range for this type of modeling. To confirm the stability of statistical inferences, we re-ran all analyses using logit-linked generalized models with substantively similar results.

This study was approved by the Human Subjects Committee of the Harvard School of Public Health.

Role of the Funding Source

The study was supported by the Commonwealth Fund. No funding source had a role in the study design, analyses, or decision to submit the manuscript for publication.


Results

We received 308 completed surveys (response rate 75%). There were no significant differences in the distribution of size or rates of network affiliation between responding and non-responding practices. Among respondents, practices varied in size from 2 to 74 physicians, and 64% of practices were network-affiliated [Table 1]. Overall, 47% of practices were teaching practices and 45% were multi-specialty (i.e., specialists other than primary care physicians were present). Larger practices were significantly more likely than smaller practices to be network-affiliated (76% in the largest quartile vs. 47% in the smallest, P = 0.03), teaching (54% vs. 35%, P < 0.001), and multi-specialty (67% vs. 20%, P < 0.001).

Table 1
Characteristics of Surveyed Practices

The prevalence of individual structural capabilities among practices ranged from 24% (practices regularly open to provide care on weekends) to 88% (respondent aware of results on measures of clinical quality) [Table 2]. In addition to awareness of clinical quality results, the most common capabilities included reminders for guideline-based preventive care (87%) and initiatives to improve results on measures of clinical quality (73%). In addition to providing care on weekends, the least common individual quality enhancing capabilities included on-site interpreter services (32%) and frequent meetings to discuss quality (43%). Thirty-three percent of practice respondents reported frequent use of a multi-functional EHR. The prevalence of specific EHR functionalities varied widely: 88% of respondents reported availability of radiology and laboratory results via computer, but only 44% reported receiving electronic reminders to provide guideline-based preventive care for eligible patients.

Table 2
Unadjusted Associations Between the Prevalence of Quality Enhancing Capabilities, Practice Size, and Network Affiliation Status

Compared to smaller practices, respondents in larger practices were more likely to report presence of 9 capabilities spanning all 4 domains (P < 0.05 for each capability). Network-affiliated practices were more likely than non-affiliated practices to have 5 capabilities in 3 domains (P < 0.05 for each capability). The only exception to this trend was that network-affiliated practices were less likely than non-affiliated practices to regularly provide care on weekends (22% vs. 29%, P = 0.04).

After adjustment for potential confounders, the direction of each significant bivariate relationship between practice size, network affiliation, and structural capabilities was unchanged [Table 3]. However, statistical significance was lost for the relationships between size, network affiliation, and presence of a clinician speaking a language other than English. There were no statistically significant interactions between practice size and network affiliation for any of the capabilities. Neither the presence of trainees nor the presence of physicians of multiple specialties was consistently associated with capabilities [data not shown].

Table 3
Adjusted Associations Between the Prevalence of Quality Enhancing Capabilities, Practice Size, and Network Affiliation Status

There was significant heterogeneity among the 9 networks in the adjusted prevalence of 3 of the 5 capabilities shown to be associated with network affiliation (awareness of patient experience ratings, initiatives to improve patient experience ratings, and frequently-used, multi-functional EHRs) [Fig. 2]. For awareness and initiatives to improve patient experience ratings, the adjusted prevalence was higher in ≥7 of the 9 networks than among non-affiliated practices. However, only 5 networks had higher adjusted prevalence of multi-functional EHRs (range 0–73%) than the average prevalence among non-affiliated practices (18%).

Figure 2
Adjusted network-by-network prevalences of capabilities associated with network affiliation. For all 3 panels, P<0.001 for variation among networks. Point estimates and 95% CIs (represented by error bars) are calculated using multivariable logistic ...


Discussion

Recent efforts to improve the quality of primary care have encouraged a new practice model: the Patient-Centered Medical Home (PCMH).1,15 Formal standards defining the structural capabilities of the PCMH are still evolving, but they include capabilities in the 4 investigated domains: patient assistance and reminders, culture of quality, enhanced access, and electronic health records (EHRs).16,17,21 Where these capabilities are not already in place, their adoption may require substantial investment.

In a statewide cohort of primary care practices ranging in size from 2 to 74 physicians, we found wide variation in the presence of 13 pre-defined quality enhancing capabilities that we studied. Larger practices were significantly more likely than smaller practices to have 9 of 13 possible capabilities. Network-affiliated practices were more likely than non-affiliated practices to have 5 of the 13 capabilities. Larger practice size and network affiliation were both associated with higher prevalence of capabilities in 3 domains of improvement: feedback and improvement infrastructure, linguistic capabilities, and EHRs. However, the provision of weekend care was significantly lower among network-affiliated practices compared to non-affiliated practices, possibly reflecting the consolidation of these services to a limited number of locations. None of the findings were substantially altered by adjustment for other practice characteristics.

Primary care practices vary in many ways. Practice size and network affiliation, among other organizational characteristics, may affect practices’ ability to make PCMH investments. Prior studies of groups containing at least 20 physicians have found higher prevalence of structural capabilities among larger physician organizations and integrated groups than among smaller organizations and independent practice associations.29,37,38 However, smaller practices like those that we studied constitute the dominant organizational model for primary care in the United States.39 Our findings are consistent with other studies showing that even among these smaller practices, larger size is associated with higher rates of EHR adoption and physician participation in quality improvement.25,28,40,41 The lack of a consistent relationship between teaching status and structural capabilities may reflect competing organizational priorities within teaching practices.

The association between network affiliation and EHR adoption was more complex than for other capabilities. Overall, network affiliation was associated with frequently-used, highly-functional EHRs. However, their prevalence varied significantly between the 9 networks, and 4 networks had lower adjusted prevalence than the average among non-affiliated practices. This finding suggests that although some networks have fostered investment in EHRs, other factors (e.g., grants from health plans and quality improvement organizations) may also influence EHR adoption.

The study has limitations. First, the capabilities we studied do not encompass all of the potential attributes of the PCMH. Recent and evolving standards include attributes like having a personal physician-directed medical practice, providing access to care via telephone or email, and identifying health conditions important to the practice.15–17,21 Our intent was to evaluate, in a way suitable for research, capabilities common to PCMH proposals that were likely to be especially pertinent to measured quality, to require investment, and to be familiar to practicing physicians. Second, we chose to survey practicing physicians rather than managers. This constrained our ability to gather some types of practice information (e.g., detailed staff composition). Third, a cross-sectional study design limits inferences about causation. We cannot distinguish whether larger practice size makes investment in quality enhancing capabilities more likely, or whether such capabilities lead to growth. Similarly, we cannot tell whether some networks drive investment in EHRs or selectively affiliate with practices that already have EHRs. Finally, the study took place in Massachusetts, a relatively small state with a high concentration of major medical teaching centers. Some of these centers have led the adoption of EHRs, and their graduates may practice locally. This may limit the generalizability of some study findings.

Proposals to create PCMH standards and link payments to practices’ quality enhancing capabilities signal a renewed focus on structural measures of primary care quality—though whether possession of these structural capabilities for quality will translate to better processes or outcomes remains uncertain.2,42 Our analysis suggests that on many structural measures, larger practices have an advantage over smaller practices, with network affiliation conferring a narrower advantage. Small, non-affiliated practices may therefore require the largest investments in order to achieve PCMH designation. If strong financial or regulatory incentives are tied to PCMH designation, such practices may be encouraged to grow, merge, or affiliate with networks of other sites. Alternatively, if structural investment proves too difficult for small practices, these practices may not be able to count on PCMH-based payments in order to survive the “crisis” in primary care;14,43 structural improvements in primary care could come at the price of new access problems in areas predominantly served by small practices. As PCMH pilot programs go forward, policy makers should monitor their effects on both quality and access to primary care.



Acknowledgments

We thank Katherine Howitt, M.A., for invaluable assistance in fielding the survey. We thank the Massachusetts Medical Society for helpful comments on early drafts of the survey instrument and for its encouragement of physician respondent participation.

This study was supported by the Commonwealth Fund. Dr. Friedberg was supported by a National Research Service Award from the Health Resources and Services Administration (5 T32 HP11001 20).

Conflict of Interest None disclosed.



Prior presentations

Partial results from this paper were orally presented at the National Meeting of the Society of General Internal Medicine in Pittsburgh, PA, on April 12, 2008.



References

1. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, D.C.: National Academies Press; 2001. [PubMed]
2. Shortell SM, Rundall TG, Hsu J. Improving patient care by linking evidence-based medicine and evidence-based management. JAMA. 2007;298(6):673–676. Aug 8. [PubMed]
3. Chassin MR, Galvin RW. The urgent need to improve health care quality. Institute of Medicine National Roundtable on Health Care Quality. JAMA. 1998;280(11):1000–1005. Sep 16. [PubMed]
4. Grumbach K, Bodenheimer T. A primary care home for Americans: putting the house in order. JAMA. 2002;288(7):889–893. Aug 21. [PubMed]
5. Bergeson SC, Dean JD. A systems approach to patient-centered care. JAMA. 2006;296(23):2848–2851. Dec 20. [PubMed]
6. Casalino LP. Disease management and the organization of physician practice. JAMA. 2005;293(4):485–488. Jan 26. [PubMed]
7. Bodenheimer T, Lorig K, Holman H, Grumbach K. Patient self-management of chronic disease in primary care. JAMA. 2002;288(19):2469–2475. Nov 20. [PubMed]
8. Institute of Medicine. Performance Measurement: Accelerating Improvement. Washington, D.C.: The National Academies Press; 2006.
9. Jamtvedt G, Young JM, Kristoffersen DT, O’Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;19(2):CD000259. [PubMed]
10. Institute of Medicine. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, D.C.: National Academies Press; 2002. [PMC free article] [PubMed]
11. Murray M, Berwick DM. Advanced access: reducing waiting and delays in primary care. JAMA. 2003;289(8):1035–1040. Feb 26. [PubMed]
12. Bates DW, Ebell M, Gotlieb E, Zapp J, Mullins HC. A proposal for electronic medical records in U.S. primary care. J Am Med Inform Assoc. 2003;10(1):1–10. Jan–Feb. [PMC free article] [PubMed]
13. Bodenheimer T, Grumbach K. Electronic technology: a spark to revitalize primary care? JAMA. 2003;290(2):259–264. Jul 9. [PubMed]
14. Bodenheimer T. Primary care-will it survive? N Engl J Med. 2006;355(9):861–864. Aug 31. [PubMed]
15. American Academy of Family Physicians, American Academy of Pediatrics, American College of Physicians, American Osteopathic Association. Joint principles of the patient-centered medical home. March 2007; Accessed September 30, 2008.
16. American College of Physicians. The advanced medical home: a patient-centered physician-guided model of health care. January 2006; Accessed September 30, 2008.
17. National Committee for Quality Assurance. NCQA Program to Evaluate Patient-Centered Medical Homes: Press Release; 2008, January 8.
18. Martin JC, Avant RF, Bowman MA, et al. The Future of Family Medicine: a collaborative project of the family medicine community. Ann Fam Med. 2004;2(Suppl 1):S3–S32. Mar–Apr. [PubMed]
19. Davis K, Schoenbaum SC, Audet AM. A 2020 vision of patient-centered primary care. J Gen Intern Med. 2005;20(10):953–957. Oct. [PMC free article] [PubMed]
20. Goroll AH, Berenson RA, Schoenbaum SC, Gardner LB. Fundamental reform of payment for adult primary care: comprehensive payment for comprehensive care. J Gen Intern Med. 2007;22(3):410–415. Mar. [PMC free article] [PubMed]
21. Tax Relief and Health Care Act of 2006, Pub. L. No. 109–432, 120 Stat. 2922 (Dec 20, 2006).
22. Robinson JC. Consolidation of medical groups into physician practice management organizations. JAMA. 1998;279(2):144–149. Jan 14. [PubMed]
23. Robinson JC, Casalino LP. Vertical integration and organizational networks in health care. Health Aff. 1996;15(1):7–22. Spring. [PubMed]
24. Berenson RA, Hammons T, Gans DN, et al. A house is not a home: keeping patients at the center of practice redesign. Health Aff. 2008;27(5):1219–1230. Sep–Oct. [PubMed]
25. Audet AM, Doty MM, Shamasdin J, Schoenbaum SC. Measure, learn, and improve: physicians’ involvement in quality improvement. Health Aff. 2005;24(3):843–853. May–Jun. [PubMed]
26. Friedberg MW, Coltin KL, Pearson SD, et al. Does affiliation of physician groups with one another produce higher quality primary care? J Gen Intern Med. 2007;22(10):1385–1392. Oct. [PMC free article] [PubMed]
27. MHQP website. Accessed September 30, 2008.
28. Simon SR, Kaushal R, Cleary PD, et al. Correlates of electronic health record adoption in office practices: a statewide survey. J Am Med Inform Assoc. 2007;14(1):110–117. Jan–Feb. [PMC free article] [PubMed]
29. Casalino L, Gillies RR, Shortell SM, et al. External incentives, information technology, and organized processes to improve health care quality for patients with chronic diseases. JAMA. 2003;289(4):434–441. Jan 22–29. [PubMed]
30. Mehrotra A, Epstein AM, Rosenthal MB. Do integrated medical groups provide higher-quality medical care than individual practice associations? Ann Intern Med. 2006;145(11):826–833. Dec 5. [PubMed]
31. Li R, Simon J, Bodenheimer T, et al. Organizational factors affecting the adoption of diabetes care management processes in physician organizations. Diabetes Care. 2004;27(10):2312–2316. Oct. [PubMed]
32. Institute of Medicine. Key Capabilities of an Electronic Health Record System. Washington, D.C.: National Academies Press; 2003.
33. Bodenheimer T, Wang MC, Rundall TG, et al. What are the facilitators and barriers in physician organizations’ use of care management processes? Jt Comm J Qual Saf. 2004;30(9):505–514. Sep. [PubMed]
34. Shortell SM, Marsteller JA, Lin M, et al. The role of perceived team effectiveness in improving chronic illness care. Med Care. 2004;42(11):1040–1048. Nov. [PubMed]
35. Liang K, Zeger S. Longitudinal data analysis using generalized linear models. Biometrika. 1986;73:13–22.
36. Zeger SL, Liang KY. Longitudinal data analysis for discrete and continuous outcomes. Biometrics. 1986;42(1):121–130. Mar. [PubMed]
37. Schmittdiel J, McMenamin SB, Halpin HA, et al. The use of patient and physician reminders for preventive services: results from a National Study of Physician Organizations. Prev Med. 2004;39(5):1000–1006. Nov. [PubMed]
38. Rittenhouse DR, Casalino LP, Gillies RR, Shortell SM, Lau B. Measuring the medical home infrastructure in large medical groups. Health Aff. 2008;27(5):1246–1258. Sep–Oct. [PubMed]
39. Casalino LP, Devers KJ, Lake TK, Reed M, Stoddard JJ. Benefits of and barriers to large medical group practice in the United States. Arch Intern Med. 2003;163(16):1958–1964. Sep 8. [PubMed]
40. Burt CW, Sisk JE. Which physicians and practices are using electronic medical records? Health Aff. 2005;24(5):1334–1343. Sep–Oct. [PubMed]
41. Simon SR, Kaushal R, Cleary PD, et al. Physicians and electronic health records: a statewide survey. Arch Intern Med. 2007;167(5):507–512. Mar 12. [PubMed]
42. Donabedian A. Evaluating the quality of medical care. The Milbank Memorial Fund Quarterly. 1966;44(3):166–203. [PubMed]
43. Bodenheimer T. Coordinating care-a perilous journey through the health care system. N Engl J Med. 2008;358(10):1064–1071. March 6. [PubMed]
