JAMIA - The Journal of the American Medical Informatics Association
J Am Med Inform Assoc. 2009 Nov-Dec; 16(6): 775–783.
PMCID: PMC3002122

Clinical Case Registries: Simultaneous Local and National Disease Registries for Population Quality Management

Lisa I. Backus, MD, PhD,a,* Sergey Gavrilov, MS,b Timothy P. Loomis, PhD,a James P. Halloran, RN, MSN, CNS,a Barbara R. Phillips, PhD,a Pamela S. Belperio, PharmD,a and Larry A. Mole, PharmDa


Abstract

The Department of Veterans Affairs (VA) has a system-wide, patient-centric electronic medical record system (EMR) within which the authors developed the Clinical Case Registries (CCR) to support population-centric delivery and evaluation of VA medical care. To date, the authors have applied the CCR to populations with human immunodeficiency virus (HIV) and hepatitis C virus (HCV). Local components use diagnosis codes and laboratory test results to identify patients who may have HIV or HCV and support queries on local care delivery with customizable reports. For each patient in a local registry, key EMR data are transferred via HL7 messaging to a single national registry. From 128 local registry systems, over 60,000 and 320,000 veterans in VA care have been identified as having HIV and HCV, respectively, and entered in the national database. Local and national reports covering demographics, resource usage, quality of care metrics and medication safety issues have been generated.

Introduction and Background

The US Department of Veterans Affairs (VA) operates the largest medical system in the United States with 153 medical centers, 745 community-based outpatient clinics, 135 nursing homes, 43 domiciliaries, and 225 readjustment counseling centers grouped into 21 Veterans Integrated Service Networks (VISNs) around the United States. In Federal Government fiscal year 2008, the VA provided care to 5.5 million veterans. 1

The VA has a long and successful history of using information technology in delivering high-quality health care. In the late 1970s, development started on the VA electronic medical record system (EMR), now known as VistA (Veterans Health Information Systems and Technology Architecture). 2 CPRS (Computerized Patient Record System), the graphical user interface (GUI) component of VistA, was released in the late 1990s. The installation of CPRS was mandated nationally in 1999 and thus, medical documentation and ordering have been fully computerized at all VA facilities for almost a decade.

From its inception, the EMR software relied on flexible local configuration while requiring that code be portable across the entire VA. While a common structure was mandated for data dictionaries, local users were given control over the content. For example, each site has local legacy files for medication and laboratory definitions. While the system has successfully met the needs of local users, the local legacy systems must be mapped to a common set of codes when data aggregation is required, which is often a major undertaking. Furthermore, allowing free-text data entry in most fields, another flexible feature, poses major problems for machine interpretation of the data when portability is required.

Numerous decision-support tools have been developed within VistA/CPRS that prompt the health care provider on a range of issues such as appropriate medication dosing, potential medication interactions, preventative medicine recommendations, and routine clinical condition monitoring. These tools function on a local level once a clinician has accessed an individual patient's medical record.

Despite the availability of these support tools, VistA/CPRS does not provide the local clinician or administrator with ready ability to assess the care of a population of individuals. For example, decision support tools prompt a clinician seeing a diabetic patient regarding delivery of appropriate preventative services, but a clinician or administrator cannot readily assess the frequency with which these same preventative services are provided to a group of diabetic patients. Being able to generate population information for a distinct location and to compare that location to other locations or to the national VA system or external systems is critical to assessing safe, effective, and cost-effective care.

The VA has developed, implemented, and refined a registry system, the Clinical Case Registries (CCR), which uses the established EMR to identify cohorts of veterans with targeted conditions at the local and national level. The CCR incorporated lessons learned from two earlier VA registries: one for human immunodeficiency virus (HIV), known as the Immunology Case Registry (ICR), 3 and one for hepatitis C virus (HCV), known as the Hepatitis C Case Registry (HCCR). These two registries supported only limited local reports and transmitted limited data elements to a national database. In contrast, in the CCR, local clinicians can generate customizable reports on a variety of issues including resource usage, medication safety and quality of care on groups within their local population. On a national level, the CCR collects and aggregates selected clinical and demographic data from the EMR and structures it in a modern relational database system for statistical analysis. This system was designed and implemented in the context of an already well-established EMR. We believe that our experience designing this system of local and national registries and moving data from local legacy medical-records systems into a relational database system, while maintaining data integrity, quality and security, should prove useful to others undertaking similar or related tasks.

Design Objectives

Purpose and Intended Users

The purpose of the CCR is to support accurate identification of a population locally and nationally, provide customizable reporting on the local population, and create a national database to support administrative, clinical and research needs, including quality management.

We identified local clinicians, local and national healthcare administrators, quality management staff, and VA researchers as potential users of the CCR. The VA clinical staff based at the Center for Quality Management in Public Health (CQM) is the primary user of the national database. The CQM staff provided software specifications and conducted extensive alpha and beta testing on both local and national CCR software.

Desired Performance

In the design process, we identified several important performance characteristics of a successful population-management system. First, the system should identify a target population with a high degree of sensitivity and specificity. Second, the system should provide local and national report-generation capabilities that can be customized by the end user. Third, the local system should be simple to learn and easy to use, and local reports must run quickly. Fourth, because veterans receive care at multiple facilities, data from all medical facilities should be aggregated to the unique-patient level. Fifth, the local and national systems had to operate in a changing clinical environment. Sixth, local, regional and national level data on agreed-upon measures should be available to permit comparisons between local, regional and national performance. Finally, both the local and national systems had to be secure.

System Description

Multiple Registries

The CCR software allows multiple registries, each with registry-specific data elements. The same patient may be included in multiple registries. A patient is entered once in the local CCR when selected for his/her first registry with flags set to identify entry into the first and any subsequent registry.

The CCR has two registries defined: HIV and HCV. These two chronic viral infections share risk factors and about one third of veterans in the HIV registry are also in the HCV registry.

Local Case Identification and Confirmation

The local CCR software package resides within the VistA environment at 128 VA reporting facilities, which encompass data from all VA care sites. To identify patients who may have HIV or HCV, the CCR software scans VistA records each night for new occurrences of a set of International Classification of Diseases, 9th Revision (ICD-9) codes in inpatient, outpatient visit, and problem-list records. The software also scans the result field of a set of laboratory tests identified by Logical Observation Identifiers Names and Codes (LOINC) for any new occurrences of a “positive” test result. This result field is a free-text field. Using data from the ICR, we developed an algorithm that classifies a result as positive if it equals “P” or contains “POS,” “DETEC,” or “REACT” and does not contain “NEG,” “NO,” or “IND.” A patient identified as meeting the electronic criteria for a target condition is called a “pending” patient.
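The text-matching heuristic above can be sketched as a small function; this is a minimal illustration of the stated rules, not the production VistA code:

```python
def is_positive(result):
    """Classify a free-text laboratory result using the CCR heuristic:
    positive if the result equals "P" or contains "POS", "DETEC", or
    "REACT", and does not contain "NEG", "NO", or "IND"."""
    text = result.strip().upper()
    if any(tok in text for tok in ("NEG", "NO", "IND")):
        return False
    return text == "P" or any(tok in text for tok in ("POS", "DETEC", "REACT"))
```

Note that the exclusion tokens deliberately catch common negations: "NON-REACTIVE" is rejected because it contains "NO", and "NOT DETECTED" for the same reason.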

At each of the 128 reporting facilities, a local staff member, referred to as a “CCR coordinator,” is responsible for a periodic, manual review of the list of pending patients. The coordinator uses a GUI interface in the CCR software, which identifies each pending patient at that facility and displays the initial diagnoses and/or laboratory results indicative of the target condition as well as subsequent information that might help to confirm infection. Patients who have the condition of interest are confirmed manually by the coordinator and added to the list of local registry patients. That night, information on confirmed patients is transmitted and added to the national registry.

Patients who are not confirmed as having the condition of interest are manually deleted from the list of pending patients. After a patient has been deleted from the pending list, date triggers ensure that any new diagnosis or laboratory test result indicative of the target condition causes that patient to be added to the pending list once again.

Local Data Security and Access to the CCR

Security of medical health information is a top priority for VA. Since the local CCR contains information protected under federal law, access to it must be restricted. Access to the CCR software package is managed through the VistA software structure of security key and menu assignment. Local facilities are requested to restrict access to the CCR to those who need access for routine medical care, quality and safety initiatives, administrative oversight and, when approved in accordance with regulations, for research. Access is provided at three levels. Office of Information Technology (OIT) staff hold programmer keys which permit full access. Staff with administrator keys can process pending patients, create local customized fields, set up local report parameters, and run local reports. Staff with user keys can only run local reports.

Local Fields and Local Registry Management Tools

Users with administrator keys can create an unlimited number of local user-definable fields to “flag” subsets of patients in a single registry. For example, local staff might want to identify all veterans in the CCR: HCV who have successfully completed HCV antiviral therapy. A local field can be created and, for each patient to whom it applies, the flag can be set and a user-defined date entered. Information from local definable fields is not transmitted to the national database.

In addition, the local CCR software supports fields completed by local users that are transmitted to the national database. Currently, such local user-completed fields are required for the CCR: HIV. The local coordinator must manually enter a patient's risk factors for HIV and whether the patient had a prior clinical AIDS diagnosis because this information cannot be obtained from existing VistA data fields.

Local Reporting

The local CCR software provides extensive local population-based and patient-based reports on patients confirmed into the local registry. These reports were developed to meet several basic needs of front-line clinicians and local administrators. There are 16 reports available, roughly categorized as (1) registry reports, (2) clinical reports, and (3) usage reports, although the uses overlap. The registry reports concern basic demographics of patients confirmed in the local registry. Usage reports concern inpatient stays, laboratory tests, outpatient prescriptions, outpatient visits, procedures, radiology and summary usage. Clinical reports concern patient follow-up at outpatient clinics, current hospitalization, diagnoses, medication histories, and combinations of medication and laboratory results.

To produce these local reports, local VistA data are queried. The reporting software accounts for variability in local dictionaries and the addition of new medications. All local laboratory names ever used for key tests of the conditions of interest are identified by the local coordinator and stored as local report parameters. When new disease-specific medications become available, local report files of National Drug Code (NDC) numbers for registry-related medications are updated via release of a patch. In addition, for all reports, the user is prompted to select the data elements from the local dictionaries. Report-definition selections can be saved in a template format by a user for use in future queries. A report can be scheduled to run at a particular time of day (to limit impact during office hours), on a particular day, or at a user-defined interval. Improvements to the reports are made periodically and additional local reports have been added via the release of patches to the software.

Users can customize local reports in several ways. Users can define the target date range, select aggregate or individual data, and select particular groups of data elements (e.g., diagnoses or medications). The population can be set to include or exclude patients who have a particular diagnosis or who have been identified with a local flag as described above. The population can also be limited to those receiving care in a time period where “care” is defined as having any of the following types of usage: inpatient stay, laboratory test, outpatient prescription fill, outpatient visit or radiology procedure. In addition, users can create groups of medications with “or” logic within groups and “and” logic between groups. Moreover, CCR users with access to multiple registries can limit the report population to patients who are, or are not, in the other registry. Thus, a user of the CCR: HIV can opt to include only those patients also in the local CCR: HCV, effectively creating population reports for the HCV-HIV coinfected population.
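The medication-group logic ("or" within groups, "and" between groups) reduces to a simple set test; a sketch, with illustrative medication names in the test data:

```python
def patient_matches(patient_meds, med_groups):
    """CCR-style medication group logic: `patient_meds` is a set of
    medication names a patient received; `med_groups` is a list of sets.
    A patient matches when, for every group ("and" between groups), he or
    she received at least one medication in that group ("or" within a
    group)."""
    return all(group & patient_meds for group in med_groups)
```

The empty intersection of any single group fails the whole query, mirroring the "and" between groups.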

The clinical “Combined Meds and Lab Report” permits the query of both pharmacy and laboratory data. The power of this report resides in the logic that can be applied between prescription and laboratory data. The user can qualify a query based on who did or did not receive a group of medications in combination with receiving or not receiving a laboratory test or receiving a test with results in a selected range. Using this report, for example, one can list all HIV patients whose most recent cholesterol level was above a particular threshold and who are not receiving any lipid-lowering medications.
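As an illustration of this combined medication and laboratory logic, the cholesterol example might be expressed as follows; the record layout, field names, and the 240 mg/dL threshold are assumptions for illustration only:

```python
def untreated_high_cholesterol(patients, threshold=240.0):
    """List patient ids whose most recent total cholesterol exceeds
    `threshold` and who have no lipid-lowering prescription. Each patient
    is a dict with hypothetical keys: 'id', 'cholesterol' (a list of
    (date, value) pairs), and 'on_lipid_meds' (bool)."""
    flagged = []
    for p in patients:
        results = sorted(p["cholesterol"])  # (date, value), oldest first
        if results and results[-1][1] > threshold and not p["on_lipid_meds"]:
            flagged.append(p["id"])
    return flagged

# Invented demonstration data
demo = [
    {"id": "A", "cholesterol": [("2007-01", 250.0), ("2007-06", 260.0)], "on_lipid_meds": False},
    {"id": "B", "cholesterol": [("2007-03", 280.0)], "on_lipid_meds": True},
    {"id": "C", "cholesterol": [("2007-02", 180.0)], "on_lipid_meds": False},
]
```

Here only patient "A" is flagged: "B" is already on lipid-lowering medication and "C" is below the threshold.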

Local Data Packaging and Transmission to the National Database

The goal of the national CCR database is to create a concise and timely snapshot of key health care data that maximizes the utility of the information for administrative, quality of care, safety, and research activities. Following consultation with both clinical and administrative end users, we narrowed the data selection for transmission to the national CCR database (Table 1).

Table 1. Key Data Elements Extracted by CCR

On a nightly basis, the local CCR software uses standard VistA application programming interfaces (APIs) to extract data from the VistA packages, assemble HL7 messages and transmit them via TCP/IP to the national database. The local CCR software extracts all new relevant data elements for all patients in the registry based on identified date fields in each package. Wherever possible, records include a patient identifier, usage date, record modification date, location, provider information, coding standard where applicable, result, and any qualifiers or comments. For additional data security, the patient identifiers are a locally assigned number and an encoded Social Security Number (SSN) and not a direct patient identifier such as name or the actual SSN. Extraction to the national CCR is lagged 7 days after a record's initial creation to allow for correction or expansion of a record shortly after its creation, which occurs frequently. For example, laboratory test results are often added to a laboratory record a few days after the specimen was submitted and the initial record created. Since a single veteran may appear in multiple registries, the HL7 message is constructed to allow multiple copies of registry data elements but only one copy of clinical data elements. When a new patient is confirmed into the registry, the CCR software completes a one-time “back pull” of all the VistA data elements included in the national database for that patient, thereby capturing his or her historical data. An HL7 message size limit of 5GB was required to accommodate network traffic policy and message processing restrictions in place at CCR launch.

National Database for Data Acquisition (Production)

At the national database, e*Gate (SeeBeyond, now Sun Microsystems) is used to receive the HL7 message, acknowledge receipt to the local site, and forward messages to customized parsing software. The acknowledgment receipt also confirms the period for the next nightly extract. The custom-coded parser software performs four key duties. First, after checking that the message structure is valid, the parser initiates message parsing into a staging database (currently Oracle 10g). Second, because the HL7 messaging format is not space efficient, all valid messages are compressed and saved on a separate server for source data backup. The compressed files require 1/20th of the storage space required by the HL7 messages. Third, a copy of each message is forwarded to the development server where additional work can be conducted on data validation, integrity, and standardization. Finally, the parsing software completes the migration of data from the staging database to the clean database by merging new or revised data with existing data.

The national database structure was optimized over VistA to limit data duplication. Even with the 7-day delay in data extraction from local VistA files, later changes in data that result in retransmission of data could result in duplicate entries. To prevent duplication, record keys (outside of the patient identifier) used in the local VistA files are maintained in the national CCR database. When data are retransmitted, the unique record key identifies the record as pre-existing, and the newly transmitted data overwrite the pre-existing record.
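The overwrite-on-retransmission behavior can be illustrated with an upsert keyed on the facility and local record key; the table layout below is invented for illustration, not the actual CCR schema:

```python
import sqlite3

# In-memory stand-in for the national database. The composite primary key
# (facility, record_key) means a retransmitted record replaces rather than
# duplicates the earlier copy.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE lab_result (
    facility   TEXT,
    record_key TEXT,
    patient_id TEXT,
    result     TEXT,
    PRIMARY KEY (facility, record_key))""")

def load(facility, record_key, patient_id, result):
    # INSERT OR REPLACE keyed on (facility, record_key) implements the
    # overwrite-on-retransmission behavior described in the text.
    conn.execute("INSERT OR REPLACE INTO lab_result VALUES (?, ?, ?, ?)",
                 (facility, record_key, patient_id, result))

load("640", "LAB-1001", "P1", "PENDING")
load("640", "LAB-1001", "P1", "POSITIVE")   # retransmission after correction
rows = conn.execute("SELECT result FROM lab_result").fetchall()
# a single row remains, holding the corrected result
```

An `INSERT OR REPLACE` (or a `MERGE` in Oracle) keyed this way keeps the national table one-row-per-source-record regardless of how often a record is retransmitted.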

As of January 2008, the production database was approximately 400 GB in size. The largest table is laboratory records with over 450 million rows; it dwarfs the next largest table, outpatient diagnoses with approximately 90 million rows. The amount of data collected per patient varies and depends on the amount of care a patient uses, which in turn depends on the severity of his or her target disease, the existence of other medical conditions and extent of pharmacological treatment.

National Database for Analysis

Because of the nightly addition of data, analyses requiring static data cannot use the production database. On a quarterly basis, static “snapshots” of the production database are created on a separate server for use in a variety of healthcare operations. For each patient, the national CCR receives both a locally assigned patient identifier and an encoded SSN such that all information for a unique patient can be aggregated based on the encoded SSN. If a change is made locally in a patient's SSN, the national database will receive the new encoded SSN without a change in the local patient identifier. Such inconsistencies are flagged and then reviewed for adjudication.
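The encoded-SSN aggregation and the inconsistency flagging might look like this in outline (identifiers invented):

```python
from collections import defaultdict

def unique_patients(records):
    """Aggregation sketch: registry records from many facilities are grouped
    on the encoded SSN, so one veteran seen at several sites counts once.
    Each record is (facility, local_patient_id, encoded_ssn)."""
    by_ssn = defaultdict(set)
    for facility, local_id, enc_ssn in records:
        by_ssn[enc_ssn].add((facility, local_id))
    return by_ssn

def flag_ssn_changes(records):
    """Flag (facility, local id) pairs seen with more than one encoded SSN --
    the inconsistency the text says is reviewed for adjudication."""
    seen = defaultdict(set)
    for facility, local_id, enc_ssn in records:
        seen[(facility, local_id)].add(enc_ssn)
    return [key for key, ssns in seen.items() if len(ssns) > 1]

# Invented demonstration data: three local records, two unique patients
recs = [("640", "12", "ENC-A"), ("662", "98", "ENC-A"), ("640", "13", "ENC-B")]
```

A local SSN correction appears as the same (facility, local id) pair arriving with a new encoded SSN, which `flag_ssn_changes` surfaces for review.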

Prior to any analysis of these quarterly snapshots, additional work is required to standardize various data elements. Several data elements in the national database incorporate the standardized coding systems of Current Procedural Terminology (CPT-4) codes for procedures, ICD-9 codes for diagnoses, LOINC codes for laboratory tests and NDC numbers for medication names. For data fields that rely on local mapping to a coding standard, we have developed national custom-coded mapping tools to identify and repair errors made in the local maps. For example, each site must locally map every local laboratory test name to a LOINC code; inevitably errors are made in the process and these are corrected in the national database. Currently, remapping on the national database exists for laboratory tests and medication names. For many of the large, text-only fields (e.g., radiology reports), we do not perform any standardization and must rely on ad hoc free-text queries.
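A national fix-up table that overrides erroneous local LOINC maps can be sketched as a two-level lookup; all map entries below are invented for illustration:

```python
# A site's local laboratory-name -> LOINC map, which may contain errors.
local_map = {
    ("640", "HIV AB SCREEN"): "0000-0",     # mis-mapped at the site (invented code)
    ("662", "HIV-1 ANTIBODY"): "7917-8",
}

# Corrections maintained centrally; checked before the local map.
national_fixes = {
    ("640", "HIV AB SCREEN"): "7917-8",
}

def loinc_for(facility, test_name):
    """Resolve a (facility, local test name) pair to a LOINC code, letting
    a national correction override the site's own mapping when present."""
    key = (facility, test_name)
    return national_fixes.get(key, local_map.get(key))
```

The design keeps local maps untouched at the sites; only the national copy of the data is remapped at query time.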

Using these snapshots, we conduct scheduled and ad hoc queries related to resource usage, administration, operations, quality and drug safety. Datasets from the static snapshots are provided to VA researchers through a regularly scheduled application process and to other VA program offices for operational activities such as budget planning.

National Database Monitoring and Data Validation

As with any secondary database involving complex data integration, the national CCR may not be fully consistent with the local source data. The size of the database and the nightly addition of data make full validation too costly to complete in a timely manner. Instead, we approach data validation in three ways. First, we conducted an extensive review of initial data received from every facility in every field. Second, on an ongoing basis, we undertake monthly, facility-level data field counts to assess trends for each facility with the goal of quickly identifying missing data. Third, we include validation in all aspects of our quality and safety national initiatives. For example, if we need to investigate a drug safety issue related to Drug X causing abnormal Lab Y, then we will conduct an initial validation analysis on both Drug X and Lab Y to ensure that we are collecting information from each local site at the expected rate.
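The monthly facility-level count monitoring could be sketched as a month-over-month drop detector; the 50% drop threshold is an assumption for illustration, the paper does not state one:

```python
def missing_data_alerts(monthly_counts, drop_fraction=0.5):
    """For each facility and data field, compare each month's record count
    to the prior month and flag sharp drops that suggest missing
    transmissions. `monthly_counts` maps (facility, field) to a list of
    consecutive monthly counts; returns (facility, field, prev, cur)
    tuples for every flagged drop."""
    alerts = []
    for (facility, field), counts in monthly_counts.items():
        for prev, cur in zip(counts, counts[1:]):
            if prev > 0 and cur < prev * (1 - drop_fraction):
                alerts.append((facility, field, prev, cur))
    return alerts
```

A facility whose laboratory record count falls from 95 to 30 in a month would be flagged for follow-up, while normal month-to-month variation passes silently.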

Status Report

Implementation of CCR Software

The CCR software was launched in March 2004 to create an HIV registry. The HIV registry was selected as the initial registry given the experience with the ICR and to permit validation of both local and national registries with a relatively small cohort. The CCR: HIV was auto-populated with the patients in the ICR—a registry for which case identification was completely manual. The auto-populated data consisted of 73,171 local registry patients representing 56,320 unique patients. An historical data query using the diagnosis and laboratory case identification criteria of the CCR: HIV identified another 23,746 patients as pending. These patients had not been in the ICR and hence were not in the CCR: HIV. Of these previously unidentified pending patients, 5,326 (22%) were manually confirmed by local staff into the CCR: HIV after review of local information including text notes and scanned documents not amenable to machine review. Their addition increased the number of local registry patients by 7%, suggestive of the extent of undercounting in a completely manual system relative to an automated system with manual confirmation.

Validation checks identified some errors in data transmission to the national CCR and in local report functions. After three patch releases and additional data validation, a fourth patch was released in April 2005 to perform a one-time back pull of all national registry data elements available from 1/1/85 to the present on all CCR: HIV patients.

In February 2006, an HCV registry was added to the CCR software via a patch. The HCV patient lists were migrated from the HCCR, which had automatic addition of patients based on diagnosis codes and laboratory test results. Overall, 380,862 local registry patients representing 366,527 unique patients were migrated from the HCCR to the CCR: HCV. Since the HCCR had ongoing automatic addition of patients, there was no historical sweep to identify missing cases. At the implementation of the CCR: HCV, a one-time historical back pull of all national registry data elements for HCV-infected veterans was conducted at each of the reporting VA facilities. Each reporting facility continues to transmit 1 or 2 HL7 messages per night with parsing and database load times completed within a few hours.

Continuing Case Identification

As detailed above, the local CCR software performs a nightly search to identify potential new cases. Since the initial installation of the CCR, the software has successfully identified thousands of patients who have subsequently been confirmed by local coordinators into the local registries and transmitted to the national CCR. Because we do not currently collect information about individual pending patients, we cannot determine exactly what proportion of pending patients have been confirmed into the registry.

After the installation of the CCR: HIV and the clearing of the initial historical backlog of pending CCR: HIV patients, approximately 3,000 new local registry patients have been added each year to the local CCR: HIV (3,069 in 2004, 3,222 in 2005, 2,965 in 2006 and 3,121 in 2007). These new local registry patients represent 12,298 new unique patients added to the national CCR: HIV. Since the initial installation of the CCR: HCV, 37,169 new local registry patients have been added (10,293 in 2006 and 26,876 in 2007); they represent 37,064 new unique patients added to the national CCR: HCV.

Populations in the CCR

As of Dec 31, 2007, 92,740 local registry patients had been entered in local CCR: HIV registries and transmitted to the national database, representing 63,109 unique patients. The total number of patients in the local CCR: HIV registry lists ranged from 2 to 5,228, median 448 and interquartile range 167–954.

In the CCR: HCV, 424,104 local registry patients had been entered in local CCR: HCV registries and transmitted to the national database as of Dec 31, 2007, representing 324,065 unique patients. The total number of patients in the local CCR: HCV registries ranged from 95 to 10,956, median 2,782 and interquartile range 1,523–4,440.

Validation of the Registry Populations

Some clinical data from the early 1990s are not available in the national CCR due to archiving of local patient records before the one-time historical back pulls. Given these data restrictions, we limited validation to the subset of patients who were in VA care at any time in the 10 years between 1998 and 2007. “In care” is defined as a record in the CCR during those 10 years of an outpatient visit, outpatient prescription fill or inpatient stay. The 10-year in-care populations consist of 36,841 patients in the CCR: HIV and 317,998 patients in the CCR: HCV.

Both HIV and HCV are diagnosed based in part on laboratory testing. To assess the accuracy with which patients are included in the national CCR, we first determined the number of patients who have laboratory data sufficient to inform their inclusion in the registry. For patients without such laboratory data, we analyzed outpatient medication information to identify patients who had received HIV and HCV antiviral medications that would support a diagnosis of HIV or HCV.


The laboratory tests analyzed for HIV infection were HIV antibody, western blot and HIV viral load. Results for antibody and western blot tests that could be categorized as negative or positive were considered informative, and results for viral load tests that could be categorized as detectable or not detectable were considered informative. Patients were categorized as positive (P) for HIV if the most recent informative western blot test had a positive result, or the most recent informative antibody test had a positive result (without a confirmatory western blot test) or the patient had a detectable HIV viral load at any time (without a negative antibody test or negative western blot on the same day or more recently). Patients were categorized as viral load only (VL) if all available results for HIV viral loads were undetectable and the patient had no informative antibody or western blot results. The remaining patients were categorized as indeterminate (I) if all available results were not informative (e.g., “see report” or “indeterminate”), untested (U) if the patient had no HIV tests or negative (N) if the most recent western blot test was negative or the most recent antibody test was negative.
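The categorization rules above can be captured in code; the following is one plausible reading of the stated precedence, not the authors' implementation:

```python
def hiv_status(tests):
    """Categorize a patient's HIV laboratory evidence per the validation
    rules in the text: P (positive), VL (viral load only), I
    (indeterminate), U (untested), N (negative). `tests` is a list of
    (date, kind, result) with kind in {"ab", "wb", "vl"}; result is
    "pos"/"neg" for antibody/western blot, "det"/"undet" for viral load,
    or None when uninformative."""
    if not tests:
        return "U"                      # no HIV tests at all
    ab = sorted(t for t in tests if t[1] == "ab" and t[2] in ("pos", "neg"))
    wb = sorted(t for t in tests if t[1] == "wb" and t[2] in ("pos", "neg"))
    vl = sorted(t for t in tests if t[1] == "vl" and t[2] in ("det", "undet"))
    if wb and wb[-1][2] == "pos":
        return "P"                      # most recent informative western blot positive
    if ab and ab[-1][2] == "pos" and not wb:
        return "P"                      # positive antibody without a confirmatory western blot
    for d, _, r in vl:
        if r == "det" and not any(t[0] >= d for t in ab + wb if t[2] == "neg"):
            return "P"                  # detectable VL, no negative ab/wb same day or later
    if vl and not ab and not wb:
        return "VL"                     # all reachable viral loads here are undetectable
    if (wb and wb[-1][2] == "neg") or (ab and ab[-1][2] == "neg"):
        return "N"
    return "I"                          # tests exist but none is informative
```

Dates need only be comparable (integers or `datetime.date` both work); the order of the checks encodes the precedence of the rules as stated in the text.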

Among patients in the CCR: HIV and in VA care in the 10 years from 1998–2007, 33,040 (89.7%) had CCR: HIV laboratory evidence of HIV infection (Table 2). Among the 1,279 patients with laboratory status of VL, 1,124 (87.9%) had received HIV antiviral medication. The latter are likely patients who transfer to VA care already on HIV antiviral medication with undetectable viral loads who do not undergo antibody or western blot testing in VA. Smaller percentages of those categorized as I or U had received HIV antivirals. Patients with status of I may have laboratory test results (e.g., “Call lab for result”) that are available to the local coordinator who confirms the patient in the registry and to the local provider who prescribes the HIV antivirals but not to the national CCR. Similarly, patients with status of U may have laboratory tests done outside the VA that are not entered in the VA EMR but are available to the local coordinator and to the VA provider. Two hundred ninety-five (0.8%) of patients in the CCR: HIV in care from 1998–2007 have laboratory and pharmacy evidence that they are in the CCR: HIV registry in error; these veterans were identified as negative by laboratory criteria and had not received HIV antivirals from VA. Overall, of the 36,841 patients in the CCR: HIV in VA care in the past 10 years, 94.2% have laboratory or prescription confirmation of HIV infection, 5.0% have no laboratory or pharmacy confirmation in the national CCR and may be in the CCR: HIV in error and 0.8% definitively appear to be in the CCR: HIV in error.

Table 2. Laboratory and Prescription Medication Validation of Patients in the CCR: HIV From 1998–2007 and in 2007 Only


The target cohort for the CCR: HCV is those veterans with a history of chronic hepatitis C infection. Unlike HIV, infection with HCV spontaneously clears in 20–25% of those infected. 4 Moreover, the available HCV antiviral treatments have a low but significant rate of “virologic cure” in those patients who tolerate and adhere to treatment. 5,6 The HCCR software automatically added a patient with a positive HCV antibody test result regardless of whether the patient had evidence of chronic HCV infection. Chronic HCV infection is defined by a detectable HCV viral load or an identifiable HCV genotype. Our decision to migrate all veterans in the HCCR into the CCR: HCV led to inclusion in the CCR: HCV of thousands of patients who were not chronically infected. At the time, we believed that many VA facilities did not have the resources to implement manual review of all 380,000 cases identified as possible HCV infection from the HCCR. Faced with this situation, we took a gradual approach. With the implementation of the manual-confirmation process of the CCR: HCV, local coordinators were instructed to confirm only those patients with chronic HCV infection as evidenced by a detectable HCV viral load or an identifiable HCV genotype. In addition, national reports can exclude those patients without evidence of chronic HCV infection.

For HCV validation, we analyzed HCV antibody, recombinant immunoblot assay (RIBA), genotype, and HCV viral load tests. Antibody and RIBA test results that could be categorized as negative or positive were considered informative. Viral load results that could be categorized as detectable or not detectable were considered informative. Genotype results that could be assigned to a known genotype were considered informative. CCR: HCV patients were categorized for the validation analysis based on (1) HCV infection status and (2) chronic HCV infection status. For HCV infection, patients were categorized as positive (P) if they had a positive result for an antibody or RIBA test, a detectable viral load at any time, or an identifiable genotype at any time; indeterminate (I) if none of the available results was informative; untested (U) if the patient had no HCV tests; or negative (N) if the patient had only negative results for any antibody or RIBA tests or undetectable viral loads.
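The categorization rules above can be sketched in code. This is a minimal illustration only; the function name, input representation, and the assumption that raw results have already been normalized are ours, not part of the CCR software:

```python
def categorize_hcv_infection(results):
    """Categorize a patient's HCV infection status (P/I/U/N) from a
    list of (test, value) pairs, following the rules described above.

    test:  "antibody", "riba", "viral_load", or "genotype"
    value: a normalized informative value -- "positive"/"negative" for
           antibody and RIBA, "detectable"/"not detectable" for viral
           load, an identified genotype string (e.g., "1a") -- or None
           when the raw result could not be categorized.
    """
    if not results:
        return "U"  # untested: no HCV tests on record
    informative = [(t, v) for t, v in results if v is not None]
    if not informative:
        return "I"  # indeterminate: results exist but none are informative
    for test, value in informative:
        # Positive: any positive antibody/RIBA result, a detectable
        # viral load at any time, or an identifiable genotype.
        if value in ("positive", "detectable") or test == "genotype":
            return "P"
    return "N"  # only negative or undetectable informative results
```

Under these rules, for example, a patient with a negative antibody test but a detectable viral load at any time is still categorized as positive.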

Among patients in the CCR: HCV and in care in the 10 years from 1998–2007, 292,205 (91.9%) have laboratory evidence of HCV infection (Table 3). Of the 25,793 patients who do not have laboratory evidence of HCV infection (laboratory status of I, N, or U), only 544 had prescription evidence of HCV infection. Overall, of the 317,998 patients in the CCR: HCV in care in the 10 years from 1998–2007, 292,749 (92.1%) have laboratory or prescription evidence of past or present HCV infection.

Table 3. Laboratory and Prescription Medication Validation of Patients in the CCR: HCV From 1998–2007 and in 2007 Only, for Any Infection and Chronic Infection

Among patients in the CCR: HCV and in care in the 10 years from 1998–2007, 190,608 (59.9%) have laboratory evidence of chronic HCV infection. Of the remaining 127,390 patients, only 1,434 had received VA prescriptions for HCV antiviral medication. As with HIV, patients who were tested outside the VA may receive HCV antiviral medication without laboratory confirmation in the national database. In addition, patients with a laboratory status of N who receive HCV antiviral medications may have started HCV antiviral medication outside VA and achieved an undetectable HCV viral load before receiving such medication from VA. Overall, for the cohort in the CCR: HCV in care in the past 10 years, 60.4% have laboratory or prescription evidence of chronic HCV infection.

Recent Additions

In any approach to patient addition, patients may be entered into a local registry in error for a variety of reasons. In the CCR: HIV, the change from the manual mode of the ICR to the automatic pending with manual confirmation mode of the CCR arguably would be expected to identify additional cases. Since both approaches rely on manual review for addition to the registry, the change from the ICR to the CCR would not necessarily be expected to reduce the number of uninfected patients added to the registry in error. Indeed, the rate of inclusion of uninfected patients among patients recently confirmed into the CCR: HIV is almost identical to the rate for all patients on the CCR: HIV in care in the last 10 years, some of whom were added to the CCR: HIV because of their inclusion in the ICR. In 2007, 2,860 unique patients were added to the national CCR: HIV; 93.7% of these had laboratory or medication validation of HIV infection which is comparable to the 10-year cohort rate of 94.2%. Additionally, 0.7% of those added to the national CCR in 2007 have a laboratory status of N and definitively appear to be in the registry in error, almost identical to the definitive error rate of 0.8% in the 10-year cohort.

The change from automatic addition in the HCCR to the CCR mode of automatic pending with manual confirmation would be expected to reduce the rate of inclusion of uninfected patients. We have evidence that this is the case. In 2007, 25,561 unique patients were added to the national CCR: HCV; of these, 78.4% had laboratory or medication evidence of chronic HCV infection, compared with 60.4% for the 10-year cohort.

Use of Local Reports

As described above, one of the main goals of the CCR was to provide clinicians with local reporting capabilities to foster local assessments of patient care. The local CCR was designed with 16 reports with user-defined parameters. As part of the nightly HL7 extract, we collect the number of reports that were run in each local registry. In 2007, an average of 2,204 reports were run each month in the local CCR: HIV and 639 reports each month in the local CCR: HCV. From anecdotal communication with clinicians, we believe that three reports are most frequently used. Several facilities run the Current Inpatient Report every day to identify registry patients who are inpatients. Several facilities run a Patient Medication History Report for each patient seen at a clinic visit, thereby obtaining a complete antiviral medication refill history, which provides objective information on a patient's medication adherence. Finally, the Combined Meds and Lab Report is run frequently by numerous facilities to answer a variety of questions about the clinical care or condition of the registry population.

National Reports

Facility, VISN, and national reports concerning the HIV and HCV populations have been made available on the VA intranet for individuals with access to content behind the VA firewall. Over 20 annual reports have been posted covering demographics, pharmacological treatment rates, and rates of selected comorbidities of the populations, as well as several of the quality measures proposed by the National Quality Forum. 7 We plan to make national-level data available on the Internet.


Discussion

In the era of migration of paper medical records to various electronic solutions, practitioners are becoming accustomed to managing a broad range of conditions and issues with electronic reminders, prompts, and reports. The vast majority of these tools provide patient-centric information, perhaps used most efficiently when incorporated into a medical visit. At the same time, the desire for evidence-based medicine and rising health care costs have prompted the development of population quality measures that relate to the process, costs, and outcomes of care. To permit comparisons of population quality measures across facilities within a healthcare system, data must be comparable and in a standardized format amenable to aggregation. Such aggregated data provide a population view in which process, cost, and outcomes can be measured and compared.

We present here one solution to providing a population measurement tool that meets a variety of local and national healthcare system objectives. The combination of computer-generated lists of potential cases with manual review for confirmation has proven to have a high degree of both sensitivity and specificity for identifying the target population. The local CCR has extensive capacity to run reports on a wide variety of usage and quality issues that can be customized by the end user, and local reports are being run as providers take advantage of this capacity. The national database allows data from all VA medical facilities to be aggregated to the unique patient level. Facility, VISN, and national results on agreed-upon measures are available on the VA intranet to permit comparisons of local, regional, and national performance. Finally, subsets of national CCR data have been provided to 25 researchers and used by CQM staff to address a range of questions in both HIV and HCV. 8–13

The CCR has proven useful in several additional ways beyond the original design objectives, particularly for rapidly addressing patient safety issues. For example, following an FDA alert about the contraindication of proton pump inhibitors and atazanavir, 14 CQM was able to quickly analyze national data to identify patients in the VA receiving the contraindicated combination and to provide individual facilities with lists of patients at their facility who might be receiving the contraindicated combination.

On a national level, the registry data provide information for other VA and non-VA entities to make informed decisions about HCV and HIV care. Aggregate data from the national CCR are used in VA cost modeling. Longitudinal data from the CCR regarding diagnoses and usage are used to anticipate increasing needs for specific types of care. Various ad hoc queries of national data are conducted to meet requests from the VA's pharmacy benefits management group, HIV and HCV advisory panels, the Public Health Strategic Healthcare Group (the parent of CQM), and other regional and national advisory and policy offices in VA. The data also assist other Federal agencies, including the Department of Health and Human Services, to which CCR: HIV data in aggregated, nonidentifiable format are provided to assist in measuring met and unmet community needs.

Other healthcare organizations have implemented varying components of CCR-like functionality, depending on the existence and extent of a local EMR. In general, very few healthcare systems have an EMR as robust as VA's. 15 In a recent survey, only 1.5% of non-VA United States hospitals had a comprehensive EMR; including VA hospitals doubled the number of such hospitals. 16 For facilities with no or limited EMRs, available population management tools generally rely on databases separate from the EMR. 17–20 Initial case finding for such registries generally relies on ICD-9 codes from billing systems 18 or manual identification by healthcare providers. 19,20 Such systems tend to accrue data through manual data entry, given the limitations of the EMR, which forces data standardization. 18–20 For sites with EMR functionality that includes electronic laboratory results, applications have been designed to identify cases of reportable diseases using ICD-9 codes and laboratory results and then transmit relevant clinical, laboratory, and demographic details to the local health authority. 21–23 These reportable diseases are generally acute infections that require laboratory confirmation from the reporting facility, so these applications have been able to use case-identification algorithms that do not require local confirmation. These systems, however, provide no local reporting tools for health care providers and do not construct a central database with complete medical information on the identified cases. Intermountain Healthcare (IHC) is one of the few healthcare systems with a comprehensive EMR system comparable to VA's. 24 The IHC has created a centralized patient database, known as the central data repository, to make a patient's entire IHC medical record available to clinicians at the point of care. 25 Population-based reports can be generated from the inclusive central data repository. The IHC did not allow the local data-dictionary flexibility that has slowed VA efforts to construct a similar central data repository.

Lessons Learned

In our experience, the combination of computer-generated lists of potential cases with manual confirmation has proven to be an excellent way to generate comprehensive yet accurate registry lists. Manual confirmation is particularly necessary for any registry lists created through the use of diagnosis codes because of errors in diagnosis coding. However, manual confirmation requires local staff trained to perform the review. Given the inevitable turnover in local staff, ongoing training is necessary; we provide monthly support calls, web-based trainings, and frequent personal assistance via telephone or e-mail. In addition, CQM monitors the number of pending patients; if the list of pending patients at a facility becomes quite large, CQM can assist local personnel with the review. Transmission of additional information about pending patients to the national database would be useful to provide oversight of pending-patient processing and may be added in future enhancements. Similarly, transmission of additional information about which local reports were run—not just the total number of reports—would be useful to guide future report development.

Since the CCR software rests on an ever-changing local EMR, the software requires continual updates and maintenance. The CCR extracts data from a variety of clinical packages and any change in those packages potentially impacts the CCR data extraction. Thus, critical data elements have to be monitored repeatedly for missing data. Fortunately, the CCR is now generally recognized within VA as a user of data from multiple clinical packages. As such, CQM staff work with the developers of many of the VistA packages to proactively adapt CCR data extraction in anticipation of VistA package changes. In addition, as new laboratory tests and diagnosis codes relevant to the target conditions are developed, the software must be updated to add new LOINC codes and new diagnosis codes to the search for pending patients.


Conclusion

The CCR provides a major extension to the patient-centric EMR to enable both local and national views of patient populations. Local clinicians can use these tools to manage their local populations. National data can be used to target quality and safety issues at the local level and inform local providers. The CCR provides an extensible platform for a range of registries beyond the current HIV and HCV registries. The software could easily accommodate the creation of registries for any clinical condition that can be identified through electronically available and well-defined data elements.


1. Department of Veterans Affairs. Fact Sheet, 2008. Accessed Jan 20, 2009.
2. Brown SH, Lincoln MJ, Groen PJ, Kolodner RM. VistA—US Department of Veterans Affairs national-scale HIS Int J Med Inform 2003;69(2–3):135-156. [PubMed]
3. Backus L, Mole L, Chang S, Deyton L. The Immunology Case Registry J Clin Epidemiol 2001;54(Suppl 1):S12-S15. [PubMed]
4. Armstrong GL, Wasley A, Simard EP, et al. The prevalence of hepatitis C virus infection in the United States, 1999 through 2002 Ann Intern Med 2006;144(10):705-714. [PubMed]
5. Manns MP, McHutchison JG, Gordon SC, et al. Peginterferon alfa-2b plus ribavirin compared with interferon alfa-2b plus ribavirin for initial treatment of chronic hepatitis C: A randomised trial Lancet 2001;358(9286):958-965. [PubMed]
6. Fried MW, Shiffman ML, Reddy KR, et al. Peginterferon alfa-2a plus ribavirin for chronic hepatitis C virus infection N Engl J Med 2002;347(13):975-982. [PubMed]
7. National Quality Forum. Draft Report: National Voluntary Consensus Standards for Clinician-Level Infectious Disease. Accessed Feb 2, 2009.
8. Backus LI, Boothroyd DB, Phillips BR, Mole LA. Pretreatment assessment and predictors of hepatitis C virus treatment in US veterans coinfected with HIV and hepatitis C virus J Viral Hepat 2006;13(12):799-810. [PubMed]
9. Backus LI, Boothroyd DB, Phillips BR, Mole LA. Predictors of response of US veterans to treatment for the hepatitis C virus J Hepatol 2007;46(1):37-47. [PubMed]
10. Backus LI, Phillips BR, Boothroyd DB, et al. Effects of hepatitis C virus coinfection on survival in veterans with HIV treated with highly active antiretroviral therapy J Acquir Immune Defic Syndr 2005;39(5):613-619. [PubMed]
11. Choi AI, Rodriguez RA, Bacchetti P, et al. Low rates of antiretroviral therapy among HIV-infected patients with chronic kidney disease Clin Infect Dis 2007;45(12):1633-1639. [PubMed]
12. Belperio PS, Mole LA, Halloran J, et al. Postmarketing use of enfuvirtide in veterans: Provider compliance with criteria for use, overall efficacy, and tolerability Ann Pharmacother 2008;42(11):1573-1580. [PubMed]
13. Belperio PS, Mole LA, Boothroyd DB, Backus LI. Provider prescribing of 4 antiretroviral agents after implementation of drug use guidelines in the Department of Veterans Affairs J Manag Care Pharm 2009;15(4):323-334. [PubMed]
14. Atazanavir [package insert]. 2009. Accessed Feb 10, 2009.
15. Hurdle JF. Can the electronic medical record improve geriatric care? Geriatric Times. Vol. 2. Mar-Apr 2004[Special Report].
16. Jha AK, DesRoches CM, Campbell EG, et al. Use of electronic health records in U.S. hospitals N Engl J Med 2009;360(16):1628-1638. [PubMed]
17. CDEMS User Network. Accessed July 13, 2009.
18. Pollard C, Bailey KA, Petitte T, et al. Electronic patient registries improve diabetes care and clinical outcomes in rural community health centers J Rural Health 2009;25(1):77-84. [PubMed]
19. Dostal C, Pavelka K, Zvarova J, Hanzlicek P, Olejarova M. Some principles of the development of a clinical database/national register of selected inflammatory rheumatic diseases in the Czech Republic Int J Med Inform 2006;75(3–4):216-223. [PubMed]
20. Zink A, Listing J, Klindworth C, Zeidler H. The national database of the German Collaborative Arthritis Centres. I. Structure, aims, and patients. Ann Rheum Dis 2001;60(3):199-206. [PMC free article] [PubMed]
21. Effler P, Ching-Lee M, Bogard A, et al. Statewide system of electronic notifiable disease reporting from clinical laboratories: Comparing automated reporting with conventional methods J Am Med Assoc 1999;282(19):1845-1850. [PubMed]
22. Overhage JM, Grannis S, McDonald CJ. A comparison of the completeness and timeliness of automated electronic laboratory reporting and spontaneous reporting of notifiable conditions Am J Pub Health 2008;98(2):344-350. [PubMed]
23. Lazarus R, Klompas M, Campion FX, et al. Electronic Support for Public Health: Validated case finding and reporting for notifiable diseases using electronic medical data J Am Med Inform Assoc 2009;16(1):18-24. [PMC free article] [PubMed]
24. Kuperman GJ, Gardner RM, Pryor TA. HELP: A Dynamic Hospital Information System. New York: Springer-Verlag; 1991.
25. Clayton PD, Narus SP, Huff SM, et al. Building a comprehensive clinical information system from components. The approach at Intermountain Health Care. Methods Inf Med 2003;42(1):1-7. [PubMed]
