
Results 1-2 (2)

1.  Correction: Peer chart audits: A tool to meet Accreditation Council on Graduate Medical Education (ACGME) competency in practice-based learning and improvement
BACKGROUND: The Accreditation Council on Graduate Medical Education (ACGME) supports chart audit as a method to track competency in Practice-Based Learning and Improvement. We examined whether peer chart audits performed by internal medicine residents were associated with improved documentation of foot care in patients with diabetes mellitus.
METHODS: A retrospective electronic chart review was performed on 347 patients with diabetes mellitus cared for by internal medicine residents in a university-based continuity clinic from May 2003 to September 2004. Residents abstracted information pertaining to documentation of foot examinations (neurological, vascular, and skin) from the charts of patients followed by their physician peers. No formal feedback or education was provided.
RESULTS: Significant improvement in the documentation of foot exams was observed over the course of the study. The percentage of patients receiving neurological, vascular, and skin exams increased by 20% (from 13% to 33%) (p = 0.001), 26% (from 45% to 71%) (p < 0.001), and 18% (from 51% to 72%) (p = 0.005), respectively. Similarly, the proportion of patients receiving a well-documented exam that included all three components – neurological, vascular, and skin – increased over time (from 6% to 24%, p < 0.001).
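The before/after comparisons above are standard two-proportion tests. A minimal sketch of how such a p-value is computed, assuming hypothetical equal group sizes of 100 charts per period (the abstract does not report per-period denominators):

```python
# Sketch: two-proportion z-test for the reported rise in documented
# neurological exams (13% -> 33%). The per-period chart counts are not
# given in the abstract, so n = 100 per group is assumed purely for
# illustration.
import math

def two_proportion_z(x1: int, n1: int, x2: int, n2: int):
    """Return (z, two-sided p) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                      # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(13, 100, 33, 100)  # hypothetical counts
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these assumed denominators the test yields a p-value on the order of the reported 0.001, but the actual value depends on the real per-period chart counts.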
CONCLUSION: Peer chart audits performed by residents in the absence of formal feedback were associated with improved documentation of the foot exam in patients with diabetes mellitus. Although this study suggests that peer chart audits may be an effective tool to improve practice-based learning and documentation of foot care in diabetic patients, evaluating the actual performance of clinical care was beyond the scope of this study and would be better addressed by a randomized controlled trial.
PMCID: PMC1959518  PMID: 17662124
2.  Reporting and Concordance of Methodologic Criteria Between Abstracts and Articles in Diagnostic Test Studies 
OBJECTIVE: To evaluate the quality and concordance of methodologic criteria in abstracts versus articles regarding the diagnosis of trichomoniasis.
DESIGN: Survey of published literature.
DATA SOURCES: Studies indexed in MEDLINE (1976–1998).
STUDY SELECTION: Studies that used culture as the gold or reference standard.
DATA EXTRACTION: Data from abstracts and articles were independently abstracted using 4 methodologic criteria: (1) prospective evaluation of consecutive patients; (2) test results did not influence the decision to perform the gold standard; (3) independent and blind comparison with the gold standard; and (4) broad spectrum of patients used. The total number of criteria met by each report was calculated to create a quality score (0–4).
MAIN RESULTS: None of the 33 abstracts or full articles reported all 4 criteria. Three criteria were reported in none of the abstracts and in 18% of articles (95% confidence interval [95% CI], 8.6% to 34%). Two criteria were reported in 18% of abstracts (95% CI, 8.6% to 34%) and 42% of articles (95% CI, 27% to 59%). One criterion was reported in 42% of abstracts (95% CI, 27% to 59%) and 27% of articles (95% CI, 15% to 44%). No criteria were reported in 13 (39%) of 33 abstracts (95% CI, 25% to 56%) and 4 (12%) of 33 articles (95% CI, 4.8% to 27%). The agreement of the criteria between the abstract and the article ranged from poor (κ = −0.09; 95% CI, −0.18 to 0) to moderate (κ = 0.53; 95% CI, 0.22 to 0.83).
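The κ values quoted above are Cohen's kappa, a chance-corrected measure of agreement between two raters – here, whether the abstract and the article each reported a given criterion. A minimal sketch with invented 0/1 reporting vectors (the per-study data are not given in the abstract):

```python
# Sketch: Cohen's kappa for abstract-vs-article agreement on whether a
# methodologic criterion was reported (1 = reported, 0 = not reported).
# The vectors below are invented for illustration only.
from collections import Counter

def cohens_kappa(a: list, b: list) -> float:
    """Chance-corrected agreement between two raters over the same items."""
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n       # raw agreement
    counts_a, counts_b = Counter(a), Counter(b)
    # Agreement expected by chance, from each rater's marginal frequencies.
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / n**2
    return (observed - expected) / (1 - expected)

print(cohens_kappa([1, 1, 0, 0], [1, 1, 0, 0]))  # perfect agreement -> 1.0
print(cohens_kappa([1, 0, 1, 0], [1, 1, 0, 0]))  # chance-level agreement -> 0.0
```

A κ near 0 (as in the poor-agreement criterion above) means the abstract and article agreed no more often than chance would predict.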
CONCLUSIONS: Information on methods basic to study validity is often absent from both the abstract and the full article. The concordance of such criteria between abstract and article needs to improve.
PMCID: PMC1495348  PMID: 10718899
evidence-based medicine; periodicals; publishing; quality control; sensitivity and specificity; diagnosis