For his October 2008 article, Hendrix collected and analyzed a remarkable amount of data regarding medical school publications and funding. Spot-checking several institutions' article totals suggests that meeting abstracts were included in the article counts; in Table 1, abstracts accounted for 13%–25% of a school's article total. Should abstracts be considered “journal articles” when they typically lack such elements as datasets, statistical analyses, figures, and references, and have very low citation rates?
Identifying institutional affiliation at the “school” level is not straightforward. Lengthy medical school names are likely to be abbreviated or condensed by authors in ways not identifiable using the Web of Science preferred abbreviation or the search term “Med.” For example, on April 9, 2009, limiting to the year 2007 in the Science and Social Sciences Citation Indexes (omitting meeting abstracts), I retrieved 3,432 articles from the University of North Carolina–Chapel Hill whose addresses lacked a “Med*” term. Of those articles, 927, or 27%, were from the school of medicine, and hours of tedious inspection were required to identify them. Use of multidisciplinary center names as addresses can obscure school affiliation, so consulting an institutional directory may be necessary to identify an author's department and school.
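The pitfall described above can be illustrated with a small sketch. The address strings below are invented examples in the general style of Web of Science address fields, not actual records; the filter mimics a simple “Med*” term match and shows how a center-named address slips past it even when the authors belong to the school of medicine.

```python
# Hypothetical illustration: invented address strings, not real
# Web of Science records.
addresses = [
    "Univ N Carolina, Sch Med, Dept Pharmacol, Chapel Hill, NC USA",
    "Univ N Carolina, Lineberger Comprehens Canc Ctr, Chapel Hill, NC USA",
    "Univ N Carolina, Dept Epidemiol, Chapel Hill, NC USA",
]

def lacks_med_term(address: str) -> bool:
    """True when no word in the address field begins with 'Med'."""
    words = address.replace(",", " ").split()
    return not any(word.lower().startswith("med") for word in words)

# Addresses that a "Med*" search would miss. The cancer-center and
# epidemiology addresses pass the filter even though their authors
# might hold school-of-medicine appointments -- exactly the records
# that required manual inspection.
missed = [a for a in addresses if lacks_med_term(a)]
```

Only the first address would be captured by the “Med*” term; the other two would have to be resolved by hand against an institutional directory.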
If bibliometric data are to be used in our biomedical “audit society” to assess faculty and institutional performance, then librarians should be prepared to assist administrators in recognizing the limitations and pitfalls of citation studies. Weingart is correct in warning that bibliometrics requires professionals who can “deal with the raw data.”