1.  Historical Review of Cancer Risks in Medical Radiation Workers 
Radiation research  2010;174(6):793-808.
Epidemiologic studies of medical radiation workers have found excess risks of leukemia, skin and female breast cancer in those employed before 1950, but little consistent evidence of cancer risk increases subsequently. Occupational radiation-related dose-response, risk estimates for recent years, and lifetime cancer risk data are limited for radiologists and radiologic technologists and lacking for physicians and technologists performing or assisting with fluoroscopically-guided procedures. Based on data from 80 mostly small studies of cardiologists and substantially fewer studies of physicians in other specialties, estimated effective doses to physicians per interventional procedure vary by more than an order of magnitude. There is an urgent need to expand the limited base of information on average annual occupational radiation exposures and time-trends in doses received by medical radiation workers, to assess lifetime cancer risks of radiologists and radiologic technologists in the existing cohorts, and to initiate long-term follow-up studies of cancer and other radiation-associated disease risks in physicians and technologists performing or assisting with interventional procedures. Such studies will help to optimize standardized protocols for radiologic procedures, determine if current radiation protection measures are adequate, provide guidance on cancer screening needs, and yield valuable insights on cancer risks associated with chronic radiation exposure.
doi:10.1667/RR2014.1
PMCID: PMC4098897  PMID: 21128805
radiologists; interventional radiologists; radiologic technologists; interventional cardiologists; neoplasms; reviews
2.  Incidence of haematopoietic malignancies in US radiologic technologists 
Background: There are limited data on risks of haematopoietic malignancies associated with protracted low-to-moderate dose radiation.
Aims: To contribute the first incidence risk estimates for haematopoietic malignancies in relation to work history, procedures, practices, and protective measures in a large population of mostly female medical radiation workers.
Methods: The investigators followed up 71 894 (77.9% female) US radiologic technologists, first certified during 1926–80, from completion of a baseline questionnaire (1983–89) to return of a second questionnaire (1994–98), diagnosis of a first cancer, death, or 31 August 1998 (731 306 person-years), whichever occurred first. Cox proportional hazards regression was used to compute risks.
Results: Relative risks (RR) for leukaemias other than chronic lymphocytic leukaemia (non-CLL, 41 cases) were increased among technologists working five or more years before 1950 (RR = 6.6, 95% CI 1.0 to 41.9, based on seven cases) or holding patients 50 or more times for x ray examination (RR = 2.6, 95% CI 1.3 to 5.4). Risks of non-CLL leukaemias were not significantly related to the number of years subjects worked in more recent periods, the year or age first worked, the total years worked, specific procedures or equipment used, or personal radiotherapy. Working as a radiologic technologist was not significantly linked with risk of multiple myeloma (28 cases), non-Hodgkin's lymphoma (118 cases), Hodgkin's lymphoma (31 cases), or chronic lymphocytic leukaemia (23 cases).
Conclusion: Similar to results for single acute dose and fractionated high dose radiation exposures, there was increased risk for non-CLL leukaemias decades after initial protracted radiation exposure that likely accumulated to low-to-moderate doses.
doi:10.1136/oem.2005.020826
PMCID: PMC1740936  PMID: 16299095
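The relative risks and confidence intervals quoted above come from a person-years cohort analysis; the wide interval (1.0 to 41.9) reflects only seven exposed cases. A minimal sketch of how such estimates behave, using a rate ratio with a log-scale normal-approximation CI; the counts and person-years below are hypothetical, not the study's data:

```python
import math

def rate_ratio_ci(cases_exposed, py_exposed, cases_unexposed, py_unexposed, z=1.96):
    """Rate ratio of two Poisson rates with an approximate CI on the log scale.

    SE(ln RR) ~ sqrt(1/a + 1/c), where a and c are the two case counts;
    with few cases the interval becomes very wide, as in the abstract above.
    """
    rr = (cases_exposed / py_exposed) / (cases_unexposed / py_unexposed)
    se_log = math.sqrt(1.0 / cases_exposed + 1.0 / cases_unexposed)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# Hypothetical counts, not the cohort's data:
rr, lo, hi = rate_ratio_ci(7, 5000, 34, 120000)
```

With only 7 exposed cases the 1/a term dominates the standard error, so the CI spans several-fold in each direction even for a large cohort.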
3.  Surgeons' Exposure to Radiation in Single- and Multi-Level Minimally Invasive Transforaminal Lumbar Interbody Fusion; A Prospective Study 
PLoS ONE  2014;9(4):e95233.
Although minimally invasive transforaminal lumbar interbody fusion (MIS-TLIF) has been widely adopted for patients with lumbar disease, surgeons risk exposure to fluoroscopic radiation. To date, however, no studies have quantified the effective dose during the MIS-TLIF procedure, and the radiation dose distribution is still unclear. In this study, the surgeons' radiation doses at 5 body sites were measured and the effective doses were assessed during 31 consecutive 1- to 3-level MIS-TLIF surgeries. The operating surgeon, assisting surgeon, and radiological technologist wore thermoluminescent dosimeters on the unshielded thyroid, chest, genitals, and right middle finger, and on the chest beneath a lead apron. The doses at the lens and the effective doses were also calculated. Mean fluoroscopy times were 38.7, 53.1, and 58.5 seconds for 1, 2, and 3 fusion levels, respectively. The operating surgeon's mean exposures at the lens, thyroid, chest, genitals, finger, and the chest beneath the shield, respectively, were 0.07, 0.07, 0.09, 0.14, 0.32, and 0.05 mSv in 1-level MIS-TLIF; 0.07, 0.08, 0.09, 0.18, 0.34, and 0.05 mSv in 2-level; 0.08, 0.09, 0.14, 0.15, 0.36, and 0.06 mSv in 3-level; and 0.07, 0.08, 0.10, 0.15, 0.33, and 0.05 mSv in all cases. The mean dose at the operating surgeon's right finger was significantly higher than at the other measured sites (P<0.001). The operating surgeon's effective doses (0.06, 0.06, and 0.07 mSv for 1, 2, and 3 fusion levels) were low and did not differ significantly from those of the assisting surgeon or radiological technologist. Revision MIS-TLIF was not associated with higher surgeon radiation doses than primary MIS-TLIF. Surgeon radiation doses were significantly higher for overweight than for normal-weight patients. The surgeons' radiation exposure during MIS-TLIF was within the safe limits set by the International Commission on Radiological Protection's guidelines.
The accumulated radiation exposure, especially to surgeon's hands, should be carefully monitored.
doi:10.1371/journal.pone.0095233
PMCID: PMC3988176  PMID: 24736321
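The effective doses reported above are, in the ICRP scheme, weighted sums of tissue equivalent doses, E = Σ_T w_T·H_T. A minimal sketch of that weighting, with illustrative tissue weights and doses (not the paper's values or its full weighting set):

```python
def effective_dose(equivalent_doses_mSv, tissue_weights):
    """Effective dose as the tissue-weighted sum E = sum_T w_T * H_T (mSv).

    equivalent_doses_mSv maps tissue name -> equivalent dose H_T;
    tissue_weights maps tissue name -> weighting factor w_T.
    Tissues with no measured dose simply contribute nothing here.
    """
    missing = set(equivalent_doses_mSv) - set(tissue_weights)
    if missing:
        raise ValueError(f"no weighting factor for: {missing}")
    return sum(tissue_weights[t] * h for t, h in equivalent_doses_mSv.items())

# Illustrative values only (a real calculation uses the complete ICRP table):
weights = {"thyroid": 0.04, "gonads": 0.08, "remainder": 0.12}
doses = {"thyroid": 0.07, "gonads": 0.14}
e = effective_dose(doses, weights)  # 0.04*0.07 + 0.08*0.14 = 0.014 mSv
```

This is why the whole-body effective doses in the abstract (about 0.06 mSv) are far lower than the unshielded finger doses: each site's dose enters only in proportion to its small weighting factor.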
4.  Survey of terminology used for the intraoperative direction of C-arm fluoroscopy 
Canadian Journal of Surgery  2013;56(2):109-112.
Background
Orthopedic surgeons depend on the intraoperative use of fluoroscopy to facilitate procedures across all subspecialties. The versatility of the C-arm fluoroscope allows acquisition of nearly any radiographic view. This versatility, however, creates the opportunity for difficulty in communication between surgeon and radiation technologist. Poor communication leads to delays, frustration and increased exposure to ionizing radiation. There is currently no standard terminology employed by surgeons and technologists with regard to the direction of the fluoroscope.
Methods
The investigation consisted of a web-based survey in 2 parts. Part 1 was administered to the membership of the Canadian Orthopedic Association, part 2 to the membership of the Canadian Association of Medical Radiation Technologists. The survey consisted of open-ended or multiple-choice questions examining experience with the C-arm fluoroscope and the terminology preferred by both orthopedic surgeons and radiation technologists.
Results
The survey revealed tremendous inconsistency in language used by orthopedic surgeons and radiation technologists. It also revealed that many radiation technologists were inexperienced in operating the fluoroscope.
Conclusion
Adoption of a common language has been demonstrated to increase efficiency in performing defined tasks with the fluoroscope. We offer a potential system to facilitate communication based on current terminology used among Canadian orthopedic surgeons and radiation technologists.
doi:10.1503/cjs.015311
PMCID: PMC3617115  PMID: 23351496
5.  Radiation awareness among radiology residents, technologists, fellows and staff: where do we stand? 
Insights into Imaging  2014;6(1):133-139.
Objectives
To investigate and compare the knowledge of radiation dose and risk incurred in common radiology examinations among radiology residents, fellows, staff radiologists and technologists.
Methods
A questionnaire containing 17 multiple choice questions was administered to all residents, technologists, fellows and staff radiologists of the department of medical imaging through the hospital group mailing list.
Results
A total of 92 responses were received. The mean score was 8.5 out of 17. Only 48% of all participants scored more than 50% correct answers. Only 23% were aware of the dose from both single-view and two-view chest X-rays; 50–70% underestimated the dose from common studies; 50–75% underestimated the risk of fatal cancer. Awareness of radiation exposure in pregnancy was variable and particularly poor among technologists. A statistically significant comparative knowledge gap was found among technologists.
Conclusions
Our results show a variable level of knowledge about radiation dose and risk among radiology residents, fellows, staff radiologists and technologists, but overall knowledge is inadequate in all groups. There is significant underestimation of dosage and cancer risk from common examinations, which could potentially lead to suboptimal risk assessment and excessive or unwarranted studies posing significant radiation hazard to the patient and radiology workers.
Main Messages
• Knowledge of radiation dose and risk is poor among all radiology workers.
• Significant knowledge gap among technologists compared to residents, fellows and staff radiologists.
• Significant underestimation of radiation dose and cancer risk from common examinations.
doi:10.1007/s13244-014-0365-x
PMCID: PMC4330233  PMID: 25412827
Radiation dose; Radiation risk; Residents; Technologists; Cancer risk; Questionnaire
6.  Screening Mammography for Women Aged 40 to 49 Years at Average Risk for Breast Cancer 
Executive Summary
Objective
The aim of this review was to determine the effectiveness of screening mammography in women aged 40 to 49 years at average risk for breast cancer.
Clinical Need
The effectiveness of screening mammography in women aged over 50 years has been established, yet the issue of screening in women aged 40 to 49 years is still unsettled. The Canadian Task Force of Preventive Services, which sets guidelines for screening mammography for all provinces, supports neither the inclusion nor the exclusion of this screening procedure for 40- to 49-year-old women from the periodic health examination. In addition to this, 2 separate reviews, one conducted in Quebec in 2005 and the other in Alberta in 2000, each concluded that there is an absence of convincing evidence on the effectiveness of screening mammography for women in this age group who are at average risk for breast cancer.
In the United States, there is disagreement among organizations on whether population-based mammography should begin at the age of 40 or 50 years. The National Institutes of Health, the American Association for Cancer Research, and the American Academy of Family Physicians recommend against screening women in their 40s, whereas the United States Preventive Services Task Force, the National Cancer Institute, the American Cancer Society, the American College of Radiology, and the American College of Obstetricians and Gynecologists recommend screening mammograms for women aged 40 to 49 years. Furthermore, in comparing screening guidelines between Canada and the United States, it is also important to recognize that “standard care” within a socialized medical system such as Canada’s differs from that of the United States. The National Breast Screening Study (NBSS-1), a randomized screening trial conducted in multiple centres across Canada, has shown there is no benefit in mortality from breast cancer from annual mammograms in women randomized between the ages of 40 and 49, relative to standard care (i.e., physical exam and teaching of breast self-examination on entry to the study, with usual community care thereafter).
At present, organized screening programs in Canada systematically screen women starting at 50 years of age, although with a physician’s referral, a screening mammogram is an insured service in Ontario for women under 50 years of age.
International estimates of the epidemiology of breast cancer show that the incidence of breast cancer is increasing for all ages combined, whereas mortality is decreasing, though at a slower rate. These decreasing mortality rates may be attributed to screening and advances in breast cancer therapy over time. Decreases in mortality attributable to screening may be a result of the earlier detection and treatment of invasive cancers, in addition to the increased detection of ductal carcinoma in situ (DCIS), of which certain subpathologies are less lethal. Evidence from the SEER cancer registry in the United States indicates that the age-adjusted incidence of DCIS has increased almost 10-fold over a 20-year period (from 2.7 to 25 per 100,000).
The incidence of breast cancer is lower in women aged 40 to 49 years than in women aged 50 to 69 years (about 140 per 100,000 versus 500 per 100,000 women, respectively), as are the sensitivity (about 75% versus 85% for women aged under and over 50, respectively) and specificity of mammography (about 80% versus 90% for women aged under and over 50, respectively). The increased density of breast tissue in younger women is mainly responsible for the lower accuracy of this procedure in this age group. In addition, because breast cancers that occur before the age of 50 are more likely to be associated with a genetic predisposition than those diagnosed after the age of 50, mammography may not be an optimal screening method for younger women.
Treatment options vary with the stage of disease (based on tumor size, involvement of surrounding tissue, and number of affected axillary lymph nodes) and its pathology, and may include a combination of surgery, chemotherapy, and/or radiotherapy.
Surgery is the first-line intervention for biopsy confirmed tumours. The subsequent use of radiation, chemotherapy, or hormonal treatments is dependent on the histopathologic characteristics of the tumor and the type of surgery. There is controversy regarding the optimal treatment of DCIS, which is noninvasive.
With such controversy as to the effectiveness of mammography and the potential risk associated with women being overtreated or actual cancers being missed, and the increased risk of breast cancer associated with exposure to annual mammograms over a 10-year period, the Ontario Health Technology Advisory Committee requested this review of screening mammography in women aged 40 to 49 years at average risk for breast cancer. This review is the first of 2 parts and concentrates on the effectiveness of screening mammography (i.e., film mammography, FM) for women at average risk aged 40 to 49 years. The second part will be an evaluation of screening by either magnetic resonance imaging or digital mammography, with the objective of determining the optimal screening modality in these younger women.
Review Strategy
The following questions were asked:
Does screening mammography for women aged 40 to 49 years who are at average risk for breast cancer reduce breast cancer mortality?
What is the sensitivity and specificity of mammography for this age group?
What are the risks associated with annual screening from ages 40 to 49?
What are the risks associated with false positive and false negative mammography results?
What are the economic considerations if evidence for effectiveness is established?
The Medical Advisory Secretariat followed its standard procedures and searched these electronic databases: Ovid MEDLINE, EMBASE, Ovid MEDLINE In-Process and Other Non-Indexed Citations, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews and the International Network of Agencies for Health Technology Assessment.
Keywords used in the search were breast cancer, breast neoplasms, mass screening, and mammography.
In total, the search yielded 6,359 articles specific to breast cancer screening and mammography. This did not include reports on diagnostic mammograms. The search was further restricted to English-language randomized controlled trials (RCTs), systematic reviews, and meta-analyses published between 1995 and 2005. Excluded were case reports, comments, editorials, and letters, which narrowed the results to 516 articles and previous health technology policy assessments.
These were examined against the criteria outlined below. This resulted in the inclusion of 5 health technology assessments, the Canadian Preventive Services Task Force report, the United States Preventive Services Task Force report, 1 Cochrane review, and 8 RCTs.
Inclusion Criteria
English-language articles, and English and French-language health technology policy assessments, conducted by other organizations, from 1995 to 2005
Articles specific to RCTs of screening mammography of women at average risk for breast cancer that included results for women randomized to studies between the ages of 40 and 49 years
Studies in which women were randomized to screening with or without mammography, although women may have had clinical breast examinations and/or may have been conducting breast self-examination.
UK Age Trial results published in December 2006.
Exclusion Criteria
Observational studies, including those nested within RCTs
RCTs that do not include results on women between the ages of 40 and 49 at randomization
Studies in which mammography was compared with other radiologic screening modalities, for example, digital mammography, magnetic resonance imaging or ultrasound.
Studies in which women randomized had a personal history of breast cancer.
Intervention
Film mammography
Comparators
Within RCTs, the comparison group would have been women randomized to not undergo screening mammography, although they may have had clinical breast examinations and/or have been conducting breast self-examination.
Outcomes of Interest
Breast cancer mortality
Summary of Findings
There is Level 1 Canadian evidence that screening women between the ages of 40 and 49 years who are at average risk for breast cancer is not effective, and that the absence of a benefit is sustained over a maximum follow-up period of 16 years.
All remaining studies that reported on women aged under 50 years were based on subset analyses. They provide additional evidence that, when all these RCTs are taken into account, there is no significant reduction in breast cancer mortality associated with screening mammography in women aged 40 to 49 years.
Conclusions
There is Level 1 evidence that screening mammography in women aged 40 to 49 years at average risk for breast cancer is not effective in reducing mortality.
Moreover, risks associated with exposure to mammographic radiation, the increased risk of missed cancers due to lower mammographic sensitivity, and the psychological impact of false positives, are not inconsequential.
The UK Age Trial results published in December 2006 did not change these conclusions.
PMCID: PMC3377515  PMID: 23074501
7.  The Effect of Tobacco Control Measures during a Period of Rising Cardiovascular Disease Risk in India: A Mathematical Model of Myocardial Infarction and Stroke 
PLoS Medicine  2013;10(7):e1001480.
In this paper from Basu and colleagues, a simulation of tobacco control and pharmacological interventions to prevent cardiovascular disease mortality in India predicted that smoke-free laws and increased tobacco taxation are likely to be the most effective measures to avert future cardiovascular deaths in India.
Please see later in the article for the Editors' Summary
Background
We simulated tobacco control and pharmacological strategies for preventing cardiovascular deaths in India, the country that is expected to experience more cardiovascular deaths than any other over the next decade.
Methods and Findings
A microsimulation model was developed to quantify the differential effects of various tobacco control measures and pharmacological therapies on myocardial infarction and stroke deaths stratified by age, gender, and urban/rural status for 2013 to 2022. The model incorporated population-representative data from India on multiple risk factors that affect myocardial infarction and stroke mortality, including hypertension, hyperlipidemia, diabetes, coronary heart disease, and cerebrovascular disease. We also included data from India on cigarette smoking, bidi smoking, chewing tobacco, and secondhand smoke. According to the model's results, smoke-free legislation and tobacco taxation would likely be the most effective strategy among a menu of tobacco control strategies (including, as well, brief cessation advice by health care providers, mass media campaigns, and an advertising ban) for reducing myocardial infarction and stroke deaths over the next decade, while cessation advice would be expected to be the least effective strategy at the population level. In combination, these tobacco control interventions could avert 25% of myocardial infarctions and strokes (95% CI: 17%–34%) if the effects of the interventions are additive. These effects are substantially larger than would be achieved through aspirin, antihypertensive, and statin therapy under most scenarios, because of limited treatment access and adherence; nevertheless, the impacts of tobacco control policies and pharmacological interventions appear to be markedly synergistic, averting up to one-third of deaths from myocardial infarction and stroke among 20- to 79-y-olds over the next 10 y. Pharmacological therapies could also be considerably more potent with further health system improvements.
Conclusions
Smoke-free laws and substantially increased tobacco taxation appear to be markedly potent population measures to avert future cardiovascular deaths in India. Despite the rise in co-morbid cardiovascular disease risk factors like hyperlipidemia and hypertension in low- and middle-income countries, tobacco control is likely to remain a highly effective strategy to reduce cardiovascular deaths.
Editors' Summary
Background
Cardiovascular diseases (CVDs) are conditions that affect the heart and/or the circulation. In coronary heart disease, for example, narrowing of the heart's blood vessels by fatty deposits slows the blood supply to the heart and may eventually cause a heart attack (myocardial infarction). Stroke, by contrast, is a CVD in which the blood supply to the brain is interrupted. CVD has been a major cause of illness and death in high-income countries for many years, but the burden of CVD is now rapidly rising in low- and middle-income countries. Indeed, worldwide, three-quarters of all deaths from heart disease and stroke occur in low- and middle-income countries. Smoking, high blood pressure (hypertension), high blood cholesterol (hyperlipidemia), diabetes, obesity, and physical inactivity all increase an individual's risk of developing CVD. Prevention strategies and treatments for CVD include lifestyle changes (for example, smoking cessation) and taking drugs that lower blood pressure (antihypertensive drugs) or blood cholesterol levels (statins) or thin the blood (aspirin).
Why Was This Study Done?
Because tobacco use is a key risk factor for CVD and for several other noncommunicable diseases, the World Health Organization has developed an international instrument for tobacco control called the Framework Convention on Tobacco Control (FCTC). Parties to the FCTC (currently 176 countries) agree to implement a set of core tobacco control provisions including legislation to ban tobacco advertising and to increase tobacco taxes. But will tobacco control measures reduce the burden of CVD effectively in low- and middle-income countries as other risk factors for CVD are becoming more common? In this mathematical modeling study, the researchers investigated this question by simulating the effects of tobacco control measures and pharmacological strategies for preventing CVD on CVD deaths in India. Notably, many of the core FCTC provisions remain poorly implemented or unenforced in India even though it became a party to the convention in 2005. Moreover, experts predict that, over the next decade, this middle-income country will contribute more than any other nation to the global increase in CVD deaths.
What Did the Researchers Do and Find?
The researchers developed a microsimulation model (a computer model that operates at the level of individuals) to quantify the likely effects of various tobacco control measures and pharmacological therapies on deaths from myocardial infarction and stroke in India between 2013 and 2022. They incorporated population-representative data from India on risk factors that affect myocardial infarction and stroke mortality and on tobacco use and exposure to secondhand smoke into their model. They then simulated the effects of five tobacco control measures—smoke-free legislation, tobacco taxation, provision of brief cessation advice by health care providers, mass media campaigns, and advertising bans—and increased access to aspirin, antihypertensive drugs, and statins on deaths from myocardial infarction and stroke. Smoke-free legislation and tobacco taxation are likely to be the most effective strategies for reducing myocardial infarction and stroke deaths over the next decade, according to the model, and the effects of these strategies are likely to be substantially larger than those achieved by drug therapies under current health system conditions. If the effects of smoke-free legislation and tobacco taxation are additive, the model predicts that these two measures alone could avert about 9 million deaths, that is, a quarter of the expected deaths from myocardial infarction and stroke in India over the next 10 years, and that a combination of tobacco control policies and pharmacological interventions could avert up to a third of these deaths.
What Do These Findings Mean?
These findings suggest that the implementation of smoke-free laws and the introduction of increased tobacco taxes in India would yield substantial and rapid health benefits by averting future CVD deaths. The accuracy of these findings is likely to be affected by the many assumptions included in the mathematical model and by the quality of the data fed into it. Importantly, however, these findings suggest that, despite the rise in other CVD risk factors such as hypertension and hyperlipidemia, tobacco control is likely to be a highly effective strategy for the reduction of CVD deaths over the next decade in India and probably in other low- and middle-income countries. Policymakers in these countries should, therefore, work towards fuller and faster implementation of the core FCTC provisions to boost their efforts to reduce deaths from CVD.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001480.
The American Heart Association provides information on all aspects of cardiovascular disease; its website includes personal stories about heart attacks and stroke
The US Centers for Disease Control and Prevention has information on heart disease and on stroke (in English and Spanish)
The UK National Health Service Choices website provides information about cardiovascular disease and stroke
MedlinePlus provides links to other sources of information on heart diseases, vascular diseases, and stroke (in English and Spanish)
The World Health Organization provides information (in several languages) about the dangers of tobacco, about the Framework Convention on Tobacco Control, and about noncommunicable diseases; its Global Noncommunicable Disease Network (NCDnet) aims to help low- and middle-income countries reduce illness and death caused by CVD and other noncommunicable diseases
SmokeFree, a website provided by the UK National Health Service, offers advice on quitting smoking and includes personal stories from people who have stopped smoking
Smokefree.gov, supported by the US National Cancer Institute and other US agencies, offers online tools and resources to help people quit smoking
doi:10.1371/journal.pmed.1001480
PMCID: PMC3706364  PMID: 23874160
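The abstract above notes that the 25% averted fraction holds "if the effects of the interventions are additive". A small sketch contrasting that additive combination with the common multiplicative (independent-effects) alternative; the per-intervention averted fractions below are hypothetical, not the model's outputs:

```python
def averted_fraction_additive(effects):
    """Additive combination: fractions averted simply sum (capped at 100%)."""
    return min(sum(effects), 1.0)

def averted_fraction_multiplicative(effects):
    """Independent effects: each intervention removes a share of what remains."""
    remaining = 1.0
    for e in effects:
        remaining *= (1.0 - e)
    return 1.0 - remaining

# Hypothetical averted fractions for two interventions:
effects = [0.15, 0.10]
add = averted_fraction_additive(effects)        # 0.15 + 0.10 = 0.25
mult = averted_fraction_multiplicative(effects) # 1 - 0.85*0.90 = 0.235
```

The multiplicative combination is always no larger than the additive one, which is why the abstract flags additivity as an explicit assumption behind the 25% figure.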
8.  Red Blood Cell Transfusion and Mortality in Trauma Patients: Risk-Stratified Analysis of an Observational Study 
PLoS Medicine  2014;11(6):e1001664.
Using a large multicentre cohort, Pablo Perel and colleagues evaluate the association of red blood cell transfusion with mortality according to the predicted risk of death for trauma patients.
Please see later in the article for the Editors' Summary
Background
Haemorrhage is a common cause of death in trauma patients. Although transfusions are extensively used in the care of bleeding trauma patients, there is uncertainty about the balance of risks and benefits and how this balance depends on the baseline risk of death. Our objective was to evaluate the association of red blood cell (RBC) transfusion with mortality according to the predicted risk of death.
Methods and Findings
A secondary analysis of the CRASH-2 trial (which originally evaluated the effect of tranexamic acid on mortality in trauma patients) was conducted. The trial included 20,127 trauma patients with significant bleeding from 274 hospitals in 40 countries. We evaluated the association of RBC transfusion with mortality in four strata of predicted risk of death: <6%, 6%–20%, 21%–50%, and >50%. For this analysis the exposure considered was RBC transfusion, and the main outcome was death from all causes at 28 days. A total of 10,227 patients (50.8%) received at least one transfusion. We found strong evidence that the association of transfusion with all-cause mortality varied according to the predicted risk of death (p-value for interaction <0.0001). Transfusion was associated with an increase in all-cause mortality among patients with <6% and 6%–20% predicted risk of death (odds ratio [OR] 5.40, 95% CI 4.08–7.13, p<0.0001, and OR 2.31, 95% CI 1.96–2.73, p<0.0001, respectively), but with a decrease in all-cause mortality in patients with >50% predicted risk of death (OR 0.59, 95% CI 0.47–0.74, p<0.0001). Transfusion was associated with an increase in fatal and non-fatal vascular events (OR 2.58, 95% CI 2.05–3.24, p<0.0001). The risk associated with RBC transfusion was significantly increased for all the predicted risk of death categories, but the relative increase was higher for those with the lowest (<6%) predicted risk of death (p-value for interaction <0.0001). As this was an observational study, the results could have been affected by different types of confounding. In addition, we could not consider haemoglobin in our analysis. In sensitivity analyses, excluding patients who died early; conducting propensity score analysis adjusting by use of platelets, fresh frozen plasma, and cryoprecipitate; and adjusting for country produced results that were similar.
Conclusions
The association of transfusion with all-cause mortality appears to vary according to the predicted risk of death. Transfusion may reduce mortality in patients at high risk of death but increase mortality in those at low risk. The effect of transfusion in low-risk patients should be further tested in a randomised trial.
Trial registration
www.ClinicalTrials.gov NCT01746953
Editors' Summary
Background
Trauma—a serious injury to the body caused by violence or an accident—is a major global health problem. Every year, injuries caused by traffic collisions, falls, blows, and other traumatic events kill more than 5 million people (9% of annual global deaths). Indeed, for people between the ages of 5 and 44 years, injuries are among the top three causes of death in many countries. Trauma sometimes kills people through physical damage to the brain and other internal organs, but hemorrhage (serious uncontrolled bleeding) is responsible for 30%–40% of trauma-related deaths. Consequently, early trauma care focuses on minimizing hemorrhage (for example, by using compression to stop bleeding) and on restoring blood circulation after blood loss (health-care professionals refer to this as resuscitation). Red blood cell (RBC) transfusion is often used for the management of patients with trauma who are bleeding; other resuscitation products include isotonic saline and solutions of human blood proteins.
Why Was This Study Done?
Although RBC transfusion can save the lives of patients with trauma who are bleeding, there is considerable uncertainty regarding the balance of risks and benefits associated with this procedure. RBC transfusion, which is an expensive intervention, is associated with several potential adverse effects, including allergic reactions and infections. Moreover, blood supplies are limited, and the risks from transfusion are high in low- and middle-income countries, where most trauma-related deaths occur. In this study, which is a secondary analysis of data from a trial (CRASH-2) that evaluated the effect of tranexamic acid (which stops excessive bleeding) in patients with trauma, the researchers test the hypothesis that RBC transfusion may have a beneficial effect among patients at high risk of death following trauma but a harmful effect among those at low risk of death.
What Did the Researchers Do and Find?
The CRASH-2 trial included 20,127 patients with trauma and major bleeding treated in 274 hospitals in 40 countries. In their risk-stratified analysis, the researchers investigated the effect of RBC transfusion on CRASH-2 participants with a predicted risk of death (estimated using a validated model that included clinical variables such as heart rate and blood pressure) on admission to hospital of less than 6%, 6%–20%, 21%–50%, or more than 50%. That is, the researchers compared death rates among patients in each stratum of predicted risk of death who received an RBC transfusion with death rates among patients who did not receive a transfusion. Half the patients received at least one transfusion. Transfusion was associated with an increase in all-cause mortality at 28 days after trauma among patients with a predicted risk of death of less than 6% or of 6%–20%, but with a decrease in all-cause mortality among patients with a predicted risk of death of more than 50%. In absolute figures, compared to no transfusion, RBC transfusion was associated with 5.1 more deaths per 100 patients in the patient group with the lowest predicted risk of death but with 11.9 fewer deaths per 100 patients in the group with the highest predicted risk of death.
What Do These Findings Mean?
These findings show that RBC transfusion is associated with an increase in all-cause deaths among patients with trauma and major bleeding with a low predicted risk of death, but with a reduction in all-cause deaths among patients with a high predicted risk of death. In other words, these findings suggest that the effect of RBC transfusion on all-cause mortality may vary according to whether a patient with trauma has a high or low predicted risk of death. However, because the participants in the CRASH-2 trial were not randomly assigned to receive an RBC transfusion, it is not possible to conclude that receiving an RBC transfusion actually increased the death rate among patients with a low predicted risk of death. It might be that the patients with this level of predicted risk of death who received a transfusion shared other unknown characteristics (confounders) that were actually responsible for their increased death rate. Thus, to provide better guidance for clinicians caring for patients with trauma and hemorrhage, the hypothesis that RBC transfusion could be harmful among patients with trauma with a low predicted risk of death should be prospectively evaluated in a randomised controlled trial.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001664.
This study is further discussed in a PLOS Medicine Perspective by Druin Burch
The World Health Organization provides information on injuries and on violence and injury prevention (in several languages)
The US Centers for Disease Control and Prevention has information on injury and violence prevention and control
The National Trauma Institute, a US-based non-profit organization, provides information about hemorrhage after trauma and personal stories about surviving trauma
The UK National Health Service Choices website provides information about blood transfusion, including a personal story about transfusion after a serious road accident
The US National Heart, Lung, and Blood Institute also provides detailed information about blood transfusions
MedlinePlus provides links to further resources on injuries, bleeding, and blood transfusion (in English and Spanish)
More information is available about CRASH-2 (in several languages)
doi:10.1371/journal.pmed.1001664
PMCID: PMC4060995  PMID: 24937305
9.  SCAR Radiologic Technologist Survey: Analysis of the Impact of Digital Technologies on Productivity  
Journal of Digital Imaging  2002;15(3):132-140.
As medical reimbursements continue to decline, increasing financial pressures are placed upon medical imaging providers. This burden is exacerbated by the existing radiologic technologist (RT) crisis, which has caused RT salaries to trend upward. One strategy to address these trends is employing technology to improve technologist productivity. While industry-wide RT productivity benchmarks have been established for film-based operation, little to date has been published in the medical literature regarding similar productivity measures for filmless operation using PACS. This study was undertaken to document the complex relationship between technologist productivity and implementation of digital radiography and digital information technologies, including PACS and hospital/radiology information systems (HIS/RIS). A nationwide survey was conducted with 112 participating institutions, in varying degrees of digital technology implementation. Technologist productivity was defined as the number of annual exams performed per technologist full-time equivalent (FTE). Productivity analyses were performed among the different demographic and technology profile groups, with a focus on general radiography, which accounts for 65-70% of imaging department volumes. When evaluating the relationship between technologist productivity and digital technology implementation, improved productivity measures were observed for institutions implementing HIS/RIS, modality worklist, and PACS. The timing of PACS implementation was found to have a significant effect on technologist productivity measures, with an initial 10.8% drop in productivity during the first year of PACS implementation, followed by a 27.8% increase in productivity beyond year one. This suggests there is a "PACS learning curve" phenomenon, which should be considered when institutions are planning for PACS implementation.
doi:10.1007/s10278-002-0021-8
PMCID: PMC3613256  PMID: 12481227
10.  Combined Impact of Lifestyle-Related Factors on Total and Cause-Specific Mortality among Chinese Women: Prospective Cohort Study 
PLoS Medicine  2010;7(9):e1000339.
Findings from the Shanghai Women's Health Study confirm those derived from other, principally Western, cohorts regarding the combined impact of lifestyle-related factors on mortality.
Background
Although cigarette smoking, excessive alcohol drinking, obesity, and several other well-studied unhealthy lifestyle-related factors each have been linked to the risk of multiple chronic diseases and premature death, little is known about the combined impact on mortality outcomes, in particular among Chinese and other non-Western populations. The objective of this study was to quantify the overall impact of lifestyle-related factors beyond that of active cigarette smoking and alcohol consumption on all-cause and cause-specific mortality in Chinese women.
Methods and Findings
We used data from the Shanghai Women's Health Study, an ongoing population-based prospective cohort study in China. Participants included 71,243 women aged 40 to 70 years enrolled during 1996–2000 who never smoked or drank alcohol regularly. A healthy lifestyle score was created on the basis of five lifestyle-related factors shown to be independently associated with mortality outcomes (normal weight, lower waist-hip ratio, daily exercise, never exposed to spouse's smoking, higher daily fruit and vegetable intake). The score ranged from zero (least healthy) to five (most healthy) points. During an average follow-up of 9 years, 2,860 deaths occurred, including 775 from cardiovascular disease (CVD) and 1,351 from cancer. Adjusted hazard ratios for mortality decreased progressively with an increasing number of healthy lifestyle factors. Compared to women with a score of zero, hazard ratios (95% confidence intervals) for women with four to five factors were 0.57 (0.44–0.74) for total mortality, 0.29 (0.16–0.54) for CVD mortality, and 0.76 (0.54–1.06) for cancer mortality. The inverse association between the healthy lifestyle score and mortality was seen consistently regardless of chronic disease status at baseline. The population attributable risks for not having 4–5 healthy lifestyle factors were 33% for total deaths, 59% for CVD deaths, and 19% for cancer deaths.
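The score construction described above reduces to summing five binary indicators. A minimal sketch (the function and argument names here are our own, not the study's):

```python
def healthy_lifestyle_score(normal_weight, lower_waist_hip_ratio,
                            daily_exercise, no_spousal_smoking,
                            higher_fruit_veg_intake):
    """Sum five binary lifestyle factors into a score from 0 (least
    healthy) to 5 (most healthy); each argument is True when the
    participant meets the healthy criterion."""
    return sum(bool(f) for f in (normal_weight, lower_waist_hip_ratio,
                                 daily_exercise, no_spousal_smoking,
                                 higher_fruit_veg_intake))

# A participant meeting four of the five criteria:
print(healthy_lifestyle_score(True, True, False, True, True))  # → 4
```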
Conclusions
In this first study, to our knowledge, to quantify the combined impact of lifestyle-related factors on mortality outcomes in Chinese women, a healthier lifestyle pattern—including being of normal weight, lower central adiposity, participation in physical activity, nonexposure to spousal smoking, and higher fruit and vegetable intake—was associated with reductions in total and cause-specific mortality among lifetime nonsmoking and nondrinking women, supporting the importance of overall lifestyle modification in disease prevention.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
It is well established that lifestyle-related factors, such as limited physical activity, unhealthy diets, excessive alcohol consumption, and exposure to tobacco smoke are linked to an increased risk of many chronic diseases and premature death. However, few studies have investigated the combined impact of lifestyle-related factors and mortality outcomes, and most of such studies of combinations of established lifestyle factors and mortality have been conducted in the US and Western Europe. In addition, little is currently known about the combined impact on mortality of lifestyle factors beyond that of active smoking and alcohol consumption.
Why Was This Study Done?
Lifestyles in regions of the world can vary considerably. For example, many women in Asia do not actively smoke or regularly drink alcohol, which are important facts to note when considering practical disease prevention measures for these women. Therefore, it is important to study the combination of lifestyle factors appropriate to this population.
What Did the Researchers Do and Find?
The researchers used the Shanghai Women's Health Study, an ongoing prospective cohort study of almost 75,000 Chinese women aged 40–70 years, as the basis for their analysis. The Shanghai Women's Health Study has comprehensive baseline data on anthropometric measurements, lifestyle habits (including the responses to validated food frequency and physical activity questionnaires), medical history, occupational history, and select information from each participant's spouse, such as smoking history and alcohol consumption. This information was used by the researchers to create a healthy lifestyle score on the basis of five lifestyle-related factors shown to be independently associated with mortality outcomes in this population: normal weight, lower waist-hip ratio, daily exercise, never being exposed to spouse's smoking, and higher daily fruit and vegetable intake. The score ranged from zero (least healthy) to five (most healthy) points. The researchers found that higher healthy lifestyle scores were significantly associated with decreasing mortality and that this association persisted for all women regardless of their baseline comorbidities. So in effect, healthier lifestyle-related factors, including normal weight, lower waist-hip ratio, participation in exercise, never being exposed to spousal smoking, and higher daily fruit and vegetable intake, were significantly and independently associated with lower risk of total, and cause-specific, mortality.
What Do These Findings Mean?
This large prospective cohort study conducted among lifetime nonsmokers and nonalcohol drinkers shows that lifestyle factors, other than active smoking and alcohol consumption, have a major combined impact on total mortality on a scale comparable to the effect of smoking—the leading cause of death in most populations. However, the sample sizes for some cause-specific analyses were relatively small (despite the overall large sample size), and extended follow-up of this cohort will provide the opportunity to further evaluate the impact of these lifestyle-related factors on mortality outcomes in the future.
The findings of this study highlight the importance of overall lifestyle modification in disease prevention, especially as most of the lifestyle-related factors studied here may be improved by individual motivation to change unhealthy behaviors. Further research is needed to design appropriate interventions to increase these healthy lifestyle factors among Asian women.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000339
The Vanderbilt Epidemiology Center has more information on the Shanghai Women's Health Study
The World Health Organization provides information on health in China
The document Health policy and systems research in China contains information about health policy and health systems research in China
The Chinese Ministry of Health also provides health information
doi:10.1371/journal.pmed.1000339
PMCID: PMC2939020  PMID: 20856900
11.  A Prospective Study of Medical Diagnostic Radiography and Risk of Thyroid Cancer 
American Journal of Epidemiology  2013;177(8):800-809.
Although diagnostic x-ray procedures provide important medical benefits, cancer risks associated with their exposure are also possible, but not well characterized. The US Radiologic Technologists Study (1983–2006) is a nationwide, prospective cohort study with extensive questionnaire data on history of personal diagnostic imaging procedures collected prior to cancer diagnosis. We used Cox proportional hazard regressions to estimate thyroid cancer risks related to the number and type of selected procedures. We assessed potential modifying effects of age and calendar year of the first x-ray procedure in each category of procedures. Incident thyroid cancers (n = 251) were diagnosed among 75,494 technologists (1.3 million person-years; mean follow-up = 17 years). Overall, there was no clear evidence of thyroid cancer risk associated with diagnostic x-rays except for dental x-rays. We observed a 13% increase in thyroid cancer risk for every 10 reported dental radiographs (hazard ratio = 1.13, 95% confidence interval: 1.01, 1.26), which was driven by dental x-rays first received before 1970, but we found no evidence that the relationship between dental x-rays and thyroid cancer was associated with childhood or adolescent exposures as would have been anticipated. The lack of association of thyroid cancer with x-ray procedures that expose the thyroid to higher radiation doses than do dental x-rays underscores the need to conduct a detailed radiation exposure assessment to enable quantitative evaluation of risk.
doi:10.1093/aje/kws315
PMCID: PMC3668423  PMID: 23529772
radiation; radiography; thyroid gland; thyroid neoplasms; x-rays
12.  The Effect of PACS on the Time Required for Technologists to Produce Radiographic Images in the Emergency Department Radiology Suite  
Journal of Digital Imaging  2002;15(3):153-160.
The purpose of this study was to evaluate the effect of a switch to a filmless image management system on the time required for technologists to produce radiographic images in the emergency department (ED) after controlling for exam difficulty and a variable workload. Time and motion data were collected on patients who had radiographic images taken while being treated in the emergency department over the 3½-year period from April 1997 to November 2000. Event times and demographic data were obtained from the radiology information system, from the hospital information system, from emergency department records, or by observation by research coordinators. Multiple least squares regression analysis identified several independent predictors of the time required for technologists to produce radiographic images. These variables included the level of technologist experience, the number of trauma-alert patient arrivals, and whether a filmless image management system was used (all P <.05). Our regression model explained 22% of the variability in technologist time (R2 Adjusted, 0.22; F = 24.01; P <.0001). The regression model predicted a time saving of 2 to 3 minutes per patient in the elapsed time from notification of a needed examination until image availability because of the implementation of PACS, a delay of 4 to 6 minutes for each patient imaged by technologists who spent less than 10% of their work assignments within the ED, and a delay of 18 to 27 minutes in radiology workflow because of the arrival of a trauma-alert patient. A filmless system decreased the amount of time required to produce radiographs. The arrival of a trauma-alert patient delayed radiology workflow in the ED. Inexperienced technologists required 4 to 6 minutes of additional time per patient to complete the same amount of work accomplished by an experienced technologist.
doi:10.1007/s10278-002-0024-5
PMCID: PMC3613261  PMID: 12415466
13.  Electronic imaging impact on image and report turnaround times 
Journal of Digital Imaging  1999;12(Suppl 1):155-159.
We prospectively compared image and report delivery times in our Urgent Care Center (UCC) during a film-based practice (1995) and after complete implementation of an electronic imaging practice in 1997. Before switching to a totally electronic and filmless practice, multiple time periods were consistently measured during a 1-week period in May 1995 and then again in a similar week in May 1997 after implementation of electronic imaging. All practice patterns were the same except for a film-based practice in 1995 versus a filmless practice in 1997. The following times were measured: (1) waiting room time, (2) technologist's time of examination, (3) time to quality control, (4) radiology interpretation times, (5) radiology image and report delivery time, (6) total radiology turn-around time, (7) time to room the patient back in the UCC, and (8) time until the ordering physician views the film. Waiting room time was longer in 1997 (average time, 26:47) versus 1995 (average time, 15:54). The technologist's examination completion time was approximately the same (1995 average time, 06:12; 1997 average time, 05:41). There was also a slight increase in the time of the technologist's electronic verification or quality control in 1997 (average time, 7:17) versus the film-based practice in 1995 (average time, 2:35). However, radiology interpretation times dramatically improved (average time, 49:38 in 1995 versus average time 13:50 in 1997). There was also a decrease in image delivery times to the clinicians in 1997 (median, 53 minutes) versus the film-based practice of 1995 (1 hour and 40 minutes). Reports were available with the images immediately upon completion by the radiologist in 1997, compared with a median time of 27 minutes in 1995. Importantly, patients were roomed back into the UCC examination rooms faster after the radiologic procedure in 1997 (average time, 13:36) than they were in 1995 (29:38).
Finally, the ordering physicians viewed the diagnostic images and reports in dramatically less time in 1997 (median, 26 minutes) versus 1995 (median, 1 hour and 5 minutes). In conclusion, a filmless electronic imaging practice within our UCC greatly improved radiology image and report delivery times, as well as improved clinical efficiency.
doi:10.1007/BF03168787
PMCID: PMC3452886  PMID: 10342198
14.  Biomarker Profiling by Nuclear Magnetic Resonance Spectroscopy for the Prediction of All-Cause Mortality: An Observational Study of 17,345 Persons 
PLoS Medicine  2014;11(2):e1001606.
In this study, Würtz and colleagues conducted high-throughput profiling of blood specimens in two large population-based cohorts in order to identify biomarkers for all-cause mortality and enhance risk prediction. The authors found that biomarker profiling improved prediction of the short-term risk of death from all causes above established risk factors. However, further investigations are needed to clarify the biological mechanisms and the utility of these biomarkers to guide screening and prevention.
Please see later in the article for the Editors' Summary
Background
Early identification of ambulatory persons at high short-term risk of death could benefit targeted prevention. To identify biomarkers for all-cause mortality and enhance risk prediction, we conducted high-throughput profiling of blood specimens in two large population-based cohorts.
Methods and Findings
106 candidate biomarkers were quantified by nuclear magnetic resonance spectroscopy of non-fasting plasma samples from a random subset of the Estonian Biobank (n = 9,842; age range 18–103 y; 508 deaths during a median of 5.4 y of follow-up). Biomarkers for all-cause mortality were examined using stepwise proportional hazards models. Significant biomarkers were validated and incremental predictive utility assessed in a population-based cohort from Finland (n = 7,503; 176 deaths during 5 y of follow-up). Four circulating biomarkers predicted the risk of all-cause mortality among participants from the Estonian Biobank after adjusting for conventional risk factors: alpha-1-acid glycoprotein (hazard ratio [HR] 1.67 per 1–standard deviation increment, 95% CI 1.53–1.82, p = 5×10−31), albumin (HR 0.70, 95% CI 0.65–0.76, p = 2×10−18), very-low-density lipoprotein particle size (HR 0.69, 95% CI 0.62–0.77, p = 3×10−12), and citrate (HR 1.33, 95% CI 1.21–1.45, p = 5×10−10). All four biomarkers were predictive of cardiovascular mortality, as well as death from cancer and other nonvascular diseases. One in five participants in the Estonian Biobank cohort with a biomarker summary score within the highest percentile died during the first year of follow-up, indicating prominent systemic reflections of frailty. The biomarker associations all replicated in the Finnish validation cohort. Including the four biomarkers in a risk prediction score improved risk assessment for 5-y mortality (increase in C-statistics 0.031, p = 0.01; continuous reclassification improvement 26.3%, p = 0.001).
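The hazard ratios above are expressed per 1-standard-deviation increment of each biomarker. As a sketch of how such an estimate scales to an individual measurement under a log-linear proportional hazards assumption (the mean and SD below are invented for illustration; only the HR of 1.67 per SD is taken from the abstract):

```python
def hazard_multiplier(value, mean, sd, hr_per_sd):
    """Relative hazard implied by a per-SD hazard ratio:
    HR ** z, where z is the measurement in SD units from the mean."""
    z = (value - mean) / sd
    return hr_per_sd ** z

# Alpha-1-acid glycoprotein 2 SD above the cohort mean, HR 1.67 per SD:
print(round(hazard_multiplier(1.4, 1.0, 0.2, 1.67), 2))  # → 2.79
```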
Conclusions
Biomarker associations with cardiovascular, nonvascular, and cancer mortality suggest novel systemic connectivities across seemingly disparate morbidities. The biomarker profiling improved prediction of the short-term risk of death from all causes above established risk factors. Further investigations are needed to clarify the biological mechanisms and the utility of these biomarkers for guiding screening and prevention.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
A biomarker is a biological molecule found in blood, body fluids, or tissues that may signal an abnormal process, a condition, or a disease. The level of a particular biomarker may indicate a patient's risk of disease, or likely response to a treatment. For example, cholesterol levels are measured to assess the risk of heart disease. Most current biomarkers are used to test an individual's risk of developing a specific condition. There are none that accurately assess whether a person is at risk of ill health generally, or likely to die soon from a disease. Early and accurate identification of people who appear healthy but in fact have an underlying serious illness would provide valuable opportunities for preventative treatment.
While most tests measure the levels of a specific biomarker, there are some technologies that allow blood samples to be screened for a wide range of biomarkers. These include nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry. These tools have the potential to be used to screen the general population for a range of different biomarkers.
Why Was This Study Done?
Identifying new biomarkers that provide insight into the risk of death from all causes could be an important step in linking different diseases and assessing patient risk. The authors in this study screened patient samples using NMR spectroscopy for biomarkers that accurately predict the risk of death particularly amongst the general population, rather than amongst people already known to be ill.
What Did the Researchers Do and Find?
The researchers studied two large groups of people, one in Estonia and one in Finland. Both countries have set up health registries that collect and store blood samples and health records over many years. The registries include large numbers of people who are representative of the wider population.
The researchers first tested blood samples from a representative subset of the Estonian group, testing 9,842 samples in total. They looked at 106 different biomarkers in each sample using NMR spectroscopy. They also looked at the health records of this group and found that 508 people died during the follow-up period after the blood sample was taken, the majority from heart disease, cancer, and other diseases. Using statistical analysis, they looked for any links between the levels of different biomarkers in the blood and people's short-term risk of dying. They found that the levels of four biomarkers—plasma albumin, alpha-1-acid glycoprotein, very-low-density lipoprotein (VLDL) particle size, and citrate—appeared to accurately predict short-term risk of death. They repeated this study with the Finnish group, this time with 7,503 individuals (176 of whom died during the five-year follow-up period after giving a blood sample) and found similar results.
The researchers carried out further statistical analyses to take into account other known factors that might have contributed to the risk of life-threatening illness. These included factors such as age, weight, tobacco and alcohol use, cholesterol levels, and pre-existing illness, such as diabetes and cancer. The association between the four biomarkers and short-term risk of death remained the same even when controlling for these other factors.
The analysis also showed that combining the test results for all four biomarkers, to produce a biomarker score, provided a more accurate measure of risk than any of the biomarkers individually. This biomarker score also proved to be the strongest predictor of short-term risk of dying in the Estonian group. Individuals with a biomarker score in the top 20% had a risk of dying within five years that was 19 times greater than that of individuals with a score in the bottom 20% (288 versus 15 deaths).
What Do These Findings Mean?
This study suggests that there are four biomarkers in the blood—alpha-1-acid glycoprotein, albumin, VLDL particle size, and citrate—that can be measured by NMR spectroscopy to assess whether otherwise healthy people are at short-term risk of dying from heart disease, cancer, and other illnesses. However, further validation of these findings is still required, and additional studies should examine the biomarker specificity and associations in settings closer to clinical practice. The combined biomarker score appears to be a more accurate predictor of risk than tests for more commonly known risk factors. Identifying individuals who are at high risk using these biomarkers might help to target preventative medical treatments to those with the greatest need.
However, there are several limitations to this study. As an observational study, it provides evidence of only a correlation between a biomarker score and ill health. It does not identify any underlying causes. Other factors, not detectable by NMR spectroscopy, might be the true cause of serious health problems and would provide a more accurate assessment of risk. Nor does this study identify what kinds of treatment might prove successful in reducing the risks. Therefore, more research is needed to determine whether testing for these biomarkers would provide any clinical benefit.
There were also some technical limitations to the study. NMR spectroscopy does not detect as many biomarkers as mass spectrometry, which might therefore identify further biomarkers for a more accurate risk assessment. In addition, because both study groups were northern European, it is not yet known whether the results would be the same in other ethnic groups or populations with different lifestyles.
In spite of these limitations, the fact that the same four biomarkers are associated with a short-term risk of death from a variety of diseases does suggest that similar underlying mechanisms are taking place. This observation points to some potentially valuable areas of research to understand precisely what's contributing to the increased risk.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001606
The US National Institute of Environmental Health Sciences has information on biomarkers
The US Food and Drug Administration has a Biomarker Qualification Program to help researchers in identifying and evaluating new biomarkers
Further information on the Estonian Biobank is available
The Computational Medicine Research Team of the University of Oulu and the University of Bristol have a webpage that provides further information on high-throughput biomarker profiling by NMR spectroscopy
doi:10.1371/journal.pmed.1001606
PMCID: PMC3934819  PMID: 24586121
15.  Stenting for Peripheral Artery Disease of the Lower Extremities 
Executive Summary
Background
Objective
In January 2010, the Medical Advisory Secretariat received an application from University Health Network to provide an evidentiary platform on stenting as a treatment management for peripheral artery disease. The purpose of this health technology assessment is to examine the effectiveness of primary stenting as a treatment management for peripheral artery disease of the lower extremities.
Clinical Need: Condition and Target Population
Peripheral artery disease (PAD) is a progressive disease occurring as a result of plaque accumulation (atherosclerosis) in the arterial system that carries blood to the extremities (arms and legs) as well as vital organs. The vessels that are most affected by PAD are the arteries of the lower extremities, the aorta, the visceral arterial branches, the carotid arteries and the arteries of the upper limbs. In the lower extremities, PAD affects three major arterial segments i) aortic-iliac, ii) femoro-popliteal (FP) and iii) infra-popliteal (primarily tibial) arteries. The disease is commonly classified clinically as asymptomatic claudication, rest pain and critical ischemia.
Although the prevalence of PAD in Canada is not known, it is estimated that 800,000 Canadians have PAD. The 2007 Trans Atlantic Intersociety Consensus (TASC) II Working Group for the Management of Peripheral Disease estimated the prevalence of PAD in Europe and North America to be 27 million, accounting for 88,000 hospitalizations involving the lower extremities. Among elderly individuals, the reported prevalence of PAD ranges from 12% to 29%. The National Health and Nutrition Examination Survey (NHANES) estimated that the prevalence of PAD is 14.5% among individuals 70 years of age and over.
Modifiable and non-modifiable risk factors associated with PAD include advanced age, male gender, family history, smoking, diabetes, hypertension and hyperlipidemia. PAD is a strong predictor of myocardial infarction (MI), stroke and cardiovascular death. Annually, approximately 10% of ischemic cardiovascular and cerebrovascular events can be attributed to the progression of PAD. Compared with patients without PAD, the 10-year risk of all-cause mortality is 3-fold higher in patients with PAD, with a 4- to 5-fold greater risk of dying from a cardiovascular event. The risk of coronary heart disease is 6 times greater and increases 15-fold in patients with advanced or severe PAD. Among subjects with diabetes, the risk of PAD is often severe and associated with extensive arterial calcification; in these patients the risk of PAD increases two- to fourfold. The results of the Canadian public survey of knowledge of PAD demonstrated that Canadians are unaware of the morbidity and mortality associated with PAD. Despite its prevalence and cardiovascular risk implications, only 25% of PAD patients are undergoing treatment.
The diagnosis of PAD is difficult, as most patients remain asymptomatic for many years; symptoms do not present until an artery has narrowed by at least 50%. In the general population, only 10% of persons with PAD have classic symptoms of claudication, 40% do not complain of leg pain, and the remaining 50% have a variety of leg symptoms different from classic claudication. The severity of symptoms depends on the degree of stenosis. The need to intervene is more urgent in patients with limb-threatening ischemia, as manifested by night pain, rest pain, ischemic ulcers or gangrene. Without successful revascularization, those with critical ischemia have a limb loss (amputation) rate of 80-90% within one year.
Diagnosis of PAD is generally non-invasive and can be performed in a physician's office or on an outpatient basis in a hospital. The most common diagnostic procedures include: 1) the Ankle Brachial Index (ABI), the ratio of the highest ankle systolic pressure to the highest brachial (arm) systolic pressure; and 2) Doppler ultrasonography, a diagnostic imaging procedure that uses a combination of ultrasound and waveform recordings to evaluate arterial flow in blood vessels. The value of the ABI provides an assessment of the severity of the disease. Other non-invasive imaging techniques include computed tomography (CT) and magnetic resonance angiography (MRA). Definitive diagnosis of PAD can be made by invasive catheter-based angiography, which provides a roadmap of the arteries depicting the exact location and length of the stenosis/occlusion. Angiography is the standard method against which all other imaging procedures are compared for accuracy.
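The ABI described above is a simple pressure ratio; the sketch below computes it and maps the result to commonly cited clinical cut-offs. The thresholds are illustrative textbook values, not taken from this review, and clinical guidelines vary.

```python
def ankle_brachial_index(ankle_systolic: float, brachial_systolic: float) -> float:
    """ABI = highest ankle systolic pressure / highest brachial systolic pressure (mmHg)."""
    return ankle_systolic / brachial_systolic

def interpret_abi(abi: float) -> str:
    # Illustrative cut-offs only; they are not defined in this review.
    if abi > 1.3:
        return "non-compressible (calcified) vessels"
    if abi >= 0.9:
        return "normal"
    if abi >= 0.4:
        return "mild-to-moderate PAD"
    return "severe PAD / critical ischemia"

# An ankle pressure of 75 mmHg against a brachial pressure of 150 mmHg (ABI = 0.5):
print(interpret_abi(ankle_brachial_index(75, 150)))  # prints "mild-to-moderate PAD"
```

A falling ABI tracks increasing stenosis, which is why the index can grade severity as well as detect disease.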
More than 70% of patients diagnosed with PAD remain stable or improve with conservative management consisting of pharmacologic agents and lifestyle modifications. Significant PAD symptoms are well known to negatively influence an individual's quality of life. For those who do not improve, revascularization methods, either surgical or endovascular, can be used to restore peripheral circulation.
Technology Under Review
A stent is a wire mesh “scaffold” that is permanently implanted in the artery to keep the artery open, and stenting can be combined with angioplasty to treat PAD. There are two types of stents, i) balloon-expandable and ii) self-expandable, both available in varying lengths. The former uses an angioplasty balloon to expand and set the stent within the arterial segment. Recently, drug-eluting stents have been developed; these stents release small amounts of medication intended to reduce neointimal hyperplasia, which can cause re-stenosis at the stent site. Endovascular stenting avoids the problems of early elastic recoil, residual stenosis and flow-limiting dissection after balloon angioplasty.
Research Questions
In individuals with PAD of the lower extremities (superficial femoral artery, infra-popliteal, crural and iliac artery stenosis or occlusion), is primary stenting more effective than percutaneous transluminal angioplasty (PTA) in improving patency?
In individuals with PAD of the lower extremities (superficial femoral artery, infra-popliteal, crural and iliac artery stenosis or occlusion), does primary stenting provide greater immediate (technical) success compared to PTA?
In individuals with PAD of the lower extremities (superficial femoral artery, infra-popliteal, crural and iliac artery stenosis or occlusion), is primary stenting associated with fewer complications compared to PTA?
In individuals with PAD of the lower extremities (superficial femoral artery, infra-popliteal, crural and iliac artery stenosis or occlusion), does primary stenting compared to PTA reduce the rate of re-intervention?
In individuals with PAD of the lower extremities (superficial femoral artery, infra-popliteal, crural and iliac artery stenosis or occlusion), is primary stenting more effective than PTA in improving clinical and hemodynamic success?
Are drug-eluting stents more effective than bare metal stents in improving patency or reducing rates of re-intervention and complications?
Research Methods
Literature Search
A literature search was performed on February 2, 2010 using OVID MEDLINE, MEDLINE In-Process and Other Non-Indexed Citations, OVID EMBASE, the Cochrane Library, and the International Network of Agencies for Health Technology Assessment (INAHTA) database. Abstracts were reviewed by a single reviewer and, for those studies meeting the eligibility criteria, full-text articles were obtained. Reference lists were also examined for any additional relevant studies not identified through the search. The quality of evidence was assessed as high, moderate, low or very low according to GRADE methodology.
Inclusion Criteria
English language full-reports from 1950 to January Week 3, 2010
Comparative randomized controlled trials (RCTs), systematic reviews and meta-analyses of RCTs
Proven diagnosis of PAD of the lower extremities in all patients.
Adult patients at least 18 years of age.
Stent as at least one treatment arm.
Patency, re-stenosis, re-intervention, technical success, hemodynamic (ABI) and clinical improvement, and complications as at least one outcome.
Exclusion Criteria
Non-randomized studies
Observational studies (cohort or retrospective studies) and case reports
Feasibility studies
Studies that evaluated stents but not as a primary intervention
Outcomes of Interest
The primary outcome measure was patency. Secondary measures included technical success, re-intervention, complications, hemodynamic (ankle brachial pressure index, treadmill walking distance) and clinical success or improvement according to Rutherford scale. It was anticipated, a priori, that there would be substantial differences among trials regarding the method of examination and definitions of patency or re-stenosis. Where studies reported only re-stenosis rates, patency rates were calculated as 1 minus re-stenosis rates.
Statistical Analysis
Odds ratios (for binary outcomes) or mean differences (for continuous outcomes) with 95% confidence intervals (CI) were calculated for each endpoint. An intention-to-treat (ITT) principle was used, with the total number of patients randomized to each study arm as the denominator for each proportion. Sensitivity analysis was performed using a per-protocol approach. A pooled odds ratio (POR) or mean difference for each endpoint was then calculated for all trials reporting that endpoint using a fixed effects model. PORs were calculated for comparisons of primary stenting versus PTA or other alternative procedures. The level of significance was set at alpha = 0.05. Homogeneity was assessed using the chi-square test, I2, and visual inspection of forest plots. If heterogeneity was encountered within groups (P < 0.10), a random effects model was used. All statistical analyses were performed using RevMan 5. Where sufficient data were available, these analyses were repeated within subgroups of patients defined by time of outcome assessment to evaluate sustainability of treatment benefit. Results were pooled based on the diseased artery and stent type.
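As a concrete illustration of the pooling arithmetic, the sketch below implements inverse-variance fixed-effect pooling of log odds ratios, with Cochran's Q and I2 for heterogeneity. The review itself used RevMan 5; this standalone version is a simplification that assumes complete 2x2 tables with no zero cells.

```python
import math

def trial_log_or(a, b, c, d):
    """Log odds ratio and its variance for one trial's 2x2 table:
    a/b = events/non-events in the stent arm, c/d = in the PTA arm."""
    return math.log((a * d) / (b * c)), 1/a + 1/b + 1/c + 1/d

def pooled_or_fixed(tables, z=1.96):
    """Fixed-effect (inverse-variance) pooled OR with 95% CI and I^2 in percent."""
    logs, weights = [], []
    for a, b, c, d in tables:
        log_or, var = trial_log_or(a, b, c, d)
        logs.append(log_or)
        weights.append(1 / var)          # weight = 1 / variance of the log OR
    pooled = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    q = sum(w * (l - pooled) ** 2 for w, l in zip(weights, logs))  # Cochran's Q
    df = len(tables) - 1
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return (math.exp(pooled),
            math.exp(pooled - z * se),
            math.exp(pooled + z * se),
            i2)

# Two hypothetical trials with identical event rates pool to OR = 1 with I^2 = 0:
por, lo, hi, i2 = pooled_or_fixed([(10, 90, 10, 90), (20, 80, 20, 80)])
```

When Q is significant (P < 0.10), a random effects model adds a between-trial variance term to each weight, which is the switch the review describes.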
Summary of Findings
Balloon-expandable stents vs PTA in superficial femoral artery disease
Based on a moderate quality of evidence, there is no significant difference in patency between primary stenting using balloon-expandable bare metal stents and PTA at 6, 12 and 24 months in patients with superficial femoral artery disease. The pooled ORs for patency with corresponding 95% CIs are: 6 months 1.26 (0.74, 2.13); 12 months 0.95 (0.66, 1.38); and 24 months 0.72 (0.34, 1.55).
There is no significant difference in clinical improvement, re-interventions, peri- and postoperative complications, mortality or amputations between primary stenting using balloon-expandable bare stents and PTA in patients with superficial femoral artery disease. The pooled ORs (mean difference for ankle brachial index) with corresponding 95% CIs are: clinical improvement 0.85 (0.50, 1.42); ankle brachial index 0.01 (-0.02, 0.04); re-intervention 0.83 (0.26, 2.65); complications 0.73 (0.43, 1.22); all-cause mortality 1.08 (0.59, 1.97); and amputation rates 0.41 (0.14, 1.18).
Self-expandable stents vs PTA in superficial femoral artery disease
Based on a moderate quality of evidence, primary stenting using self-expandable bare metal stents is associated with significant improvement in patency at 6, 12 and 24 months in patients with superficial femoral artery disease. The pooled ORs for patency with corresponding 95% CIs are: 6 months 2.35 (1.06, 5.23); 12 months 1.54 (1.01, 2.35); and 24 months 2.18 (1.00, 4.78). However, the benefit of primary stenting is not observed for clinical improvement, re-interventions, peri- and postoperative complications, mortality or amputation in patients with superficial femoral artery disease. The pooled ORs (mean difference for ankle brachial index) with corresponding 95% CIs are: clinical improvement 0.61 (0.37, 1.01); ankle brachial index 0.01 (-0.06, 0.08); re-intervention 0.60 (0.36, 1.02); complications 1.60 (0.53, 4.85); all-cause mortality 3.84 (0.74, 19.22); and amputation rates 1.96 (0.20, 18.86).
Balloon-expandable stents vs PTA in iliac artery occlusive disease
Based on a moderate quality of evidence, despite immediate technical success (OR 12.23; 95% CI 7.17, 20.88), primary stenting is not associated with significant improvement in patency, clinical status or treadmill walking distance, or with reductions in re-intervention, complications, cardiovascular events, all-cause mortality, quality of life (QoL) or amputation rates in patients with intermittent claudication caused by iliac artery occlusive disease. The pooled ORs (mean differences for walking distance and QoL) with corresponding 95% CIs are: patency 1.03 (0.56, 1.87); clinical improvement 1.08 (0.60, 1.94); walking distance 3.00 (-12.96, 18.96); re-intervention 1.16 (0.71, 1.90); complications 0.56 (0.20, 1.53); all-cause mortality 0.89 (0.47, 1.71); QoL 0.40 (-4.42, 5.52); cardiovascular events 1.16 (0.56, 2.40); and amputation rates 0.37 (0.11, 1.23). To date, no RCTs have evaluated self-expandable stents in common or external iliac artery stenosis or occlusion.
Drug-eluting stent vs balloon-expandable bare metal stents in crural arteries
Based on a very low quality of evidence, at 6 months of follow-up, sirolimus drug-eluting stents are associated with a reduction in target vessel revascularization (TVR) and re-stenosis rates compared with balloon-expandable bare metal stents in patients with atherosclerotic lesions of the crural (tibial) arteries. The ORs with corresponding 95% CIs are: re-stenosis 0.09 (0.03, 0.28) and TVR 0.15 (0.05, 0.47). Both types of stents offer similar immediate success. Limitations of this study include a short follow-up period, a small sample size, and no assessment of mortality as an outcome. Further research is needed to confirm effectiveness and safety.
PMCID: PMC3377569  PMID: 23074395
16.  Long-Term Exposure to Silica Dust and Risk of Total and Cause-Specific Mortality in Chinese Workers: A Cohort Study 
PLoS Medicine  2012;9(4):e1001206.
A retro-prospective cohort study by Weihong Chen and colleagues provides new estimates for the risk of total and cause-specific mortality due to long-term silica dust exposure among Chinese workers.
Background
Human exposure to silica dust is very common in both working and living environments. However, the potential long-term health effects have not been well established across different exposure situations.
Methods and Findings
We studied 74,040 workers who worked at 29 metal mines and pottery factories in China for 1 y or more between January 1, 1960, and December 31, 1974, with follow-up until December 31, 2003 (median follow-up of 33 y). We estimated the cumulative silica dust exposure (CDE) for each worker by linking work history to a job–exposure matrix. We calculated standardized mortality ratios for underlying causes of death based on Chinese national mortality rates. Hazard ratios (HRs) for selected causes of death associated with CDE were estimated using the Cox proportional hazards model. The population attributable risks were estimated based on the prevalence of workers with silica dust exposure and HRs. The number of deaths attributable to silica dust exposure among Chinese workers was then calculated using the population attributable risk and the national mortality rate. We observed 19,516 deaths during 2,306,428 person-years of follow-up. Mortality from all causes was higher among workers exposed to silica dust than among non-exposed workers (993 versus 551 per 100,000 person-years). We observed significant positive exposure–response relationships between CDE (measured in milligrams/cubic meter–years, i.e., the sum of silica dust concentrations multiplied by the years of silica exposure) and mortality from all causes (HR 1.026, 95% confidence interval 1.023–1.029), respiratory diseases (1.069, 1.064–1.074), respiratory tuberculosis (1.065, 1.059–1.071), and cardiovascular disease (1.031, 1.025–1.036). Significantly elevated standardized mortality ratios were observed for all causes (1.06, 95% confidence interval 1.01–1.11), ischemic heart disease (1.65, 1.35–1.99), and pneumoconiosis (11.01, 7.67–14.95) among workers exposed to respirable silica concentrations equal to or lower than 0.1 mg/m3. After adjustment for potential confounders, including smoking, silica dust exposure accounted for 15.2% of all deaths in this study. 
We estimated that 4.2% of deaths (231,104 cases) among Chinese workers were attributable to silica dust exposure. The limitations of this study included a lack of data on dietary patterns and leisure time physical activity, possible underestimation of silica dust exposure for individuals who worked at the mines/factories before 1950, and a small number of deaths (4.3%) where the cause of death was based on oral reports from relatives.
Conclusions
Long-term silica dust exposure was associated with substantially increased mortality among Chinese workers. The increased risk was observed not only for deaths due to respiratory diseases and lung cancer, but also for deaths due to cardiovascular disease.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Walk along most sandy beaches and you will be walking on millions of grains of crystalline silica, one of the commonest minerals on earth and a major ingredient in glass and in ceramic glazes. Silica is also used in the manufacture of building materials, in foundry castings, and for sandblasting, and respirable (breathable) crystalline silica particles are produced during quarrying and mining. Unfortunately, silica dust is not innocuous. Several serious diseases are associated with exposure to this dust, including silicosis (a chronic lung disease characterized by scarring and destruction of lung tissue), lung cancer, and pulmonary tuberculosis (a serious lung infection). Moreover, exposure to silica dust increases the risk of death (mortality). Worryingly, recent reports indicate that in the US and Europe, about 1.7 and 3.0 million people, respectively, are occupationally exposed to silica dust, figures that are dwarfed by the more than 23 million workers who are exposed in China. Occupational silica exposure, therefore, represents an important global public health concern.
Why Was This Study Done?
Although the lung-related adverse health effects of exposure to silica dust have been extensively studied, silica-related health effects may not be limited to these diseases. For example, could silica dust particles increase the risk of cardiovascular disease (diseases that affect the heart and circulation)? Other environmental particulates, such as the products of internal combustion engines, are associated with an increased risk of cardiovascular disease, but no one knows if the same is true for silica dust particles. Moreover, although it is clear that high levels of exposure to silica dust are dangerous, little is known about the adverse health effects of lower exposure levels. In this cohort study, the researchers examined the effect of long-term exposure to silica dust on the risk of all cause and cause-specific mortality in a large group (cohort) of Chinese workers.
What Did the Researchers Do and Find?
The researchers estimated the cumulative silica dust exposure for 74,040 workers at 29 metal mines and pottery factories from 1960 to 2003 from individual work histories and more than four million measurements of workplace dust concentrations, and collected health and mortality data for all the workers. Death from all causes was higher among workers exposed to silica dust than among non-exposed workers (993 versus 551 deaths per 100,000 person-years), and there was a positive exposure–response relationship between silica dust exposure and death from all causes, respiratory diseases, respiratory tuberculosis, and cardiovascular disease. For example, the hazard ratio for all cause death was 1.026 for every increase in cumulative silica dust exposure of 1 mg/m3-year; a hazard ratio is the incidence of an event in an exposed group divided by its incidence in an unexposed group. Notably, there was significantly increased mortality from all causes, ischemic heart disease, and silicosis among workers exposed to respirable silica concentrations at or below 0.1 mg/m3, the workplace exposure limit for silica dust set by the US Occupational Safety and Health Administration. For example, the standardized mortality ratio (SMR) for silicosis among people exposed to low levels of silica dust was 11.01; an SMR is the ratio of observed deaths in a cohort to expected deaths calculated from recorded deaths in the general population. Finally, the researchers used their data to estimate that, in 2008, 4.2% of deaths among industrial workers in China (231,104 deaths) were attributable to silica dust exposure.
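The two ratio measures defined in the paragraph above are simple arithmetic; the sketch below reproduces them, plugging in the paper's published all-cause estimate of 1.026 per mg/m3-year purely for illustration.

```python
def hazard_ratio(hr_per_unit: float, cde: float) -> float:
    """Cox-model hazard ratio at a given cumulative dust exposure (CDE):
    risk scales multiplicatively, HR(cde) = (HR per unit)^cde."""
    return hr_per_unit ** cde

def smr(observed_deaths: float, expected_deaths: float) -> float:
    """Standardized mortality ratio: observed deaths in the cohort
    divided by deaths expected from general-population rates."""
    return observed_deaths / expected_deaths

# All-cause HR of 1.026 per mg/m^3-year implies, for 10 mg/m^3-years of exposure:
print(round(hazard_ratio(1.026, 10), 2))  # prints 1.29
```

The multiplicative form is why a per-unit HR that looks small (1.026) still implies a markedly elevated risk at the high cumulative exposures seen in mining and pottery work.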
What Do These Findings Mean?
These findings indicate that long-term silica dust exposure is associated with substantially increased mortality among Chinese workers. They confirm that there is an exposure–response relationship between silica dust exposure and a heightened risk of death from respiratory diseases and lung cancer. That is, the risk of death from these diseases increases as exposure to silica dust increases. In addition, they show a significant relationship between silica dust exposure and death from cardiovascular diseases. Importantly, these findings suggest that even levels of silica dust that are considered safe increase the risk of death. The accuracy of these findings may be affected by the accuracy of the silica dust exposure estimates and/or by confounding (other factors shared by the people exposed to silica such as diet may have affected their risk of death). Nevertheless, these findings highlight the need to tighten regulations on workplace dust control in China and elsewhere.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001206.
The American Lung Association provides information on silicosis
The US Centers for Disease Control and Prevention provides information on silica in the workplace, including links to relevant US National Institute for Occupational Health and Safety publications, and information on silicosis and other pneumoconioses
The US Occupational Safety and Health Administration also has detailed information on occupational exposure to crystalline silica
“What does silicosis mean to you” is a video provided by the US Mine Safety and Health Administration that includes personal experiences of silicosis; “Don't let silica dust you” is a video produced by the Association of Occupational and Environmental Clinics that identifies ways to reduce silica dust exposure in the workplace
The MedlinePlus encyclopedia has a page on silicosis (in English and Spanish)
The International Labour Organization provides information on health surveillance for those exposed to respirable crystalline silica
The World Health Organization has published a report about the health effects of crystalline silica and quartz
doi:10.1371/journal.pmed.1001206
PMCID: PMC3328438  PMID: 22529751
17.  Analysis of dose measurement other than the radiation protection during the radiographic examination 
SpringerPlus  2014;3:250.
Objectives
The study measured the dose to body regions that were not shielded from radiation exposure during general radiographic procedures, with the goal of providing basic radiation dose data for radiological technologists who perform radiographic examinations.
Materials and methods
Body regions were simulated with a phantom of tissue-equivalent material using general radiographic equipment in a general examination room, and the scattered rays were measured with an ion chamber. The hand received the highest average radiation dose and the kidney the lowest; the same pattern was evident for the average equivalent dose. The permissible number of daily exposures was highest for the anterior/posterior skull, followed by the posterior/anterior chest, abdomen, anterior/posterior spine and extremities.
Results
The permissible daily numbers of exposures for the eye were lower than for other body regions (6, 4, 26, 3 and 121 times), while the numbers for the foot were higher than for other regions (73, 48, 263, 39 and 702 times).
Conclusions
Radiation should be thoroughly blocked by an apron to protect the radiological technologist from radiation exposure; the proper distance from the irradiation source should be maintained when exposure is inevitable; and the exposure dose and working environment should be regularly assessed to keep the radiological technologist's exposure dose minimal, in accordance with International Commission on Radiological Protection recommendations.
doi:10.1186/2193-1801-3-250
PMCID: PMC4039667  PMID: 24892002
Exposure dose; General shooting; Scattered rays
18.  Association between Class III Obesity (BMI of 40–59 kg/m2) and Mortality: A Pooled Analysis of 20 Prospective Studies 
PLoS Medicine  2014;11(7):e1001673.
In a pooled analysis of 20 prospective studies, Cari Kitahara and colleagues find that class III obesity (BMI of 40–59) is associated with excess rates of total mortality, particularly due to heart disease, cancer, and diabetes.
Please see later in the article for the Editors' Summary
Background
The prevalence of class III obesity (body mass index [BMI]≥40 kg/m2) has increased dramatically in several countries and currently affects 6% of adults in the US, with uncertain impact on the risks of illness and death. Using data from a large pooled study, we evaluated the risk of death, overall and due to a wide range of causes, and years of life expectancy lost associated with class III obesity.
Methods and Findings
In a pooled analysis of 20 prospective studies from the United States, Sweden, and Australia, we estimated sex- and age-adjusted total and cause-specific mortality rates (deaths per 100,000 persons per year) and multivariable-adjusted hazard ratios for adults, aged 19–83 y at baseline, classified as obese class III (BMI 40.0–59.9 kg/m2) compared with those classified as normal weight (BMI 18.5–24.9 kg/m2). Participants reporting ever smoking cigarettes or a history of chronic disease (heart disease, cancer, stroke, or emphysema) on baseline questionnaires were excluded. Among 9,564 class III obesity participants, mortality rates were 856.0 in men and 663.0 in women during the study period (1976–2009). Among 304,011 normal-weight participants, rates were 346.7 and 280.5 in men and women, respectively. Deaths from heart disease contributed largely to the excess rates in the class III obesity group (rate differences = 238.9 and 132.8 in men and women, respectively), followed by deaths from cancer (rate differences = 36.7 and 62.3 in men and women, respectively) and diabetes (rate differences = 51.2 and 29.2 in men and women, respectively). Within the class III obesity range, multivariable-adjusted hazard ratios for total deaths and deaths due to heart disease, cancer, diabetes, nephritis/nephrotic syndrome/nephrosis, chronic lower respiratory disease, and influenza/pneumonia increased with increasing BMI. Compared with normal-weight BMI, a BMI of 40–44.9, 45–49.9, 50–54.9, and 55–59.9 kg/m2 was associated with an estimated 6.5 (95% CI: 5.7–7.3), 8.9 (95% CI: 7.4–10.4), 9.8 (95% CI: 7.4–12.2), and 13.7 (95% CI: 10.5–16.9) y of life lost. A limitation was that BMI was mainly ascertained by self-report.
Conclusions
Class III obesity is associated with substantially elevated rates of total mortality, with most of the excess deaths due to heart disease, cancer, and diabetes, and major reductions in life expectancy compared with normal weight.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
The number of obese people (individuals with an excessive amount of body fat) is increasing rapidly in many countries. Worldwide, according to the Global Burden of Disease Study 2013, more than a third of all adults are now overweight or obese. Obesity is defined as having a body mass index (BMI, an indicator of body fat calculated by dividing a person's weight in kilograms by their height in meters squared) of more than 30 kg/m2 (a 183-cm [6-ft] tall man who weighs more than 100 kg [221 lbs] is obese). Compared to people with a healthy weight (a BMI between 18.5 and 24.9 kg/m2), overweight and obese individuals (who have a BMI between 25.0 and 29.9 kg/m2 and a BMI of 30 kg/m2 or more, respectively) have an increased risk of developing diabetes, heart disease, stroke, and some cancers, and tend to die younger. Because people become unhealthily fat by consuming food and drink that contains more energy (kilocalories) than they need for their daily activities, obesity can be prevented or treated by eating less food and by increasing physical activity.
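The BMI arithmetic in the paragraph above is easy to check directly; the sketch below computes BMI and applies the class boundaries the article describes (the classification function is an illustration, not part of the study's methods).

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters squared."""
    return weight_kg / height_m ** 2

def bmi_class(value: float) -> str:
    # Boundaries as described in the article (18.5 / 25 / 30 / 40 kg/m^2).
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal weight"
    if value < 30:
        return "overweight"
    if value < 40:
        return "obese (class I-II)"
    return "obese class III"

# The article's example: a 183-cm man weighing just over 100 kg crosses BMI 30.
print(round(bmi(100.5, 1.83), 1))  # prints 30.0
```

Note that a 183-cm man at exactly 100 kg has a BMI of about 29.9, which is why the text says "more than 100 kg" for obesity.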
Why Was This Study Done?
Class III obesity (extreme, or morbid, obesity), which is defined as a BMI of more than 40 kg/m2, is emerging as a major public health problem in several high-income countries. In the US, for example, 6% of adults are now morbidly obese. Because extreme obesity used to be relatively uncommon, little is known about the burden of disease, including total and cause-specific mortality (death) rates, among individuals with class III obesity. Before we can prevent and treat class III obesity effectively, we need a better understanding of the health risks associated with this condition. In this pooled analysis of prospective cohort studies, the researchers evaluate the risk of total and cause-specific death and the years of life lost associated with class III obesity. A pooled analysis analyzes the data from several studies as if the data came from one large study; prospective cohort studies record the characteristics of a group of participants at baseline and follow them to see which individuals develop a specific condition.
What Did the Researchers Do and Find?
The researchers included 20 prospective (mainly US) cohort studies from the National Cancer Institute Cohort Consortium (a partnership that studies cancer by undertaking large-scale collaborations) in their pooled analysis. After excluding individuals who had ever smoked and people with a history of chronic disease, the analysis included 9,564 adults who were classified as class III obese based on self-reported height and weight at baseline and 304,011 normal-weight adults. Among the participants with class III obesity, mortality rates (deaths per 100,000 persons per year) during the 30-year study period were 856.0 and 663.0 in men and women, respectively, whereas the mortality rates among normal-weight men and women were 346.7 and 280.5, respectively. Heart disease was the major contributor to the excess death rate among individuals with class III obesity, followed by cancer and diabetes. Statistical analyses of the pooled data indicate that the risk of all-cause death and death due to heart disease, cancer, diabetes, and several other diseases increased with increasing BMI. Finally, compared with having a normal weight, having a BMI between 40 and 59 kg/m2 resulted in an estimated loss of 6.5 to 13.7 years of life.
What Do These Findings Mean?
These findings indicate that class III obesity is associated with a substantially increased rate of death. Notably, this death rate increase is similar to the increase associated with smoking among normal-weight people. The findings also suggest that heart disease, cancer, and diabetes are responsible for most of the excess deaths among people with class III obesity and that having class III obesity results in major reductions in life expectancy. Importantly, the number of years of life lost continues to increase for BMI values above 50 kg/m2, and beyond this point, the loss of life expectancy exceeds that associated with smoking among normal-weight people. The accuracy of these findings is limited by the use of self-reported height and weight measurements to calculate BMI and by the use of BMI as the sole measure of obesity. Moreover, these findings may not be generalizable to all populations. Nevertheless, these findings highlight the need to develop more effective interventions to combat the growing public health problem of class III obesity.
Additional Information
Please access these websites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1001673.
The US Centers for Disease Control and Prevention provides information on all aspects of overweight and obesity (in English and Spanish)
The World Health Organization provides information on obesity (in several languages); Malri's story describes the health risks faced by an obese child
The UK National Health Service Choices website provides information about obesity, including a personal story about losing weight
The Global Burden of Disease Study website provides the latest details about global obesity trends
The US Department of Agriculture's ChooseMyPlate.gov website provides a personal healthy eating plan; the Weight-Control Information Network is an information service provided for the general public and health professionals by the US National Institute of Diabetes and Digestive and Kidney Diseases (in English and Spanish)
MedlinePlus provides links to other sources of information on obesity (in English and Spanish)
doi:10.1371/journal.pmed.1001673
PMCID: PMC4087039  PMID: 25003901
19.  Mortality Patterns in the West Bank, Palestinian Territories, 1999-2003 
Preventing Chronic Disease  2008;5(4):A112.
Introduction
The West Bank in the Palestinian Territories is undergoing an epidemiologic transition. We provide a general description of mortality from all causes, focusing on chronic disease mortality in adults.
Methods
Mortality data analyzed for our study were obtained from the Palestinian Ministry of Health in the West Bank for 1999 through 2003. Individual information was obtained from death notification forms.
Results
A total of 27,065 deaths were reported for 1999 through 2003 in the West Bank, Palestinian Territories. Circulatory diseases were the main cause of death (45%), followed by cancer (10%) and unintentional injuries (7%). Among men, the highest age-standardized mortality rates (ASMRs) were due to diseases of the circulatory system, cancer, and unintentional injuries. Among women, the highest ASMRs were due to circulatory disease, cancer, and diabetes mellitus. Of the circulatory diseases, the highest ASMRs for men were due to acute myocardial infarction and cerebrovascular disease. ASMRs attributable to circulatory system diseases were similar for women. Lung cancer was the largest cause of cancer mortality for men; breast cancer was the largest cause for women.
Conclusion
Because of the high mortality rates, the risk factors associated with chronic diseases in the Palestinian Territories must be ascertained. Medical and public health policies and interventions need to be reassessed, giving due attention to this rise in modern-day diseases in this area.
PMCID: PMC2578778  PMID: 18793500
20.  Secondary prevention of ischaemic cardiac events 
Clinical Evidence  2011;2011:0206.
Introduction
Coronary artery disease is the leading cause of mortality in resource-rich countries, and is becoming a major cause of morbidity and mortality in resource-poor countries. Secondary prevention in this context is long-term treatment to prevent recurrent cardiac morbidity and mortality in people who have had either a prior acute myocardial infarction (MI) or acute coronary syndrome, or who are at high risk due to severe coronary artery stenoses or prior coronary surgical procedures. Secondary prevention in people with an acute MI or acute coronary syndrome within the past 6 months is not included.
Methods and outcomes
We conducted a systematic review and aimed to answer the following clinical questions: What are the effects of antithrombotic treatment; other drug treatments; cholesterol reduction; blood pressure reduction; non-drug treatments; and revascularisation procedures? We searched: Medline, Embase, The Cochrane Library, and other important databases up to May 2010 (Clinical Evidence reviews are updated periodically, please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA).
Results
We found 137 systematic reviews or RCTs that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions.
Conclusions
In this systematic review, we present information relating to the effectiveness and safety of the following interventions: advice to eat less fat, advice to eat more fibre, advice to increase consumption of fish oils, amiodarone, angiotensin-converting enzyme (ACE) inhibitors, angiotensin II receptor blockers, angiotensin II receptor blockers plus ACE inhibitors, antioxidant vitamin combinations, antiplatelet agents, aspirin, beta-blockers, beta-carotene, blood pressure reduction, calcium channel blockers, cardiac rehabilitation including exercise, class I antiarrhythmic agents, coronary artery bypass grafting (CABG), fibrates, hormone replacement therapy (HRT), Mediterranean diet, multivitamins, non-specific cholesterol reduction, oral anticoagulants, oral glycoprotein IIb/IIIa receptor inhibitors, percutaneous coronary intervention (PCI), psychosocial treatment, smoking cessation, statins, vitamin C, and vitamin E.
Key Points
Coronary artery disease is the leading cause of mortality in resource-rich countries, and is becoming a major cause of morbidity and mortality in resource-poor countries. Secondary prevention in this context is long-term treatment to prevent recurrent cardiac morbidity and mortality in people who have had either a prior MI or acute coronary syndrome, or who are at high risk due to severe coronary artery stenoses or prior coronary surgical procedures.
Of the antithrombotic treatments, there is good evidence that aspirin (especially combined with clopidogrel in people with acute coronary syndromes or MI), clopidogrel (more effective than aspirin), and anticoagulants all effectively reduce the risk of cardiovascular events. Oral anticoagulants substantially increase the risk of haemorrhage. These risks may outweigh the benefits when combined with antiplatelet treatments. Adding oral glycoprotein IIb/IIIa receptor inhibitors to aspirin seems to increase the risk of mortality compared with aspirin alone.
Other drug treatments that reduce mortality include beta-blockers (after MI and in people with left ventricular dysfunction), ACE inhibitors (in people at high risk, after MI, or with left ventricular dysfunction), and amiodarone (in people with MI and high risk of death from cardiac arrhythmia). There is conflicting evidence on the effect of calcium channel blockers. Some types may be effective at reducing mortality in the absence of heart failure, whereas others may be harmful. Contrary to decades of large observational studies, multiple RCTs show no cardiac benefit from HRT in postmenopausal women.
Lipid-lowering treatments effectively reduce the risk of cardiovascular mortality and non-fatal cardiovascular events in people with CHD.
There is good evidence that statins reduce the risk of mortality and cardiac events in people at high risk, but the evidence is less clear for fibrates.
The magnitude of cardiovascular risk reduction in people with coronary artery disease correlates directly with the magnitude of blood pressure reduction.
Cardiac rehabilitation (including exercise) and smoking cessation reduce the risk of cardiac events in people with CHD. Antioxidant vitamins (such as vitamin E, beta-carotene, or vitamin C) have no effect on cardiovascular events in high-risk people, and in some cases may actually increase risk of cardiac mortality. We don't know whether changing diet alters the risk of cardiac episodes, although a Mediterranean diet may have some survival benefit over a Western diet. Advice to increase fish oil consumption, or fish oil consumption itself, may be beneficial in some population groups; however, the evidence was weak. Some psychological interventions may be more effective than usual care at improving some cardiovascular outcomes; however, the evidence was inconsistent.
In selected people, such as those with more-extensive coronary disease and impaired left ventricular function, CABG may improve survival compared with an initial strategy of medical treatment. We don't know how PTCA compares with medical treatment.
We found no consistent difference in mortality or recurrent MI between CABG and PTCA with or without stenting, because of varied results among subgroups and insufficient evidence on stenting when comparing the interventions. CABG may be more effective than PTCA with or without stenting at reducing some composite outcomes, particularly those including repeat revascularisation rates. PTCA with stenting may be more effective than PTCA alone.
PMCID: PMC3217663  PMID: 21875445
21.  Continuing quality improvement procedures for a clinical PACS 
Journal of Digital Imaging  1998;11(Suppl 1):111-114.
The University of California at San Francisco (UCSF) Department of Radiology currently has a clinically operational picture archiving and communication system (PACS) that is thirty-five percent filmless, with the goal of becoming seventy-five percent filmless within the year. The design and implementation of the clinical PACS has been a collaborative effort between an academic research laboratory and a commercial vendor partner. Images are digitally acquired from three computed radiography (CR) scanners, five computed tomography (CT) scanners, five magnetic resonance (MR) imagers, three digital fluoroscopic rooms, an ultrasound mini-PACS, and a nuclear medicine mini-PACS. The DICOM (Digital Imaging and Communications in Medicine) standard communications protocol and image format is adhered to throughout the PACS. Images are archived in hierarchical staged fashion, on a RAID (redundant array of inexpensive disks) and on magneto-optical disk jukeboxes. The clinical PACS uses an object-oriented Oracle SQL (Structured Query Language) database, and interfaces to the Radiology Information System using the HL7 (Health Level 7) standard. Components are networked using a combination of switched and fast ethernet, and ATM (asynchronous transfer mode), all over fiber optics. The wide area network links six UCSF sites in San Francisco. A combination of high- and medium-resolution dual-monitor display stations have been placed throughout the Department of Radiology, the Emergency Department (ED), and Intensive Care Units (ICU). A continuing quality improvement (CQI) committee has been formed to facilitate the PACS installation and training, workflow modifications, quality assurance, and clinical acceptance. This committee includes radiologists at all levels (resident, fellow, attending), radiology technologists, film library personnel, ED and ICU clinician end-users, and PACS team members.
The CQI committee has proved vital in the creation of new management procedures, providing a means for user feedback and education, and contributing to the overall acceptance of, and user satisfaction with, the system. Well-developed CQI procedures have been essential to the successful clinical operation of the PACS as UCSF Radiology moves toward a filmless department.
doi:10.1007/BF03168275
PMCID: PMC3453403  PMID: 9735446
PACS; continuing quality improvement (CQI); quality assurance (QA); filmless
22.  Secondary prevention of ischaemic cardiac events 
Clinical Evidence  2009;2009:0206.
Introduction
Coronary artery disease is the leading cause of mortality in resource-rich countries, and is becoming a major cause of morbidity and mortality in resource-poor countries. Secondary prevention in this context is long-term treatment to prevent recurrent cardiac morbidity and mortality in people who have had either a prior acute myocardial infarction (MI) or acute coronary syndrome, or who are at high risk due to severe coronary artery stenoses or prior coronary surgical procedures.
Methods and outcomes
We conducted a systematic review and aimed to answer the following clinical questions: What are the effects of antithrombotic treatment; other drug treatments; cholesterol reduction; blood pressure reduction; non-drug treatments; and revascularisation procedures? We searched: Medline, Embase, The Cochrane Library, and other important databases up to October 2007 (Clinical Evidence reviews are updated periodically, please check our website for the most up-to-date version of this review). We included harms alerts from relevant organisations such as the US Food and Drug Administration (FDA) and the UK Medicines and Healthcare products Regulatory Agency (MHRA).
Results
We found 154 systematic reviews or RCTs that met our inclusion criteria. We performed a GRADE evaluation of the quality of evidence for interventions.
Conclusions
In this systematic review, we present information relating to the effectiveness and safety of the following interventions: advice to eat less fat; advice to eat more fibre; advice to increase consumption of fish oils; amiodarone; angiotensin-converting enzyme (ACE) inhibitors; angiotensin II receptor blockers; angiotensin II receptor blockers plus ACE inhibitors; antioxidant vitamin combinations; antiplatelet agents; beta-blockers; beta-carotene; blood pressure reduction; calcium channel blockers; cardiac rehabilitation including exercise; class I antiarrhythmic agents; coronary artery bypass grafting (CABG); percutaneous coronary intervention (PCI); fibrates; hormone replacement therapy (HRT); Mediterranean diet; multivitamins; non-specific cholesterol reduction; oral anticoagulants; oral glycoprotein IIb/IIIa receptor inhibitors; psychosocial treatment; smoking cessation; statins; vitamin C; and vitamin E.
Key Points
Coronary artery disease is the leading cause of mortality in resource-rich countries, and is becoming a major cause of morbidity and mortality in resource-poor countries. Secondary prevention in this context is long-term treatment to prevent recurrent cardiac morbidity and mortality in people who have had either a prior MI or acute coronary syndrome, or who are at high risk due to severe coronary artery stenoses or prior coronary surgical procedures.
Of the antithrombotic treatments, there is good evidence that aspirin (especially combined with clopidogrel in people with acute coronary syndromes or MI), clopidogrel (more effective than aspirin), and anticoagulants all effectively reduce the risk of cardiovascular events. Oral anticoagulants substantially increase the risk of haemorrhage. These risks may outweigh the benefits when combined with antiplatelet treatments. Adding oral glycoprotein IIb/IIIa receptor inhibitors to aspirin seems to increase the risk of mortality compared with aspirin alone.
Other drug treatments that reduce mortality include beta-blockers (after MI and in people with left ventricular dysfunction), ACE inhibitors (in people at high risk, after MI, or with left ventricular dysfunction), angiotensin II receptor blockers (in people with coronary artery disease), and amiodarone (in people with MI and high risk of death from cardiac arrhythmia). There is conflicting evidence on the effect of calcium channel blockers. Some types may be effective at reducing mortality in the absence of heart failure, whereas others may be harmful. Contrary to decades of large observational studies, multiple RCTs show no cardiac benefit from HRT in postmenopausal women.
Lipid-lowering treatments effectively reduce the risk of cardiovascular mortality and non-fatal cardiovascular events in people with CHD.
There is good evidence that statins reduce the risk of mortality and cardiac events in people at high risk, but the evidence is less clear for fibrates.
The magnitude of cardiovascular risk reduction in people with coronary artery disease correlates directly with the magnitude of blood pressure reduction.
Cardiac rehabilitation (including exercise) and smoking cessation reduce the risk of cardiac events in people with CHD. Antioxidant vitamins (such as vitamin E, beta-carotene, or vitamin C) have no effect on cardiovascular events in high-risk people, and in some cases may actually increase risk of cardiac mortality. We don't know whether changing diet alters the risk of cardiac episodes, although a Mediterranean diet may have some survival benefit over a Western diet.
In selected people, such as those with more-extensive coronary disease and impaired left ventricular function, CABG may improve survival compared with an initial strategy of medical treatment. We don't know how PTCA compares with medical treatment.
We found no consistent difference in mortality or recurrent MI between CABG and PTCA with or without stenting, due to varied results among subgroups and insufficient evidence on stenting when comparing the interventions. PTCA with stenting may be more effective than PTCA alone.
PMCID: PMC2907785
23.  A community approach to addressing excess breast and cervical cancer mortality among women of African descent in Boston. 
Public Health Reports  2003;118(4):338-347.
In 2000, the REACH Boston 2010 Breast and Cervical Cancer Coalition conducted a community needs assessment and found several factors that may have contributed to disproportionately high breast and cervical cancer mortality among black women: (a) Focus group participants reported that many women in their communities had limited awareness about risk factors for cancer as well as about screening. (b) Black women experienced barriers to care related to the cultural competence of providers and of institutions. (c) Black women were not receiving adequate follow-up for abnormal mammograms and Pap smears. The Coalition's Community Action Plan to address disparities includes a model primary care service for black women; scholarships to increase the number of black mammogram technologists; primary care provider and radiology technologist training about disparities and cultural competence; and education to increase awareness among black women and to increase leadership and advocacy skills.
PMCID: PMC1497561  PMID: 12815081
24.  ORGAN-SPECIFIC EXTERNAL DOSE COEFFICIENTS AND PROTECTIVE APRON TRANSMISSION FACTORS FOR HISTORICAL DOSE RECONSTRUCTION FOR MEDICAL PERSONNEL 
Health physics  2011;101(1):13-27.
While radiation absorbed dose (Gy) to the skin or other organs is sometimes estimated for patients from diagnostic radiologic examinations or therapeutic procedures, rarely is occupationally-received radiation absorbed dose to individual organs/tissues estimated for medical personnel, e.g., radiologic technologists or radiologists. Generally, for medical personnel, equivalent or effective radiation doses are estimated for compliance purposes. In the very few cases when organ doses to medical personnel are reconstructed, the data are usually for the purpose of epidemiologic studies, e.g., a study of historical doses and risks to a cohort of about 110,000 radiologic technologists presently underway at the U.S. National Cancer Institute. While ICRP and ICRU have published organ-specific external dose conversion coefficients (DCCs), i.e., absorbed dose to organs and tissues per unit air kerma and dose equivalent per unit air kerma, those factors have been primarily published for mono-energetic photons at selected energies. This presents two related problems for historical dose reconstruction, both of which are addressed here. It is necessary to derive conversion factor values for (i) continuous distributions of energy typical of diagnostic medical x rays (bremsstrahlung radiation), and (ii) energies of particular radioisotopes used in medical procedures, neither of which are presented in published tables. For derivation of DCCs for bremsstrahlung radiation, combinations of x-ray tube potentials and filtrations were derived for different time periods based on a review of relevant literature. Three peak tube potentials (70 kV, 80 kV, and 90 kV) with four different amounts of beam filtration were determined to be applicable for historic dose reconstruction. The probabilities of these machine settings were assigned to each of the four time periods (earlier than 1949, 1949-1954, 1955-1968, and after 1968).
Continuous functions were fit to each set of discrete values of the ICRP/ICRU mono-energetic DCCs and the functions integrated over the air-kerma weighted photon fluence of the 12 defined x-ray spectra. The air kerma-weighted DCCs in this work were developed specifically for an irradiation geometry of anterior to posterior (AP) and for the following tissues: thyroid, breast, ovary, lens of eye, lung, colon, testes, heart, skin (anterior side only), red bone marrow (RBM), and brain. In addition, a series of functional relationships to predict DT per Ka values (organ dose per unit air kerma) for RBM dependent on body mass index [BMI (kg m⁻²) ≡ weight/height²] and average photon energy were derived from a published analysis. Factors to account for attenuation of radiation by protective lead aprons were also developed. Because lead protective aprons often worn by radiology personnel not only reduce the intensity of x-ray exposure but also appreciably harden the transmitted fluence of bremsstrahlung x rays, DCCs were separately calculated for organs possibly protected by lead aprons by considering three cases: no apron, 0.25 mm Pb apron, and 0.5 mm Pb apron. For estimation of organ doses from conducting procedures with radioisotopes, continuous functions of the reported mono-energetic values were developed and DCCs were derived by evaluating the function at relevant energies. By considering the temporal changes in primary exposure-related parameters, e.g., energy distribution, the derived DCCs and transmission factors presented here allow for more realistic historical dose reconstructions for medical personnel when monitoring badge readings are the primary data on which estimation of an individual's organ doses are based.
doi:10.1097/HP.0b013e318204a60a
PMCID: PMC3964780  PMID: 21617389
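The spectrum-weighting step described in the abstract above — interpolating tabulated mono-energetic DCCs and averaging them over an air-kerma-weighted photon fluence spectrum — can be sketched numerically. This is a minimal illustration, not the paper's method: the DCC table values, the triangular 80-kVp-like spectrum shape, and the function name are all hypothetical.

```python
import numpy as np

# Hypothetical mono-energetic DCC values (organ dose per unit air kerma)
# at selected photon energies (keV); real values come from ICRP/ICRU tables.
energies_kev = np.array([20.0, 30.0, 40.0, 50.0, 60.0, 80.0, 100.0])
dcc_mono = np.array([0.05, 0.20, 0.45, 0.70, 0.90, 1.10, 1.20])

def spectrum_weighted_dcc(spec_energies, spec_weights):
    """Average the tabulated mono-energetic DCCs over an air-kerma-weighted
    photon fluence spectrum (both arrays sampled on the same energy grid)."""
    # Interpolate the discrete DCC table onto the spectrum's energy grid
    dcc_interp = np.interp(spec_energies, energies_kev, dcc_mono)
    # Fluence-weighted average over the spectrum
    return float(np.sum(dcc_interp * spec_weights) / np.sum(spec_weights))

# Toy bremsstrahlung shape for an 80-kVp beam (illustrative only)
e = np.linspace(20.0, 80.0, 61)
w = (e - 20.0) * (80.0 - e)
dcc_80kvp = spectrum_weighted_dcc(e, w)
```

A continuous spectrum pulls the effective DCC away from any single tabulated value, which is why the paper derives separate coefficients for each tube-potential/filtration combination rather than reusing mono-energetic entries.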
25.  Workflow Optimization: Current Trends and Future Directions  
Journal of Digital Imaging  2002;15(3):141-152.
In an attempt to maximize productivity within the medical imaging department, increasing importance and attention is being placed on workflow. Workflow is the process of analyzing individual steps that occur during a single event, such as the performance of an MRI exam. The primary focus of workflow optimization within the imaging department is automation and task consolidation; however, a number of other factors should be considered, including the stochastic nature of the workload, availability of human resources, and the specific technologies being employed. The purpose of this paper is to characterize the complex relationship that exists between information technology and the radiologic technologist, in an attempt to determine how workflow can be optimized to improve technologist productivity. This relationship takes on greater importance as more imaging departments undergo the transition from film-based to filmless operation. A nationwide survey was conducted to compare technologist workflow in film-based and filmless operations, for all imaging modalities. The individual tasks performed by technologists were defined, along with the amount of time allocated to these tasks. The index of workflow efficiency was defined as the percentage of overall technologist time allocated to image acquisition, since this is the primary responsibility of the radiologic technologist. Preliminary analysis indicates technologist workflow in filmless operation is enhanced when compared with film-based operation, for all imaging modalities. The specific tasks requiring less technologist time in filmless operation are data access and retakes (due to both technical factors and lost exams). Surprisingly, no significant differences were reported for the task of image processing, when comparing technologist workflow in film-based and filmless operations.
Additional research is planned to evaluate the potential workflow gains achievable through workflow optimization software, improved systems integration, and automation of advanced image processing techniques.
doi:10.1007/s10278-002-0022-7
PMCID: PMC3613260  PMID: 12481228
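The efficiency index described in the abstract above reduces to a simple time-fraction: technologist time spent on image acquisition divided by total time across all tasks. A minimal sketch follows; the task categories and per-exam timings are hypothetical, not figures from the survey.

```python
# Hypothetical per-exam task timings (minutes) for one technologist;
# the categories and numbers are illustrative, not from the survey.
task_minutes = {
    "image acquisition": 12.0,
    "data access": 3.0,
    "image processing": 4.0,
    "retakes": 1.0,
}

def workflow_efficiency(tasks):
    """Index of workflow efficiency: fraction of total technologist
    time allocated to image acquisition."""
    return tasks["image acquisition"] / sum(tasks.values())

efficiency = workflow_efficiency(task_minutes)  # 12 / 20 = 0.6
```

Under this metric, any reduction in non-acquisition time (e.g., faster data access in a filmless department) raises the index even when acquisition time itself is unchanged.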
