Treatment-related stomach cancer is an important cause of morbidity and mortality among the growing number of Hodgkin lymphoma (HL) survivors, but risks associated with specific HL treatments are unclear.
Patients and Methods
We conducted an international case-control study of stomach cancer nested in a cohort of 19,882 HL survivors diagnosed from 1953 to 2003, including 89 cases and 190 matched controls. For each patient, we quantified cumulative doses of specific alkylating agents (AAs) and reconstructed radiation dose to the stomach tumor location.
Stomach cancer risk increased with increasing radiation dose to the stomach (Ptrend < .001) and with increasing number of AA-containing chemotherapy cycles (Ptrend = .02). Patients who received both radiation to the stomach ≥ 25 Gy and high-dose procarbazine (≥ 5,600 mg/m2) had strikingly elevated stomach cancer risk (25 cases, two controls; odds ratio [OR], 77.5; 95% CI, 14.7 to 1452) compared with those who received radiation < 25 Gy and procarbazine < 5,600 mg/m2 (Pinteraction < .001). Risk was also elevated (OR, 2.8; 95% CI, 1.3 to 6.4) among patients who received radiation to the stomach ≥ 25 Gy but procarbazine < 5,600 mg/m2; however, no procarbazine-related risk was evident with radiation < 25 Gy. Treatment with dacarbazine also increased stomach cancer risk (12 cases, nine controls; OR, 8.8; 95% CI, 2.1 to 46.6), after adjustment for radiation and procarbazine doses.
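The odds ratios above come from conditional logistic regression on the matched case-control sets; as a simpler illustration of how an unadjusted odds ratio and Wald confidence interval fall out of a 2×2 table, consider the sketch below. The exposed cells echo the 25 cases and two controls quoted above, but the unexposed cells are hypothetical placeholders, so the numbers are illustrative only and will not reproduce the reported matched estimate of 77.5.

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Unadjusted odds ratio for a 2x2 case-control table:
         a = exposed cases,   b = exposed controls,
         c = unexposed cases, d = unexposed controls.
    Returns (OR, lower, upper) with a Wald interval on the log scale."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_,
            or_ * math.exp(-z * se_log_or),
            or_ * math.exp(z * se_log_or))

# Exposed cells (25, 2) are taken from the abstract; the unexposed
# cells (30, 90) are hypothetical, for illustration only.
print(odds_ratio_wald(25, 2, 30, 90))
```

Because the study's design matches controls to cases, a crude table like this discards the matching; conditional methods are what produce the published estimates.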
Patients with HL who received subdiaphragmatic radiotherapy had dose-dependent increased risk of stomach cancer, with marked risks for patients who also received chemotherapy containing high-dose procarbazine. For current patients, risks and benefits of exposure to both procarbazine and subdiaphragmatic radiotherapy should be weighed carefully. For patients treated previously, GI symptoms should be evaluated promptly.
To assess the dose-response relationship for stomach cancer following radiotherapy for cervical cancer.
Methods and Materials
We conducted a nested, matched case-control study of 201 cases and 378 controls among 53,547 5-year survivors of cervical cancer diagnosed from 1943 to 1995 in five international, population-based cancer registries. We estimated individual radiation doses to the site of the stomach cancer for all cases and to corresponding sites for the matched controls (overall mean stomach tumor dose, 2.56 gray [Gy]; range, 0.03–46.1 Gy; following parallel opposed pelvic fields, mean dose 1.63 Gy; range, 0.12–6.3 Gy).
Over 90% of women received radiotherapy, mostly external beam therapy in combination with brachytherapy. Stomach cancer risk was non-significantly increased (odds ratios [ORs], 1.27–2.28) for women receiving 0.5 to 4.9 Gy to the stomach cancer site and significantly increased at doses ≥5 Gy (OR = 4.20; 95% confidence interval, 1.41–13.4; Ptrend = 0.047) compared with non-irradiated women. A highly significant radiation dose-response relationship was evident when analyses were restricted to the 131 cases (251 controls) whose stomach cancer was located in the middle or lower portion of the stomach (Ptrend = 0.003), whereas there was no indication of increasing risk with increasing dose for the 30 cases (57 controls) whose cancer was located in the upper stomach (Ptrend = 0.23).
Our findings showed for the first time a significant linear dose-response relationship for risk of stomach cancer in long-term survivors of cervical cancer.
cervical cancer; stomach cancer; radiotherapy; case-control; second primary cancer
Epidemiologic studies of medical radiation workers have found excess risks of leukemia and of skin and female breast cancer in those employed before 1950, but little consistent evidence of increased cancer risks subsequently. Data on occupational radiation dose-response, risk estimates for recent years, and lifetime cancer risks are limited for radiologists and radiologic technologists and are lacking for physicians and technologists performing or assisting with fluoroscopically-guided procedures. Based on data from 80 mostly small studies of cardiologists and substantially fewer studies of physicians in other specialties, estimated effective doses to physicians per interventional procedure vary by more than an order of magnitude. There is an urgent need to expand the limited base of information on average annual occupational radiation exposures and time-trends in doses received by medical radiation workers, to assess lifetime cancer risks of radiologists and radiologic technologists in the existing cohorts, and to initiate long-term follow-up studies of cancer and other radiation-associated disease risks in physicians and technologists performing or assisting with interventional procedures. Such studies will help to optimize standardized protocols for radiologic procedures, determine whether current radiation protection measures are adequate, provide guidance on cancer screening needs, and yield valuable insights on cancer risks associated with chronic radiation exposure.
radiologists; interventional radiologists; radiologic technologists; interventional cardiologists; neoplasms; reviews
Although it is a rare cancer, retinoblastoma has served as an important model in our understanding of genetic cancer syndromes. All patients with a germinal RB1 mutation are at risk of developing second malignancies. Approximately 40–50% of all retinoblastoma cases are considered germinal cases, and recent work indicates that nearly all retinoblastoma patients probably demonstrate some degree of mosaicism for the RB1 mutation and are thus at risk of secondary malignancies. The risk of developing these cancers continues throughout the patients' lives, because loss of functional RB1 protein removes its critical tumor-suppressive function in all cells. These cancers can develop in diverse anatomic locations, including the skull and long bones, soft tissues, nasal cavity, skin, orbit, brain, breast, and lung. Treatments used for retinoblastoma, such as external-beam radiation and chemotherapy, can have a significant impact on the risk for and pattern of development of these secondary cancers. Second malignancies are the leading cause of death in germinal retinoblastoma survivors in the USA and thus continue to be an important subject of study in this patient population. Second malignancies following the germinal form of retinoblastoma are the subject of this review.
cancer risk; external-beam radiation; germinal hereditary; pinealoblastoma; radiation-induced neoplasm; retinoblastoma; sarcoma; second malignancy; survivor
Rapid innovation in radiotherapy techniques has resulted in an urgent need for risk projection models for second cancer risks from high-dose radiation exposure, since direct observation of the late effects of newer treatments will require patient follow-up for a decade or more. However, the patterns of cancer risk after fractionated high-dose radiation are much less well understood than those after lower-dose exposures (0.1–5 gray [Gy]). In particular, there is uncertainty about the shape of the dose-response curve at high doses and the magnitude of the second cancer risk per unit dose. We reviewed the available evidence from epidemiologic studies of second solid cancers in organs that received high-dose exposure (>5 Gy) from radiotherapy, where dose-response curves were estimated from individual organ-specific doses. We included 28 eligible studies with 3,434 second cancer patients across 11 second solid cancers. Overall, there was little evidence that the dose-response curve was non-linear in the direction of a down-turn in risk, even at organ doses of ≥60 Gy. Thyroid cancer was the only exception, with evidence of a down-turn after 20 Gy. Generally, the excess relative risk per Gy, taking account of age and sex, was 5–10 times lower than the risk from acute exposures of <2 Gy among the Japanese atomic bomb survivors. However, the magnitude of the reduction in risk varied by second cancer. The results from our review provide insights into radiation carcinogenesis from fractionated high-dose exposures and are generally consistent with current theoretical models. The results can be used to refine the development of second solid cancer risk projection models for novel radiotherapy techniques.
The aim of the current study was to investigate the pattern of cancer screening behavior in adult retinoblastoma survivors, who are at high risk of developing second cancers.
Self-reported cancer screening practices were investigated in a cohort of retinoblastoma survivors to evaluate whether they were receiving adequate screening for specific cancers and to compare these rates with those of other adult survivors of childhood cancer and the general population. The prevalence of breast self-examination, clinical breast examination, mammography, Papanicolaou (Pap) testing, testicular self-examination, and magnetic resonance imaging (MRI) or computed tomography (CT) scanning was determined from computer-aided telephone interviews with 836 retinoblastoma survivors aged >18 years.
Among female survivors, 87% had a Pap test within the past 2 years, and 76% of females aged >40 years reported having a mammogram within the past 2 years; 17.4% of male survivors performed monthly testicular self-examinations. A significantly higher proportion of hereditary compared with nonhereditary survivors reported having undergone an MRI or CT scan in the past 5 years. Higher education, greater contact with the medical care system, and having a second cancer were positively associated with most screening practices. Cancer screening practices reported by retinoblastoma survivors were similar to national screening rates for breast, cervical, and testicular cancer.
To the authors' knowledge, the current study provides the first report of cancer screening practices of retinoblastoma survivors. Survivors of hereditary retinoblastoma should be encouraged to maintain, if not increase, their current screening practices to ensure early detection of second cancers in this high-risk population.
Little is known about the validity of self-recorded sun exposure and time spent outdoors for epidemiological research. The aim of the current study was to assess how well participants' self-recorded time outdoors compared with objective measurements of personal UVR dose. We enrolled 124 volunteers aged 40 and above who were identified from targeted subgroups of US radiologic technologists. Each volunteer was instructed to wear a polysulfone (PS) dosimeter on the left shoulder to measure UVR and to complete a daily activity diary, listing all activities undertaken in each 30-min interval between 9:00 A.M. and 5:00 P.M. during a 7-day period. In a linear regression model, each hour of self-recorded daily time spent outdoors was associated with an 8.2% (95% CI: 7.3–9.2%) increase in personal UVR exposure. Self-recorded total daily time spent outdoors was better correlated with the personal daily UVR dose for activities conducted near noon than for activities conducted in the morning or late afternoon, and for activities often performed in the sun (e.g. gardening or recreational activities) than for other outdoor activities (e.g. driving) in which the participant is usually shaded from the sun. Our results demonstrate a significant correlation between diary records of time spent outdoors and objective personal UVR dose measurements.
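An "8.2% increase per hour" of the kind reported above corresponds to a regression of log-transformed UVR dose on hours outdoors, with the slope converted to a percent change via 100·(e^β − 1). A minimal sketch on entirely synthetic diary data (the true slope is set near the reported value purely for illustration; this is not the study's data or model code):

```python
import math
import random
import statistics

def percent_change_per_hour(hours, log_doses):
    """OLS slope of log(dose) on hours outdoors, converted to a
    percent change per hour: 100 * (exp(beta) - 1)."""
    mh = statistics.fmean(hours)
    ml = statistics.fmean(log_doses)
    beta = (sum((h - mh) * (l - ml) for h, l in zip(hours, log_doses))
            / sum((h - mh) ** 2 for h in hours))
    return 100.0 * (math.exp(beta) - 1.0)

# Entirely synthetic diary data: the true slope (0.079 on the log
# scale, i.e. ~8.2% per hour) is chosen to mirror the reported figure.
rng = random.Random(0)
hours = [rng.uniform(0, 8) for _ in range(200)]
log_doses = [0.5 + 0.079 * h + rng.gauss(0, 0.05) for h in hours]
print(round(percent_change_per_hour(hours, log_doses), 1))
```

The log-scale slope is what makes the effect multiplicative: each additional hour scales the dose by the same factor rather than adding a fixed amount.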
Skin cancer studies depend on questionnaires to estimate exposure to ultraviolet light and subsequent risk, but these are limited by recall bias. We investigated the feasibility of developing a short checklist of outdoor activity categories that could improve sun exposure questionnaires for use in epidemiologic studies. We recruited 124 working and retired U.S. radiologic technologists (52% women). Each subject was instructed to complete a daily activity diary, listing main indoor and outdoor activities between 9:00 A.M. and 5:00 P.M. during a 7-day period. A total of 4697 diary entries were recorded; of the total 6944 h covered, 1408 h (21.1%) were spent outdoors. We were able to classify the outdoor activities into seven main categories: driving, yard work, home maintenance, walking or performing errands, water activities, other recreational or sports activities, and leisure activities or relaxing outside. These activities accounted for more than 94% of time spent outdoors for both working and retired men and women. Our data document the feasibility of, and provide guidance for, developing a short checklist of outdoor activities for use in epidemiologic questionnaires estimating sunlight exposure of working and retired indoor workers.
In the past 30 years, the numbers and types of fluoroscopically-guided (FG) procedures have increased dramatically. The objective of the present study is to provide estimated radiation doses to physician specialists, other than cardiologists, who perform FG procedures. We searched Medline to identify English-language journal articles reporting radiation exposures to these physicians. We then identified several primarily therapeutic FG procedures that met specific criteria: well-defined procedures for which there were at least five published reports of estimated radiation doses to the operator, procedures performed frequently in current medical practice, and inclusion of physicians from multiple medical specialties. These procedures were percutaneous nephrolithotomy (PCNL), vertebroplasty, orthopedic extremity nailing for treatment of fractures, biliary tract procedures, transjugular intrahepatic portosystemic shunt creation (TIPS), head/neck endovascular therapeutic procedures, and endoscopic retrograde cholangiopancreatography (ERCP). We abstracted radiation doses and other associated data, and estimated effective dose to operators. Operators received estimated doses per patient procedure equivalent to doses received by interventional cardiologists. The estimated effective dose per case ranged from 1.7–56 μSv for PCNL, 0.1–101 μSv for vertebroplasty, 2.5–88 μSv for orthopedic extremity nailing, 2.0–46 μSv for biliary tract procedures, 2.5–74 μSv for TIPS, 1.8–53 μSv for head/neck endovascular therapeutic procedures, and 0.2–49 μSv for ERCP. Overall, mean operator radiation dose per case measured over personal protective devices at different anatomic sites on the head and body ranged from 19–800 (median, 113) μSv at eye level, 6–1180 (median, 75) μSv at the neck, and 2–1600 (median, 302) μSv at the trunk. Operators' hands often received greater doses than the eyes, neck or trunk.
Large variations in operator doses suggest that optimizing procedure protocols and proper use of protective devices and shields might reduce occupational radiation dose substantially.
interventional procedure; fluoroscopically-guided procedure; occupational exposure; radiation protection
To assess the shape of the dose response for various cancer endpoints, and modifiers by age and time.
Methods and Materials
Re-analysis of the US peptic ulcer data testing for heterogeneity of radiogenic risk by cancer endpoint (stomach, pancreas, lung, leukemia, all other).
There are statistically significant (p < 0.05) excess risks for all cancers and for lung cancer, and borderline statistically significant risks for stomach cancer (p = 0.07) and leukemia (p = 0.06), with excess relative risks Gy⁻¹ of 0.024 (95% CI 0.011, 0.039), 0.559 (95% CI 0.221, 1.021), 0.042 (95% CI −0.002, 0.119), and 1.087 (95% CI −0.018, 4.925), respectively. There is a statistically significant (p = 0.007) excess risk of pancreatic cancer when adjustment is made for dose-response curvature. General downward curvature is apparent in the dose response, statistically significant (p < 0.05) for all cancers, pancreatic cancer, and all other cancers (than stomach, pancreas, lung, leukemia). There are indications of a reduction in risk with increasing age at exposure (for all cancers and pancreatic cancer), but no evidence of quadratic variation in relative risk with age at exposure. If a linear-exponential dose response is used, there is no significant heterogeneity in the dose response between the five endpoints considered, or in the speed of variation of relative risk with age at exposure. The risks are generally consistent with those observed in the Japanese atomic bomb survivors and in groups of nuclear workers.
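The linear and linear-exponential dose-response forms referred to above can be sketched as follows. The linear coefficient (0.559 Gy⁻¹, the lung cancer ERR from the abstract) is taken from the reported results, while the curvature parameter gamma is a hypothetical value chosen only to show how a negative exponential term produces downward curvature at high dose:

```python
import math

def rr_linear(dose_gy, err_per_gy):
    """Linear excess relative risk model: RR(D) = 1 + beta * D."""
    return 1.0 + err_per_gy * dose_gy

def rr_linear_exponential(dose_gy, err_per_gy, gamma):
    """Linear-exponential model allowing downward curvature at high
    dose: RR(D) = 1 + beta * D * exp(gamma * D); gamma < 0 produces
    the down-turn described in the analysis."""
    return 1.0 + err_per_gy * dose_gy * math.exp(gamma * dose_gy)

# beta = 0.559/Gy is the lung cancer ERR from the abstract;
# gamma = -0.1 is a hypothetical curvature value for illustration.
for dose in (1.0, 5.0, 10.0):
    print(dose, rr_linear(dose, 0.559),
          rr_linear_exponential(dose, 0.559, -0.1))
```

With gamma = 0 the two models coincide; the fitted curvature in the actual analysis is estimated from the data, not assumed.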
There are excess risks for various malignancies in this dataset. Generally there is marked downward curvature in the dose response, and significant reduction in relative risk with increasing age at exposure. The consistency of risks with those observed in the Japanese atomic bomb survivors, and in groups of nuclear workers, implies that there may be little sparing effect of fractionation of dose or low dose rate exposure.
stomach cancer; lung cancer; pancreatic cancer; leukemia
Retinoblastoma is the most common primary cancer of the eye in children. The incidence of second tumors in survivors of bilateral retinoblastoma, and in survivors of unilateral retinoblastoma who presumably carry a germline RB1 mutation, is well documented. This paper describes the previously unrecognized association of sinonasal adenocarcinoma as a second malignancy in retinoblastoma survivors. We present three patients who received radiation therapy as part of their treatment and developed sinonasal adenocarcinoma as a second malignancy. Sinonasal adenocarcinoma should be considered as a possible second malignancy in retinoblastoma survivors who present with vague sinus symptoms.
Retinoblastoma; Sinonasal adenocarcinoma; second malignancy
To assess the shape of the dose response for various circulatory disease endpoints, and modifiers by age and time since exposure.
Methods and Materials
Analysis of the US peptic ulcer data testing for heterogeneity of radiogenic risk by circulatory disease endpoint (ischemic heart, cerebrovascular, other circulatory disease).
There are significant excess risks for all circulatory disease, with an excess relative risk Gy⁻¹ of 0.082 (95% CI 0.031, 0.140), and for ischemic heart disease, with an excess relative risk Gy⁻¹ of 0.102 (95% CI 0.039, 0.174) (both p < 0.01), and indications of excess risk for stroke. There are no statistically significant (p > 0.2) differences between risks by endpoint, and few indications of curvature in the dose response. There are significant modifications of relative risk by time since exposure, the magnitude of which does not vary between endpoints (p > 0.2). Risk modifications are similar if analysis is restricted to those receiving radiation, although relative risks are slightly larger and the risk of stroke is no longer statistically significant. The slopes of the dose response are generally consistent with those observed in the Japanese atomic bomb survivors and in occupationally and medically exposed groups.
There are excess risks for a variety of circulatory diseases in this dataset, with significant modification of risk by time since exposure. The consistency of the dose-response slopes with those observed in radiotherapeutically-treated groups at much higher dose, as well as in lower-dose exposed cohorts such as the Japanese atomic bomb survivors and nuclear workers implies that there may be little sparing effects of fractionation of dose or low dose-rate exposure.
circulatory disease; ischemic heart disease; stroke; peptic ulcer; benign disease
Autosomal dominant conditions are known to be associated with advanced paternal age, and it has been suggested that retinoblastoma (Rb) also exhibits a paternal age effect because most new germline RB1 mutations are of paternal origin. To further our understanding of the association between parental age and risk of de novo germline RB1 mutations, we evaluated the effect of parental age in a cohort of Rb survivors in the United States. A cohort of 262 retinoblastoma patients was retrospectively identified at one institution, and telephone interviews were conducted with parents of 160 survivors (65.3%). We built two sets of hierarchical stepwise logistic regression models to detect increased odds of a de novo germline mutation related to older parental age compared with sporadic and familial Rb. The modeling strategy evaluated effects of maternal and paternal age as continuous variables and in five-year increments, adjusted for the age of the other parent. Mean maternal ages for patients with de novo germline mutations and sporadic Rb were similar (28.3 and 28.5 years, respectively), as were mean paternal ages (31.9 and 31.2 years, respectively), and all were significantly higher than the weighted general U.S. population means. In contrast, maternal and paternal ages for familial Rb did not differ significantly from the weighted U.S. general population means. Although we noted no significant differences in mean maternal and paternal ages between the three Rb classification groups, we found increased odds of having a de novo germline mutation for each five-year increase in paternal age, but these findings were not statistically significant (de novo versus sporadic ORs: age 30–34 = 1.65 [0.69–4.0], ≥35 = 1.34 [0.54–3.3]; de novo versus familial ORs: age 30–34 = 2.82 [0.95–8.4], ≥35 = 1.61 [0.57–4.6]). Our study suggests a weak paternal age effect for Rb resulting from de novo germline mutations, consistent with the paternal origin of most of these mutations.
The purpose of this study is to quantify cancer mortality in relationship to organ-specific radiation dose among women irradiated for benign gynecologic disorders. Included in this study are 12,955 women treated for benign gynecologic disorders at hospitals in the Northeastern U.S. between 1925 and 1965; 9,770 women treated by radiation and 3,186 women treated by other methods. The average age at treatment was 45.9 years (range, 13–88 years), and the average follow-up period was 30.1 years (maximum, 69.9 years). Radiation doses to organs and active bone marrow were reconstructed by medical physicists using original radiotherapy records. The highest doses were received by the uterine cervix (median, 120 Gy) and uterine corpus (median, 34 Gy), followed by the bladder, rectum and colon (median, 1.7–7.2 Gy), with other abdominal organs receiving median doses ≤1 Gy and organs in the chest and head receiving doses <0.1 Gy. Standardized mortality rate ratios relative to the general U.S. population were calculated. Radiation-related risks were estimated in internal analyses using Poisson regression models. Mortality was significantly elevated among irradiated women for cancers of the uterine corpus, ovary, bladder, rectum, colon and brain, as well as for leukemia (exclusive of chronic lymphocytic leukemia) but not for cancer of the cervix, Hodgkin or non-Hodgkin lymphoma, multiple myeloma, or chronic lymphocytic leukemia. Evidence of a dose-response was seen for cancers of the ovary [excess relative risk (ERR) 0.31/Gy, P < 0.001], bladder (ERR = 0.21/Gy, P = 0.02) and rectum (ERR = 0.23/Gy, P = 0.05) and suggested for colon (ERR = 0.09/Gy, P = 0.10), but not for cancers of the uterine corpus or brain nor for non-chronic lymphocytic leukemia. Relative risks of mortality due to cancers of the stomach, pancreas, liver and kidney were close to 1.0, with no evidence of dose-response over the range of 0–1.5 Gy. 
Breast cancer was not significantly associated with dose to the breast or ovary. Mortality due to cancers of heavily irradiated organs remained elevated up to 40 years after irradiation. Significantly elevated radiation-related risk was seen for cancers of organs proximal to the radiation source or fields (bladder, rectum and ovary), as well as for non-chronic lymphocytic leukemia. Our results corroborate those from previous studies that suggest that cells of the uterine cervix and lymphopoietic system are relatively resistant to the carcinogenic effects of radiation. Studies of women irradiated for benign gynecologic disorders, together with studies of women treated with higher doses of radiation for uterine cancers, provide quantitative information on cancer risks associated with a broad range of pelvic radiation exposures.
Five-year survival after childhood cancer now exceeds 70–80%. Radiation exposure in childhood is a known cause of thyroid cancer; however, data are limited for evaluating the radiation dose-response at high doses, modifiers of the dose-response relationship, and the joint effects of radiotherapy and chemotherapy. To address these issues, we pooled two cohort and two nested case-control studies of childhood cancer survivors, including 16,757 patients, of whom 187 developed primary thyroid cancer. Relative risks (RR) with 95% confidence intervals (CI) for thyroid cancer after treatment with alkylating agents, anthracyclines, or bleomycin were 3.25 (0.9–14.9), 4.5 (1.4–17.8), and 3.2 (0.8–10.4), respectively, in patients without radiotherapy, and declined with greater radiation dose (RR trends, P = 0.02, 0.12, and 0.01, respectively). Radiation dose-related RRs increased approximately linearly for <10 Gy, leveled off at 10- to 15-fold for 10–30 Gy, and then declined, but remained elevated for doses >50 Gy. The fitted RR at 10 Gy was 13.7 (95% CI: 8.0–24.0). Dose-related excess RRs increased with decreasing age at exposure (P < 0.01) but did not vary with attained age or time since exposure, remaining elevated 25+ years after exposure. Gender and number of treatments did not modify radiation effects. Thyroid cancer risks remained elevated for many decades following radiotherapy, highlighting the need for continued follow-up of childhood cancer survivors.
The 600% increase in medical radiation exposure of the US population since 1980 has provided immense benefit but also poses potential future cancer risks to patients. Most of the increase is from diagnostic radiologic procedures. The objectives of this review are to summarize epidemiologic data on cancer risks associated with diagnostic procedures, describe how exposures from recent diagnostic procedures relate to radiation levels linked with cancer occurrence, and propose a framework of strategies to reduce radiation from diagnostic imaging in patients. We briefly review radiation dose definitions, mechanisms of radiation carcinogenesis, key epidemiologic studies of medical and other radiation sources and cancer risks, and dose trends from diagnostic procedures. We describe cancer risks from experimental studies, future projected risks from current imaging procedures, and the potential for higher risks in genetically susceptible populations. To reduce future projected cancers from diagnostic procedures, we advocate widespread use of evidence-based appropriateness criteria for decisions about imaging procedures, oversight of equipment to reliably deliver the minimum radiation required to attain clinical objectives, development of electronic lifetime records of imaging procedures for patients and their physicians, and commitment by medical training programs, professional societies, and radiation protection organizations to educate all stakeholders in reducing radiation from diagnostic procedures.
Radiotherapy decreases cancer mortality but is associated with an increased incidence of second primary cancers, including osteosarcomas, especially after exposure in childhood. It remains uncertain whether radiation is related to other histologic types of bone sarcomas, such as chondrosarcomas, which are more common in adulthood.
Using data from 1973–2008 SEER registries, we evaluated long-term risk of bone cancer in 1,284,537 adult 5-year cancer survivors. We used standardized incidence ratios (SIRs) to compare second bone sarcoma rates to the general population for each histologic type. We also used multivariate Poisson regression to estimate the relative risk (RR) associated with radiotherapy for the most common subtypes, osteosarcoma and chondrosarcoma.
By the end of 2008, 159 second bone sarcomas had been reported. Compared with the general population, the risk of developing any bone sarcoma was increased by 25% in patients with no history of radiotherapy (observed [O] = 89; SIR = 1.25; 95% CI, 1.00–1.54) and by 257% in patients with a history of radiotherapy (O = 70; SIR = 3.57; 95% CI, 2.78–4.50). For each histologic subtype, SIRs were higher among patients who had previously received radiotherapy than among those who had not. The RR associated with radiotherapy was 5.08 (95% CI, 3.05–8.59) for osteosarcoma (n = 63) and 1.54 (95% CI, 0.88–2.59) for chondrosarcoma (n = 69), and these risks were even greater for second sarcomas that arose in the radiotherapy field used to treat the first cancer (osteosarcoma RR = 10.35; 95% CI, 4.96–23.66; chondrosarcoma RR = 8.21; 95% CI, 2.09–39.89).
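A standardized incidence ratio is simply observed cases divided by the number expected from general-population rates. A minimal sketch, assuming a log-normal approximation for the confidence interval (the published interval would have been computed with exact Poisson methods, and the expected count here is back-calculated from the abstract's O and SIR purely for illustration):

```python
import math

def sir_with_ci(observed, expected, z=1.96):
    """Standardized incidence ratio O/E with an approximate log-normal
    confidence interval: SIR * exp(+/- z / sqrt(O))."""
    sir = observed / expected
    half_width = z / math.sqrt(observed)
    return sir, sir * math.exp(-half_width), sir * math.exp(half_width)

# O = 70 is the observed count from the abstract; the expected count is
# back-calculated (E = O / SIR = 70 / 3.57) purely for illustration.
print(sir_with_ci(70, 70 / 3.57))
```

The approximation gives roughly 2.8–4.5, close to the published 2.78–4.50.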
Our findings provide the first evidence of a likely association between radiation exposure and chondrosarcoma.
These results further our understanding of radiotherapy-related cancer risks and will potentially direct practices in long-term surveillance of cancer survivors.
Second cancers; bone sarcomas; radiation
To evaluate the risk of second cancer (SC) in long-term survivors of retinoblastoma (Rb) according to classification of germline mutation, based on family history of Rb and laterality.
Patients and Methods
We assembled a cohort of 1,852 1-year survivors of Rb (bilateral, n = 1,036; unilateral, n = 816). SCs were ascertained by medical records and self-reports and confirmed by pathology reports. Classification of RB1 germline mutation, inherited or de novo, was inferred by laterality of Rb and positive family history of Rb. Standardized incidence ratios and cumulative incidence for all SCs combined and for soft tissue sarcomas, bone cancers, and melanoma were calculated. The influence of host- and therapy-related risk factors for SC was assessed by Poisson regression for bilateral survivors.
We observed a relative risk (RR) of 1.37 (95% CI, 1.00 to 1.86) for SCs in bilateral survivors associated with a family history of Rb, adjusted for treatment, age, and length of follow-up. The risk for melanoma was significantly elevated for survivors with a family history of Rb (RR, 3.08; 95% CI, 1.23 to 7.16), but risks for bone or soft tissue sarcomas were not elevated. The cumulative incidence of SCs 50 years after diagnosis of bilateral Rb, with adjustment for competing risk of death, was significantly higher for survivors with a family history (47%; 95% CI, 35% to 59%) than survivors without a family history (38%; 95% CI, 32% to 44%; P = .004).
Rb survivors with bilateral disease and an inherited germline mutation are at slightly higher risk of an SC compared with those with a de novo germline mutation, in particular melanoma, perhaps because of shared genetic alterations.
Retinoblastoma (RB) is an important ocular malignancy of childhood. It has been commonly accepted for some time that knockout of the two alleles of the RB1 gene is the principal molecular event associated with the occurrence of RB. In this paper, we examine the validity of the two-hit theory for retinoblastoma by comparing the fits of stochastic models with two or more mutational stages. Unlike many such models, our model assumes a fully stochastic stem cell compartment, which is crucial to its behavior. Models are fitted to a population-based dataset comprising 1,553 cases of retinoblastoma for the period 1962–2000 in Great Britain (England, Scotland, Wales). The population incidence of retinoblastoma is best described by a fully stochastic model with two stages, although models with a deterministic stem cell compartment yield equivalent fits; models with three or more stages fit much less well. The results strongly suggest that knockout of the two alleles of the RB1 gene is necessary, and may be largely sufficient, for the development of retinoblastoma, in support of Knudson's two-hit hypothesis.
Retinoblastoma; carcinogenesis modeling; two-hit theory; stochastic MVK model; RB1 gene
In the US, second non-ocular malignancies are the primary cause of death in retinoblastoma survivors with a germline RB1 mutation. Soft tissue sarcomas are among the malignancies most likely to pose a risk to these patients, with leiomyosarcoma (LMS) being the most common subtype. As this cohort is followed for longer periods, new second malignancy risks for these patients continue to be discovered.
We estimated the risk for uterine leiomyosarcoma (ULMS) in a cohort of 1,854 patients with retinoblastoma who were diagnosed at two US institutions from 1914 through 1996. The standardized incidence ratio and excess absolute risk were calculated by comparison with population data from the Connecticut Tumor Registry or from the National Cancer Institute Surveillance, Epidemiology, and End Results (SEER) database. The cumulative risk at 50 years of age was also calculated.
Seven of 525 female hereditary retinoblastoma patients developed ULMS. Five of these patients were included in the risk analysis, yielding an excess risk of 3.87 per 10,000 women. The excess risk among female hereditary retinoblastoma patients increased markedly with age: to 20 per 10,000 for patients aged 30–39 years and to 27 per 10,000 for patients aged 40 years and older.
There is a substantial excess risk of ULMS in female hereditary retinoblastoma patients. As more patients survive into their thirties, the number of cases is likely to increase. These findings raise questions about early childbearing, screening, and prophylactic measures in hereditary retinoblastoma patients, all issues that would benefit from confirmation in other retinoblastoma cohorts to allow better-guided counseling of these patients.
Retinoblastoma; Uterine Leiomyosarcoma; Secondary Cancers
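The standardized incidence ratio and excess absolute risk reported above follow the standard cohort formulas (SIR = observed / expected; excess absolute risk = excess cases per unit of person-time). A minimal sketch of the arithmetic, using purely illustrative numbers rather than the study's actual counts:

```python
def sir(observed, expected):
    """Standardized incidence ratio: observed cases divided by the
    number expected from population (e.g., SEER or registry) rates."""
    return observed / expected

def excess_absolute_risk(observed, expected, person_years, per=10_000):
    """Excess absolute risk: excess cases per `per` person-years."""
    return (observed - expected) / person_years * per

# Illustrative values only (not the study's actual data):
obs, exp_cases, pyr = 5, 0.5, 11_500
print(round(sir(obs, exp_cases), 1))                        # 10.0
print(round(excess_absolute_risk(obs, exp_cases, pyr), 2))  # 3.91
```

The SIR captures relative risk against the reference population, while the excess absolute risk expresses the same excess on an absolute, per-person-time scale, which is why studies typically report both.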
Children diagnosed with the hereditary form of retinoblastoma (Rb), a rare eye cancer caused by a germline mutation in the RB1 tumor suppressor gene, have excellent survival but face an increased risk of bone and soft tissue sarcomas. This predisposition to sarcomas has been attributed to genetic susceptibility due to inactivation of the RB1 gene as well as to past radiotherapy for Rb. The majority of bone and soft tissue sarcomas among hereditary Rb survivors occur in the head, within the radiation field, but they also occur outside the radiation field. Sarcomas account for almost half of the second primary cancers in hereditary Rb survivors, but they are very rare following non-hereditary Rb. Sarcomas among hereditary Rb survivors follow an age pattern of occurrence similar to that in the general population. There has been a trend over the past two decades to replace radiotherapy with chemotherapy and other focal therapies (laser or cryosurgery) and, most recently, chemosurgery, in order to reduce the incidence of sarcomas and other second cancers in Rb survivors. Given the excellent survival of most Rb patients treated in the past, it is important for survivors, their families, and health care providers to be aware of the heightened risk for sarcomas in hereditary patients.
Retinoblastoma; Soft tissue sarcoma; Bone sarcoma; Radiotherapy; Epidemiology; RB1 gene; Hereditary
We report the pathology and outcome of secondary skull base tumors in patients previously treated with external beam radiation for retinoblastoma (Rb). Rb patients are at increased risk of second head and neck primary malignancies due to early radiation exposure during treatment and loss of RB1 protein in genetic carriers. An institutional database was reviewed for patients with retinoblastoma who had previously received radiation therapy and subsequently developed skull base tumors. Seventeen patients met the selection criteria. The median age at Rb diagnosis was 12 months. Thirteen patients underwent enucleation in addition to radiation therapy as part of initial Rb treatment. A median of 19 years elapsed between the diagnosis of Rb and the diagnosis of skull base malignancy. The most common tumors were osteogenic sarcoma (39%) and leiomyosarcoma (22%). Eleven (71%) patients received postoperative chemotherapy, and 7 (41%) received postoperative radiotherapy. Three (24%) patients underwent salvage surgery for recurrent disease. Five-year survival was 68%, and 10-year survival was 51% by Kaplan-Meier analysis. Secondary malignancy in Rb patients is a well-recognized event. The use of surgery with appropriate adjuvant therapy was associated with a 51% 10-year survival in this study population.
Skull base neoplasms; retinoblastoma; neoplasms; second primary; radiotherapy
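The 5- and 10-year survival figures above come from Kaplan-Meier analysis. As an illustration of that product-limit estimator, here is a minimal pure-Python sketch with entirely made-up follow-up times, not the study's data; it also handles tied event times only naively (one at a time), which for tied events yields the same estimate as the standard grouped formula:

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.
    times: follow-up durations (e.g., years); events: 1 = death,
    0 = censored. Returns (time, survival) points at each event."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    for t, event in data:
        if event:
            surv *= (n_at_risk - 1) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= 1  # the subject leaves the risk set either way
    return curve

# Hypothetical follow-up data (years, death indicator):
times  = [2, 3, 4, 6, 7, 9, 11, 12]
events = [1, 0, 1, 1, 0, 1, 0, 1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
# → 2 0.875 / 4 0.729 / 6 0.583 / 9 0.389 / 12 0.0
```

Censored subjects (events = 0) shrink the risk set without dropping the survival curve, which is the feature that lets the estimator use incompletely followed patients.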
Biodosimetry measurements can potentially be an important and integral part of the dosimetric methods used in long-term studies of health risk following radiation exposure. Such studies rely on accurate estimation of doses to the whole body or to specific organs of individuals in order to derive reliable estimates of cancer risk. However, dose estimates based on analytical dose reconstruction (i.e., models) or personnel monitoring measurements, e.g., film badges, can have substantial uncertainty. Biodosimetry can potentially reduce uncertainty in health risk studies by corroborating model-based dose estimates or by providing a means to assess bias in dose models. While biodosimetry has begun to play a more significant role in long-term health risk studies, its use is still generally limited in that context due to one or more factors, including inadequate limits of detection, large inter-individual variability of the signal measured, high per-sample cost, and invasiveness. Presently, the most suitable biodosimetry methods for epidemiologic studies are chromosome aberration frequencies from fluorescence in situ hybridization (FISH) of peripheral blood lymphocytes and electron paramagnetic resonance (EPR) measurements made on tooth enamel. Both types of measurements, however, are usually invasive and require difficult-to-obtain biological samples. Moreover, doses derived from these methods are not always directly relevant to the tissues of interest. To increase the value of biodosimetry to epidemiologic studies, a number of issues need to be considered, including limits of detection, effects of inhomogeneous exposure of the body, how to extrapolate from the tissue sampled to the tissues of interest, and how to adjust dosimetry models applied to large populations based on sparse biodosimetry measurements.
The requirements of health risk studies suggest a set of characteristics that, if satisfied by new biodosimetry methods, would increase the overall usefulness of biodosimetry for determining radiation health risks.
Background and Objectives
For over 100 years, radiotherapy has been widely used to treat retinoblastoma. One of the most common adverse effects of orbital radiotherapy is cataract formation. The objective of this study was to investigate the risk of cataract extraction among adult retinoblastoma survivors.
Methods and Design
A retrospective cohort study was performed on survivors who were diagnosed with retinoblastoma from 1914 to 1984 and responded to a telephone interview in 2000. The interview elicited information about medical outcomes, including cataract extraction, as well as demographic characteristics and several lifestyle factors. Doses to the lens of each eye were estimated for each individual from available radiotherapy (external beam therapy or brachytherapy) records. Time to cataract extraction was compared between dose groups using the log-rank test, and Cox regression was used to estimate hazard ratios of cataract extraction in multivariate analyses.
A total of 753 subjects (828 eyes) were available for analysis, with an average of 32 years of follow-up per eye. During this period, 51 cataract extractions were reported: one in an eye that received no radiotherapy, 36 in 306 eyes that received one course of radiotherapy, and 14 among 38 eyes that received two or three courses of treatment. The average time to cataract extraction in irradiated eyes was 51 years (95% CI: 48–54) following one treatment and 32 years (95% CI: 27–37) following two or three treatments. Eyes exposed to a therapeutic radiation dose of 5 Gray (Gy) or more (mean exposure, 8.1 Gy) had a six-fold increased risk (95% CI: 1.3–27.2) of cataract extraction compared with eyes exposed to 2.5 Gy or less.
Nearly all cataracts extracted within 30 years after diagnosis of retinoblastoma could be attributed to radiotherapy, and more than 75% of eyes treated with two or more courses of radiotherapy had a cataract extracted. The results emphasize the importance of ophthalmologic examination throughout adulthood for retinoblastoma survivors who have undergone radiotherapy. In contrast, the annual risk of cataract extraction in untreated eyes is comparable to that of the general population.
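The six-fold risk above was estimated with Cox regression; as a much simpler, hedged analogue, a crude incidence-rate ratio between dose groups can be computed from event counts and eye-years at risk. The numbers below are illustrative only, chosen so the ratio comes out near the reported effect size, and are not the study's actual data:

```python
def rate_ratio(events_exposed, eye_years_exposed, events_ref, eye_years_ref):
    """Crude incidence-rate ratio between an exposed and a reference
    dose group (events per unit of eye-time at risk); a rough,
    unadjusted analogue of the hazard ratio from a Cox model."""
    rate_exposed = events_exposed / eye_years_exposed
    rate_ref = events_ref / eye_years_ref
    return rate_exposed / rate_ref

# Illustrative eye-level data (hypothetical, not the study's counts):
#   >= 5 Gy group:  30 extractions over 5,000 eye-years
#   <= 2.5 Gy group: 5 extractions over 5,000 eye-years
print(rate_ratio(30, 5_000, 5, 5_000))  # 6.0
```

Unlike this crude ratio, the Cox model used in the study adjusts for covariates and handles censoring of eyes still cataract-free at last follow-up, which is why the published estimate carries a wide confidence interval rather than a single arithmetic ratio.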