Results 1-18 (18)
author:("fedele, Ugo")
1.  A worksite intervention to reduce the cardiovascular risk: proposal of a study design easy to integrate within Italian organization of occupational health surveillance 
BMC Public Health  2015;15:12.
Background
Despite the substantial amount of knowledge on effectiveness of worksite health promotion (WHP) in reducing cardiovascular disease (CVD) risk, WHP programs are not systematically applied in Italy. The aim was to design an intervention easy to integrate within the Italian organization of workplace health surveillance.
Methods
We used a pretest-posttest design. Workers were employed in multiple occupations and resident in the Veneto region, Italy. Occupational physicians (OPs) performed all examinations, including laboratory evaluation (capillary blood sampling and measurement of glycaemia and cholesterolemia with portable devices), during routine health surveillance at the worksite. CVD risk was computed from sex, age, smoking habit, diabetes, systolic pressure and cholesterol level. After excluding workers aged <40 years and those with missing consent, a CVD diagnosis or current therapy for CVD, missing information, or a CVD risk <5%, 451 of 5,536 workers underwent the intervention, and 323 male workers were re-examined at 1 year. The computed CVD risk was used as the most compelling argument for changing lifestyle, and counseling was tailored to each worker's individual risk factors. Individuals examined at posttest were a small fraction of the whole cohort (6% = 323/5,536). In these workers we computed pretest/posttest ratios of proportions (such as the percentage of subjects with a cardiovascular risk >5%), together with the exact McNemar significance probability or the exact test of table symmetry.
Results
CVD risk decreased by 24% after the intervention (exact McNemar p < 0.0001); in a sensitivity analysis assuming that all subjects lost to follow-up kept their pretest cardiovascular risk value, the effect (−18%) was still significant (symmetry test p < 0.0001). Each prevented CVD case was expected to cost about 5,700 euros.
Conclusions
The present worksite intervention promoted favorable changes of CVD risk that were reasonably priced and consistent across multiple occupations.
doi:10.1186/s12889-015-1375-4
PMCID: PMC4310171  PMID: 25604904
Worksite health promotion; Cardiovascular disease; Cardiovascular risk; Pretest-posttest design
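The pretest/posttest comparison in the abstract above rests on the exact McNemar test for a paired binary outcome (the same workers classified as above or below the 5% CVD risk threshold before and after the intervention). A minimal Python sketch of that test follows; the 2 × 2 counts are hypothetical placeholders, not the study data.

```python
# Exact McNemar test for a paired pre/post binary outcome,
# e.g. "CVD risk > 5%" measured twice on the same workers.
# The 2x2 table uses made-up counts for illustration only.
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

#                 posttest: risk > 5%   posttest: risk <= 5%
table = np.array([[180,                  90],   # pretest: risk > 5%
                  [20,                   33]])  # pretest: risk <= 5%

result = mcnemar(table, exact=True)   # exact binomial test on discordant pairs
print(f"exact McNemar p-value: {result.pvalue:.4f}")

# Pretest/posttest comparison of the proportion above the 5% risk threshold
n = table.sum()
pre = table[0].sum() / n
post = table[:, 0].sum() / n
print(f"pretest {pre:.1%}, posttest {post:.1%}, relative change {(post - pre) / pre:+.0%}")
```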
2.  Mitochondrial DNA Copy Number and Exposure to Polycyclic Aromatic Hydrocarbons 
Background
Increased mitochondrial DNA copy number (mtDNAcn) is a biological response to mtDNA damage and dysfunction predictive of lung cancer risk. Polycyclic aromatic hydrocarbons (PAHs) are established lung carcinogens and may cause mitochondrial toxicity. Whether PAH exposure and PAH-related nuclear DNA (nDNA) genotoxic effects are linked with increased mtDNAcn has never been evaluated.
Methods
We investigated the effect of chronic exposure to PAHs on mtDNAcn in peripheral blood lymphocytes (PBLs) of 46 Polish male, currently non-smoking coke-oven workers and 44 matched controls, who were part of a group of 94 study individuals examined in our previous work. Subjects' PAH exposure and genetic alterations were characterized through measures of internal dose (urinary 1-pyrenol), target dose [anti-benzo[a]pyrene diolepoxide (anti-BPDE)-DNA adducts], genetic instability (micronuclei [MN] and telomere length [TL]) and DNA methylation (p53 promoter) in PBLs. mtDNAcn (MT/S) was measured using a validated real-time PCR method.
Results
Workers with PAH exposure above the median value (>3 µmol 1-pyrenol/mol creatinine) showed higher mtDNAcn [geometric means (GM) of 1.06 (unadjusted) and 1.07 (age-adjusted)] compared to controls [GM 0.89 (unadjusted); 0.89 (age-adjusted)] (p=0.029 and 0.016), as well as higher levels of genetic and chromosomal [i.e. anti-BPDE-DNA adducts (p<0.001), MN (p<0.001) and TL (p=0.053)] and epigenetic [i.e., p53 gene-specific promoter methylation (p<0.001)] alterations in the nDNA. In the whole study population, unadjusted and age-adjusted mtDNAcn was positively correlated with 1-pyrenol (p=0.043 and 0.032) and anti-BPDE-DNA adducts (p=0.046 and 0.049).
Conclusions
PAH exposure and PAH-related nDNA genotoxicity are associated with increased mtDNAcn.
Impact
The present study is suggestive of potential roles of mtDNAcn in PAH-induced carcinogenesis.
doi:10.1158/1055-9965.EPI-13-0118
PMCID: PMC3799962  PMID: 23885040
Polycyclic Aromatic Hydrocarbon; pulmonary carcinogens; mitochondrial DNA copy number; nuclear DNA damage; occupational exposure
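The mtDNAcn results above are reported as geometric means, unadjusted and age-adjusted. A minimal sketch of how such estimates can be obtained by modelling log-transformed copy number is shown below; the variable names and the simulated data are illustrative assumptions, not the study's data or code.

```python
# Geometric means of mtDNA copy number by exposure group,
# unadjusted and age-adjusted via OLS on the log scale.
# Data are simulated for illustration only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 90
df = pd.DataFrame({
    "exposed": rng.integers(0, 2, n),        # 1 = high PAH exposure, 0 = control
    "age": rng.normal(45, 8, n).round(),
})
df["mtdnacn"] = np.exp(0.05 * df["exposed"] + rng.normal(0, 0.3, n))

# Unadjusted geometric means: exp of the mean log value in each group
gm = df.groupby("exposed")["mtdnacn"].apply(lambda x: np.exp(np.log(x).mean()))
print("unadjusted geometric means:\n", gm.round(3))

# Age-adjusted comparison: linear model on log(mtDNAcn)
fit = smf.ols("np.log(mtdnacn) ~ exposed + age", data=df).fit()
# exp(coefficient of 'exposed') is the age-adjusted ratio of geometric means
print("age-adjusted GM ratio (exposed vs control):", np.exp(fit.params["exposed"]).round(3))
```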
3.  Short-term health service utilization after a paediatric injury: a population-based study 
Background
The aim of the study is to identify which types of injuries account for a major component of the health burden in a population-based cohort of children in North-Eastern Italy.
Methods
All children (1–13 years) residing in the Veneto region who were hospitalized in 2008 with an International Classification of Diseases, ninth edition, Clinical Modification (ICD-9-CM) code for injury in the first diagnostic field were considered. The outcome was defined as the difference in hospital use between the 12 months following the injury and the year preceding it. We computed hospitalization rates by gender, age class and injury type.
Results
Hospitalization rates for injury are highest in males, especially among school-aged children. Rates for intracranial injury exhibit a more pronounced decline with age in females, whereas a more marked rise in upper limb fracture rates among school-aged males is observed. Overall, 3 days of hospital stay per child are attributable to injury. Burns, skull fracture and a high injury severity are associated with a greater number of additional inpatient days.
Conclusions
The impact of specific injury types on health services utilization varies with gender, age and severity. These observed patterns contribute to build a clearer picture of this leading global public health problem and deserve more attention in planning preventive strategies and resource allocation.
doi:10.1186/1824-7288-39-66
PMCID: PMC4016021  PMID: 24148101
Injury burden; Childhood; Administrative data
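The outcome above is the change in hospital use in the 12 months after the index injury relative to the 12 months before it. A minimal pandas sketch of that pre/post contrast follows; the column names (child_id, admission_date, bed_days, index_date) are hypothetical assumptions about a discharge-record extract, not the study's data model.

```python
# Excess hospital bed-days per child: 12 months after the index injury
# admission minus the 12 months before it (index admission excluded).
# Column names are illustrative assumptions.
import pandas as pd

def excess_bed_days(admissions: pd.DataFrame, index_events: pd.DataFrame) -> pd.Series:
    df = admissions.merge(index_events, on="child_id")
    delta = (df["admission_date"] - df["index_date"]).dt.days
    after = df.loc[delta.between(1, 365)].groupby("child_id")["bed_days"].sum()
    before = df.loc[delta.between(-365, -1)].groupby("child_id")["bed_days"].sum()
    return (after.sub(before, fill_value=0)
                 .reindex(index_events["child_id"], fill_value=0)
                 .rename("excess_bed_days"))

# Toy usage example
adm = pd.DataFrame({"child_id": [1, 1, 2],
                    "admission_date": pd.to_datetime(["2008-03-10", "2008-07-01", "2008-05-05"]),
                    "bed_days": [4, 2, 3]})
idx = pd.DataFrame({"child_id": [1, 2],
                    "index_date": pd.to_datetime(["2008-03-10", "2008-05-05"])})
print(excess_bed_days(adm, idx))
```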
4.  Descriptive epidemiology of chronic liver disease in northeastern Italy: an analysis of multiple causes of death 
Background
The analysis of multiple causes of death data has been applied in the United States to examine the population burden of chronic liver disease (CLD) and to assess time trends of alcohol-related and hepatitis C virus (HCV)-related CLD mortality. The aim of this study was to assess the mortality for CLD by etiology in the Veneto Region (northeastern Italy).
Methods
Using the 2008–2010 regional archive of mortality, all causes registered on death certificates were extracted and different descriptive epidemiological measures were computed for HCV-related, alcohol-related, and overall CLD-related mortality.
Results
The crude mortality rate of all CLD was close to 40 per 100,000 residents. In middle age (35 to 74 years), CLD was mentioned in about 10% and 6% of all deaths in males and females, respectively. Etiology was unspecified in about half of CLD deaths. HCV was mentioned in 44% and 21%, and alcohol in 11% and 26%, of overall CLD deaths in females and males, respectively. A bimodal distribution with age was observed for HCV-related proportional mortality among females, reflecting the available seroprevalence data.
Conclusions
Multiple causes of death analyses can provide useful insights into the burden of CLD mortality according to etiology among different population subgroups.
doi:10.1186/1478-7954-11-20
PMCID: PMC3852117  PMID: 24112320
Liver cirrhosis; Alcoholic liver disease; Hepatitis C; Mortality
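A multiple-cause analysis counts every death certificate that mentions CLD anywhere, not only those with CLD as the underlying cause. A minimal sketch of the mention-based measures described above (crude rate and proportional mortality) follows; the field names, ICD prefixes and counts are illustrative assumptions about a mortality extract.

```python
# Mention-based (multiple-cause) mortality measures for chronic liver disease.
# 'causes' holds all ICD codes on the certificate; the field names, ICD
# prefixes and population figure are illustrative assumptions.
import pandas as pd

deaths = pd.DataFrame({
    "sex": ["M", "F", "M", "M"],
    "age": [68, 80, 55, 47],
    "causes": [["K74.6", "I21.9"], ["I64"], ["K70.3", "F10.2"], ["C34.9"]],
})
population = 4_900_000                        # resident population (illustrative)

CLD_PREFIXES = ("K70", "K73", "K74")          # alcohol-related and unspecified CLD
mentions_cld = deaths["causes"].apply(
    lambda codes: any(c.startswith(CLD_PREFIXES) for c in codes)
)

crude_rate = mentions_cld.sum() / population * 100_000
prop_mortality = mentions_cld.mean()          # share of all deaths mentioning CLD

print(f"CLD mentioned on {mentions_cld.sum()} certificates")
print(f"crude mention-based rate: {crude_rate:.2f} per 100,000")
print(f"proportional mortality: {prop_mortality:.1%}")
```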
5.  The impact of drug eluting stents availability on the treatment choice among medical therapy, percutaneous or surgical revascularisation and on 4-year clinical outcome in patients with coronary artery disease: a cohort study 
BMJ Open  2012;2(5):e001926.
Objective
To investigate the influence of the availability of drug eluting stents (DES) on treatment choice (TC) among medical therapy (MT), coronary by-pass surgery (CABG) or percutaneous coronary interventions (PCI) and the consequent clinical outcomes in patients hospitalised because of coronary artery disease (CAD).
Design
Observational study comparing two cohorts hospitalised immediately before, and 3 years after DES availability.
Setting
Thirteen hospitals with cardiology facilities.
Patients
2131 consecutive patients with at least one coronary stenosis >50% at coronary angiography (CA) after exclusion of those with acute myocardial infarction or previous CABG or associated relevant valvular disease.
Main outcome measures
Treatment choice after CA and 4-year clinical outcomes.
Results
TC among MT (27% vs 29.2%), PCI (58.6% vs 55.5%) and CABG (14.5% vs 15.3%) was similar in the DES and bare metal stent (BMS) periods (p = 0.51). At least one DES was implanted in 57% of patients treated with PCI in 2005. After 4 years, no differences in mortality (13.8% vs 13.2%, p = 0.72), hospital admissions for myocardial infarction (6.6% vs 5.2%, p = 0.26), stroke (2.2% vs 1.7%, p = 0.49) or further revascularisations (22.3% vs 19.7%, p = 0.25) were observed between patients enrolled in the DES and BMS periods. Only in patients with a Syntax score of 23–32 did a significant change of TC (p = 0.0002) occur in the DES versus BMS period: MT in 17.4% vs 31%, PCI in 62.2% vs 35.8%, and CABG in 20.3% vs 33.2%, with a similar 4-year combined end-point of mortality, stroke, myocardial infarction and further revascularisations (45.3% vs 34.2%, p = 0.087).
Conclusions
Three years after DES became available, neither the TC in patients with CAD nor the 4-year incidence of death, myocardial infarction, stroke and further revascularisations had changed significantly. In the subgroup with a Syntax score of 23–32, a significant increase in indications for PCI was observed in the DES period, without any improvement in the 4-year clinical outcome.
doi:10.1136/bmjopen-2012-001926
PMCID: PMC3488738  PMID: 23103608
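The comparison of treatment choice between the two periods above (p = 0.51 overall) is a test of independence on a period × treatment contingency table. A minimal scipy sketch of such a test follows; the counts are illustrative placeholders, not the study data.

```python
# Chi-square test of independence: treatment choice (MT / PCI / CABG)
# by enrolment period (BMS era vs DES era). Counts are illustrative only.
import numpy as np
from scipy.stats import chi2_contingency

#                 MT    PCI   CABG
table = np.array([[310, 590, 163],    # BMS period
                  [289, 627, 152]])   # DES period

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.2f}")
```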
6.  Osteopontin, asbestos exposure and pleural plaques: a cross-sectional study 
BMC Public Health  2011;11:220.
Background
Osteopontin (OPN) is a plasma protein/cytokine produced in excess in several malignancies. In a recent study OPN was reported as being related to the duration of asbestos exposure and presence of benign asbestos-related diseases; however, it was unclear whether this protein was an indicator of exposure or effect.
Methods
In 193 workers, 50 of whom had pleural plaques (PP) and in whom different indicators of past asbestos exposure were estimated, OPN plasma levels were assessed using commercial quantitative sandwich enzyme immunoassays according to the manufacturer's instructions.
Results
Osteopontin increased with increasing age and several aspects of asbestos exposure, without differences related to the presence of pleural plaques. At multivariable regression analysis, the explanatory variables with a significant independent influence on OPN were length of exposure (positive correlation) and time elapsed since last exposure (positive correlation).
Conclusions
Since the asbestos burden in lung tissue tends to wane over time, OPN should decrease (rather than increase) with time since last exposure. Therefore, OPN cannot be a reliable biomarker of either exposure or effect (presence of pleural plaques).
doi:10.1186/1471-2458-11-220
PMCID: PMC3094241  PMID: 21477289
7.  Increasing incidence and mortality of infective endocarditis: a population-based study through a record-linkage system 
Background
Few population-based studies provide epidemiological data on infective endocarditis (IE). The aim of the study is to analyze the incidence and outcomes of IE in the Veneto Region (North-Eastern Italy).
Methods
Residents with a first hospitalization for IE in 2000-2008 were extracted from discharge data and linked to mortality records to estimate 365-day survival. Etiology was retrieved in subsets of this cohort from discharge codes and by linkage to a microbiological database. Risk factors for mortality were assessed through logistic regression.
Results
Overall, 1,863 subjects were hospitalized for IE, corresponding to a crude rate of 4.4 per 100,000 person-years, increasing from 4.1 in 2000-2002 to 4.9 in 2006-2008 (p = 0.003). Median age was 68 years; 39% of subjects had been hospitalized in the three preceding months, and 23% of patients underwent a cardiac valve procedure during the index admission or in the following year. In-hospital mortality was 14% (19% including hospital transfers); 90-day and 365-day mortality rose through the study years. Mortality increased with age and the Charlson comorbidity index, in subjects with previous hospitalizations for heart failure, and (in the subcohort with microbiological data) in IE due to staphylococci (40% of IE).
Conclusions
The study demonstrates an increasing incidence and mortality for IE over the last decade. Analyses of electronic archives provide a region-wide picture of IE, overcoming referral biases affecting single clinic or multicentric studies, and therefore represent a first fundamental step to detect critical issues related to IE.
doi:10.1186/1471-2334-11-48
PMCID: PMC3051911  PMID: 21345185
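The crude incidence above (4.4 per 100,000 person-years) is the count of first hospitalizations divided by resident person-time. A minimal sketch of that rate with an exact Poisson confidence interval follows; the person-time figure is a rough illustrative assumption, not the study denominator.

```python
# Crude incidence rate of infective endocarditis with an exact Poisson CI.
# The person-years value is an illustrative assumption (~4.7 million residents
# over 9 years), not the study figure.
from scipy.stats import chi2

cases = 1_863
person_years = 42_500_000

rate = cases / person_years * 100_000
# Exact (Garwood) Poisson limits for the case count
lo = chi2.ppf(0.025, 2 * cases) / 2
hi = chi2.ppf(0.975, 2 * (cases + 1)) / 2
print(f"rate = {rate:.1f} per 100,000 person-years "
      f"(95% CI {lo / person_years * 1e5:.1f} - {hi / person_years * 1e5:.1f})")
```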
8.  Tuberculosis Incidence in Prisons: A Systematic Review 
PLoS Medicine  2010;7(12):e1000381.
A systematic review by Iacopo Baussano and colleagues synthesizes published research to show that improved tuberculosis (TB) control in prisons could significantly reduce the burden of TB both inside and outside prisons.
Background
Transmission of tuberculosis (TB) in prisons has been reported worldwide to be much higher than that reported for the corresponding general population.
Methods and Findings
A systematic review was performed to assess the risk of incident latent tuberculosis infection (LTBI) and TB disease in prisons, as compared to the incidence in the corresponding local general population, and to estimate the fraction of TB in the general population attributable (PAF%) to transmission within prisons. Primary peer-reviewed studies published until June 2010 were searched to assess the incidence of LTBI and/or TB within prisons; both inmates and prison staff were considered. Studies, which were independently screened by two reviewers, were eligible for inclusion if they reported the incidence of LTBI and TB disease in prisons. Available data were collected from 23 studies out of 582 potentially relevant unique citations. Five studies from the US and one from Brazil were available to assess the incidence of LTBI in prisons, while 19 studies were available to assess the incidence of TB. The median estimated annual incidence rate ratios (IRRs) for LTBI and TB were 26.4 (interquartile range [IQR]: 13.0–61.8) and 23.0 (IQR: 11.7–36.1), respectively. The median estimated fraction (PAF%) of TB in the general population attributable to exposure in prisons was 8.5% (IQR: 1.9%–17.9%) in high-income countries and 6.3% (IQR: 2.7%–17.2%) in middle/low-income countries.
Conclusions
The very high IRR and the substantial population attributable fraction show that much better TB control in prisons could potentially protect prisoners and staff from within-prison spread of TB and would significantly reduce the national burden of TB. Future studies should measure the impact of the conditions in prisons on TB transmission and assess the population attributable risk of prison-to-community spread.
Please see later in the article for the Editors' Summary
Editors' Summary
Background
Every year, nearly 10 million people develop tuberculosis (TB)—a contagious bacterial infection usually of the lungs—and nearly two million people die from the disease. TB is caused by Mycobacterium tuberculosis, which spreads in airborne droplets when people with the disease cough or sneeze. Most people infected with M. tuberculosis never become ill—their immune system contains the infection. However, the bacteria remain dormant (latent) within the body, and a latent TB infection (LTBI) can cause active disease many years after the initial infection if host immunity declines. The symptoms of TB include a persistent cough, weight loss, and night sweats. Infection with M. tuberculosis can be diagnosed using the tuberculin skin test; tests for TB itself include chest X-rays and sputum cultures (in which bacteriologists try to grow M. tuberculosis from sputum samples, mucus brought up from the lungs by coughing). TB can usually be cured by taking several powerful antibiotics daily for several months.
Why Was This Study Done?
Last century, global control efforts began to reduce the incidence (number of new cases in a population in a given time) and prevalence (the number of affected people in a population) of LTBI and TB in many countries. Now, the emergence of antibiotic-resistant bacterial strains is thwarting these efforts. Consequently, it is important to identify settings where TB transmission is particularly high. One such setting is thought to be prisons. In these facilities, overcrowding, late case detection, inadequate treatment, and poor implementation of infection control measures (including incomplete segregation of people with active TB) might increase the TB transmission rate. However, it is not known how many people in prison become infected with M. tuberculosis or develop TB each year compared to the general population nor what percentage of LTBI and TB in the general population is attributable to exposure to M. tuberculosis in prison (the population attributable fraction or PAF%). Here, the researchers undertake a systematic review (a study that uses predefined criteria to identify all the research on a given topic) to investigate the incidence of TB in prisons.
What Did the Researchers Do and Find?
The researchers identified 23 studies that reported the incidence of LTBI and/or TB in prisons among both staff and prisoners. They estimated the incidence of TB in relevant general populations using World Health Organization data; estimates of the incidence of LTBI in the general population came from the studies themselves. The researchers then calculated the ratio between the incidence rates for LTBI and TB in prison and in the general population (incidence rate ratios or IRRs) for each study. For both LTBI and TB, the IRR varied widely between studies. The average IRR for LTBI was 26.4. That is, the average incidence of LTBI in prisons was 26.4 times higher than in the general population; the average IRR for TB was 23.0. The researchers also estimated the fraction of TB in the general population attributable to within-prison exposure to M. tuberculosis for each study. Again, there was considerable heterogeneity between the studies but, on average, the PAF% for TB in high-income countries was 8.5% (that is, one in 11 cases of TB in the general population was attributable to within-prison spread of TB); in middle-to-low–income countries, the average PAF% was 6.3%.
What Do These Findings Mean?
These findings suggest that the risk of LTBI and TB is at least an order of magnitude higher in prisons than in the general population and that the within-prison spread of LTBI and TB is likely to substantially affect the incidence of LTBI and TB in the general population. The accuracy and generalizability of these findings are limited by the small number of studies identified, by the relative paucity of studies from countries other than the USA, by study heterogeneity, and by assumptions made in the calculation of PAF%. Even so, these findings suggest that improvements in TB control in prisons would not only help to protect prisoners and staff from within-prison spread of TB but would also reduce national TB burdens. Further studies are now needed to identify the specific conditions in prisons that influence TB transmission so that rational policies can be developed to improve TB control in correctional facilities.
Additional Information
Please access these Web sites via the online version of this summary at http://dx.doi.org/10.1371/journal.pmed.1000381.
This study is discussed in the December 2010 PLoS Medicine Editorial
The World Health Organization provides information on all aspects of TB, including information on TB in prisons and on the Stop TB Partnership (some information is in several languages)
The US Centers for Disease Control and Prevention has information about TB and on TB in prisons
The US National Institute of Allergy and Infectious Diseases also has detailed information on all aspects of TB
doi:10.1371/journal.pmed.1000381
PMCID: PMC3006353  PMID: 21203587
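The review above combines two quantities: the incidence rate ratio (prison vs. general population) and the fraction of TB in the general population attributable to within-prison transmission. A minimal sketch of both follows, using Levin's attributable-fraction formula as one common formulation; the review's exact estimator may differ, and all inputs below are illustrative, not the review's data.

```python
# Incidence rate ratio (IRR) and population attributable fraction (PAF%)
# for TB exposure in prisons. Levin's formula is used as a common
# approximation; the inputs are illustrative placeholders.

def incidence_rate_ratio(cases_prison, py_prison, cases_pop, py_pop):
    """Ratio of the prison incidence rate to the general-population rate."""
    return (cases_prison / py_prison) / (cases_pop / py_pop)

def levin_paf(prevalence_exposed, irr):
    """Levin's PAF: share of population cases attributable to the exposure."""
    excess = prevalence_exposed * (irr - 1)
    return excess / (1 + excess)

irr = incidence_rate_ratio(cases_prison=120, py_prison=40_000,
                           cases_pop=1_300, py_pop=10_000_000)
paf = levin_paf(prevalence_exposed=0.004, irr=irr)   # ~0.4% of person-time in prison
print(f"IRR = {irr:.1f}, PAF = {paf:.1%}")
```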
9.  Work related injuries: estimating the incidence among illegally employed immigrants 
BMC Research Notes  2010;3:331.
Background
Statistics on occupational accidents are based on data from registered employees. With the increasing number of immigrants employed illegally and/or without regular working visas in many developed countries, it is of interest to estimate the injury rate among such unregistered workers.
Findings
The current study was conducted in an area of North-Eastern Italy. The sources of information were the Accidents and Emergencies (A&E) records of a hospital; population data on foreign-born residents in the hospital catchment area (Health Care District 4, Primary Care Trust 20, Province of Verona, Veneto Region, North-Eastern Italy); and the estimated proportion of illegally employed workers in representative samples from the Province of Verona and the Veneto Region. Of the 419 A&E records collected between January and December 2004 among non-European Union (non-EU) immigrants, 146 aroused suspicion because the home, rather than the workplace, was reported as the site of the accident. These cases were the numerator of the rate. The number of illegally employed non-EU workers, the denominator of the rate, was estimated under different assumptions and ranged from 537 to 1,338 individuals. The corresponding rates varied from 109.1 to 271.8 per 1,000 illegally employed non-EU workers, against the 65 per 1,000 reported in Italy in 2004.
Conclusions
The results of this study suggest that there is an unrecorded burden of illegally employed immigrants suffering from work related injuries. Additional efforts for prevention of injuries in the workplace are required to decrease this number. It can be concluded that the Italian National Institute for the Insurance of Work Related Injuries (INAIL) probably underestimates the incidence of these accidents in Italy.
doi:10.1186/1756-0500-3-331
PMCID: PMC3019162  PMID: 21143866
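The injury rate above depends on an estimated denominator that is itself uncertain, which is why the study reports a range. A minimal sketch of that sensitivity calculation, using only the figures quoted in the abstract, follows.

```python
# Work-related injury rate among illegally employed non-EU immigrants,
# recomputed under the range of denominator assumptions given in the abstract.
suspected_work_injuries = 146            # A&E cases reporting "home" as the site
denominator_estimates = {"low estimate of illegal workers": 1_338,
                         "high estimate of illegal workers": 537}

for label, workers in denominator_estimates.items():
    rate = suspected_work_injuries / workers * 1_000
    print(f"{label:32s}: {rate:6.1f} injuries per 1,000 workers")

print("official Italian rate (2004)    :   65.0 injuries per 1,000 workers")
```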
10.  Evidence‐based policy on road safety: the effect of the demerit points system on seat belt use and health outcomes 
Objective: To assess the effect of a demerit points system, introduced in Italy in July 2003, on the prevalence of seat belt use (intermediate outcome) and the number of road traffic deaths and injuries (health outcomes).
Design: Pre‐ and post‐intervention regional observational study for seat belt investigation (April 2003, October 2004); national time‐series analysis of road traffic deaths and injuries between 1999 and 2004 for health outcomes.
Setting: Veneto region, Italy.
Participants: 19 551 drivers, 19 057 front passengers and 8123 rear passengers estimated to be aged over 11 years were included in the investigation into seat belt use. 38 154 fatalities and 1 938 550 injured subjects were examined for the time‐series analysis.
Interventions: Demerit points system.
Main outcome measures: The proportions of drivers and front and rear passengers observed to be using seat belts before and after the intervention; estimates of lives and injuries saved through the implementation of a penalty points system.
Results: The demerit points system was followed by an increase in observed seat belt use of 51.8% (95% confidence interval 48.7% to 54.9%) among drivers, of 42.3% (95% confidence interval 39.2% to 45.5%) among front passengers and of 120.7% (95% confidence interval 99.4% to 144.3%) among rear passengers. It is estimated that 1545 (95% confidence interval 1387 to 1703; p<0.0001) deaths and 91 772 (95% confidence interval 67 762 to 115 783; p<0.0001) injuries were prevented in the 18 months after the introduction of the legislation, i.e. an 18% reduction (1545/8570) in fatalities and a 19% reduction (91 772/473 048) in injuries.
Conclusions: The demerit points system is effective both in encouraging drivers and passengers to adhere to the law and in terms of health outcomes, substantially contributing to road safety.
doi:10.1136/jech.2006.057729
PMCID: PMC2652965  PMID: 17873223
demerit points system; road traffic injuries; seat belts; traffic law enforcement; evidence‐based policy
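The headline reductions above are simple ratios of the estimated prevented events to the events expected without the law. A minimal sketch of those calculations follows, using the figures quoted in the abstract; the pre/post seat-belt prevalences at the end are illustrative values chosen only to reproduce the reported 51.8% relative rise, not the observed data.

```python
# Reductions in deaths and injuries attributed to the demerit points system,
# recomputed from the figures quoted in the abstract.
prevented_deaths, expected_deaths = 1_545, 8_570
prevented_injuries, expected_injuries = 91_772, 473_048

print(f"fatality reduction: {prevented_deaths / expected_deaths:.0%}")
print(f"injury reduction:   {prevented_injuries / expected_injuries:.0%}")

# Relative increase in observed seat-belt use among drivers
# (prevalences below are illustrative, consistent with a 51.8% rise).
pre_prevalence, post_prevalence = 0.548, 0.832
print(f"relative increase: {(post_prevalence - pre_prevalence) / pre_prevalence:.1%}")
```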
11.  A European project on incidence, treatment, and outcome of sarcoma 
BMC Public Health  2010;10:188.
Background
Sarcomas are rare tumors (1-2% of all cancers) of mesenchymal origin that may develop in soft tissues and viscera. Since the International Classification of Diseases (ICD) attributes visceral sarcomas (VS) to the organ of origin, the incidence of sarcoma is grossly underestimated. The rarity of the disease and the variety of histological types (more than 70) and locations account for the difficulty in acquiring sufficient personal experience. In view of the above, the European Commission funded the Connective Tissues Cancers Network (CONTICANET) project to improve the prognosis of sarcoma patients by increasing the level of standardization of diagnostic and therapeutic procedures through a multicentre collaboration.
Methods/Design
Two epidemiological research protocols are presented here. The first investigation aims to estimate the population-based incidence of sarcoma over a two-year period, using the new 2002 WHO classification and a "second opinion" given by an expert regional pathologist on the initial diagnosis made by a local pathologist. Three- to five-year survival will also be determined. Pathology reports and clinical records will be the sources of information.
The second study aims to compare the effects on survival and relapse-free period - allowing for histological subtype, clinical stage, primary site, age and gender - of treating the disease in accordance or not with the clinical practice guidelines (CPGs).
Discussion
Within CONTICANET, each group was asked to design a particular study on a specific objective, with the partners of the network free to accept or decline the proposed protocol. The first protocol was accepted by the other researchers; therefore, the incidence of sarcoma will be assessed in three European regions, Rhone-Alpes and Aquitaine (France) and Veneto (Italy), where the geographic distribution of sarcoma will be compared after taking age and gender into account. The conformity of clinical practice with the recommended guidelines will be investigated in one French (Rhone-Alpes) and one Italian (Veneto) region, since the CPGs were similar in both areas.
doi:10.1186/1471-2458-10-188
PMCID: PMC2882909  PMID: 20384990
12.  Variability of adenoidectomy/tonsillectomy rates among children of the Veneto Region, Italy 
Background
Despite national guidelines issued in 2003 to limit the recourse to tonsillectomy and/or adenoidectomy (A/T), these remain among the most frequent pediatric surgeries performed in Italy. The aim of the study is to investigate the variability of A/T rates among children of the Veneto Region, Italy.
Methods
All discharges of Veneto residents with Diagnosis-Related Groups 57–60 and ICD9-CM intervention codes 28.2 (tonsillectomy), 28.3 (adenotonsillectomy), 28.6 (adenoidectomy) were selected in the period 2000–2006 for a descriptive analysis. A multilevel Poisson regression model was applied to estimate Incidence Rate Ratios (IRR) with 95% Confidence Intervals (CI) for A/T surgery among children aged 2–9 years in 2004–2006, while taking into account clustering of interventions within the 21 Local Health Units.
Results
Through 2000–2006, the overall number of A/T surgeries decreased (-8%); adenoidectomies (-20%) and tonsillectomies (-8%) declined, whereas adenotonsillectomies rose (+18%). Analyses of children aged 2–9 yielded an overall rate of 14.4 surgeries per 1000 person-years (16.1 among males and 12.5 among females), with wide heterogeneity across Local Health Units (range 8.1–27.6). In the random-intercept Poisson regression, adjusting for sex and age, intervention rates were markedly lower among foreign than among Italian children (IRR = 0.57, CI 0.53–0.61). A/T rates in the 10–40 age group (mainly tonsillectomies), computed for each Local Health Unit and introduced into the regression model, accounted for 40% of the Local Health Unit-level variance of pediatric rates (mainly adenoidectomies and adenotonsillectomies).
Conclusion
A/T rates in the Veneto Region, especially adenoidectomies among children aged 2–9 years, remain high notwithstanding a decrease through 2000–2006. A wide heterogeneity according to nationality and Local Health Units is evident. The propensity to A/T surgery of each Local Health Unit is similar in different age groups and for different surgical indications.
doi:10.1186/1472-6963-9-25
PMCID: PMC2647536  PMID: 19200396
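The analysis above models counts of interventions with person-years of exposure and a random intercept for each Local Health Unit. A simplified, non-hierarchical sketch of the core piece (a Poisson GLM with a log person-years offset, whose exponentiated coefficients are IRRs) follows; the random intercept is omitted and the data are simulated for illustration.

```python
# Simplified version of the Poisson model behind the A/T intervention rates:
# counts of surgeries with log person-years as an offset, so exponentiated
# coefficients are incidence rate ratios (IRRs). The random intercept for
# Local Health Units used in the paper is omitted; data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
# One row per Local Health Unit x nationality stratum (21 units, 2 strata each)
df = pd.DataFrame([(lhu, foreign) for lhu in range(21) for foreign in (0, 1)],
                  columns=["lhu", "foreign"])
df["person_years"] = rng.integers(2_000, 30_000, len(df))
df["count"] = rng.poisson(0.014 * np.where(df["foreign"] == 1, 0.6, 1.0)
                          * df["person_years"])

fit = smf.glm("count ~ foreign", data=df, family=sm.families.Poisson(),
              offset=np.log(df["person_years"])).fit()
irr = np.exp(fit.params["foreign"])
ci_low, ci_high = np.exp(fit.conf_int().loc["foreign"])
print(f"IRR (foreign vs Italian children): {irr:.2f} "
      f"(95% CI {ci_low:.2f}-{ci_high:.2f})")
```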
13.  Does language matter? A case study of epidemiological and public health journals, databases and professional education in French, German and Italian 
Epidemiology and public health are usually context-specific. Journals published in different languages and countries play a role both as sources of data and as channels through which evidence is incorporated into local public health practice. Databases in these languages facilitate access to relevant journals, and professional education in these languages facilitates the growth of native expertise in epidemiology and public health. However, as English has become the lingua franca of scientific communication in the era of globalisation, many journals published in non-English languages face a difficult dilemma: either switch to English and compete internationally, or keep the native tongue and accept a restricted circulation among a local readership. This paper discusses the historical development of epidemiology and the current scene of epidemiological and public health journals, databases and professional education in three Western European languages, French, German and Italian, and examines the dynamics and challenges they face today.
doi:10.1186/1742-7622-5-16
PMCID: PMC2570667  PMID: 18826570
14.  Seat belt use among rear passengers: validity of self-reported versus observational measures 
BMC Public Health  2008;8:233.
Background
The effects of seat belt laws and public education campaigns on seat belt use are assessed on the basis of observational or self-reported data on seat belt use. Previous studies focusing on front seat occupants have shown that self-reports indicate greater seat belt usage than observational findings. Whether, and to what extent, this over-reporting applies to rear seat belt usage has yet to be investigated. We aimed to evaluate the over-reporting factor for rear seat passengers and whether this varies by gender and under different compulsory seat belt use conditions.
Methods
The study was conducted in the Veneto Region, an area in the North-East of Italy with a population of 4.7 million. The prevalence of seat belt use among rear seat passengers was determined by means of a cross-sectional self-report survey and an observational study. Both investigations were performed in two time periods: in 2003, when rear seat belt use was not enforced by primary legislation, and in 2005, after rear seat belt use had become compulsory (June 2003). Overall, 8138 observations and 7902 interviews were recorded. Gender differences in the prevalence of rear seat belt use were examined using the chi-square test. The over-reporting factor, defined as the ratio of the self-reported to the observed prevalence of rear seat belt use, was calculated by gender before and after the rear seat belt legislation came into effect.
Results
Among rear seat passengers, self-reported rates were always higher than the observational findings, with an overall over-reporting factor of 1.4. We registered no statistically significant changes over time in the over-reporting factor, nor any major differences between genders.
Conclusion
Self-reported seat belt usage by rear passengers represents an efficient alternative to observational studies for tracking changes in actual behavior, although the reported figures need to be adjusted using an appropriate over-reporting factor in order to gain an idea of genuine seat belt use.
doi:10.1186/1471-2458-8-233
PMCID: PMC2483976  PMID: 18613955
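The over-reporting factor above is defined in the abstract as the ratio of self-reported to observed prevalence, computed by gender and period. A minimal sketch of that calculation follows; the counts are illustrative placeholders, not the survey data.

```python
# Over-reporting factor for rear seat belt use:
# ratio of self-reported prevalence to observed prevalence.
# Counts are illustrative placeholders, not the study data.

def prevalence(users, total):
    return users / total

def over_reporting_factor(reported_users, reported_total,
                          observed_users, observed_total):
    return (prevalence(reported_users, reported_total)
            / prevalence(observed_users, observed_total))

factor = over_reporting_factor(reported_users=2_400, reported_total=4_000,
                               observed_users=1_750, observed_total=4_100)
print(f"over-reporting factor: {factor:.2f}")
# Actual use can then be approximated from a self-reported figure:
print(f"adjusted prevalence from a 60% self-report: {0.60 / factor:.1%}")
```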
15.  Diffusion of good practices of care and decline of the association with case volume: the example of breast conserving surgery 
Background
Several previous studies conducted on cancer registry data and hospital discharge records (HDR) have found an association between hospital volume and the recourse to breast conserving surgery (BCS) for breast cancer. The aim of the current study is to depict concurrent time trends in the recourse to BCS and its association with hospital volume.
Methods
Admissions of breast cancer patients for BCS or mastectomy in the period 2000–2004 were identified from the discharge database of the Veneto Region (Italy). The role of procedural volume (low < 50, medium 50–100, high > 100 breast cancer surgeries/year), and of individual risk factors obtainable from HDR was assessed through a hierarchical log-binomial regression.
Results
Overall, the recourse to BCS was higher in medium-volume (risk ratio = 1.12, 95% confidence interval 1.07–1.18) and high-volume (1.09, 1.03–1.14) than in low-volume hospitals. The proportion of patients treated in low-volume hospitals dropped from 22% to 12%, with a concurrent increase in the activity of medium-volume providers. The increase over time in breast conservation (globally from 56% to 67%) was steeper in low- and medium-volume hospitals than in high-caseload ones.
Conclusion
The growth in the recourse to BCS was accompanied by a decline of the association with hospital volume; larger centers probably acted as early adopters of breast conservation strategies that subsequently spread to smaller providers.
doi:10.1186/1472-6963-7-167
PMCID: PMC2121646  PMID: 17945000
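The risk ratios above come from a log-binomial model, i.e. a binomial GLM with a log link whose exponentiated coefficients are risk ratios. A simplified sketch without the hierarchical hospital level follows; the data are simulated, and it is worth noting that log-binomial models can fail to converge, in which case a Poisson model with robust errors is a common substitute.

```python
# Simplified log-binomial model for the probability of breast conserving
# surgery (BCS) by hospital volume category. The hospital-level hierarchical
# structure is omitted and the data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 3_000
volume = rng.choice(["low", "medium", "high"], size=n, p=[0.2, 0.4, 0.4])
p_bcs = np.select([volume == "low", volume == "medium", volume == "high"],
                  [0.56, 0.63, 0.61])
df = pd.DataFrame({"volume": pd.Categorical(volume, ["low", "medium", "high"]),
                   "bcs": rng.binomial(1, p_bcs)})

fit = smf.glm("bcs ~ volume", data=df,
              family=sm.families.Binomial(link=sm.families.links.Log())).fit()
rr = np.exp(fit.params)                 # exponentiated coefficients = risk ratios
ci = np.exp(fit.conf_int())
print(pd.concat([rr.rename("RR"), ci], axis=1).round(2))
```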
16.  Pattern and determinants of hospitalization during heat waves: an ecologic study 
BMC Public Health  2007;7:200.
Background
Numerous studies have investigated mortality during heatwaves, while few have quantified heat-associated morbidity. Our aim was to investigate the relationship between hospital admissions and the intensity, duration and timing of heatwaves across the summer months.
Methods
The study area (Veneto Region, Italy) had 4,577,408 inhabitants (on January 1st, 2003) and is subdivided into seven provinces with 60 hospitals and about 20,000 beds for acute care. Five heatwaves (three or more consecutive days with Humidex above 40°C) occurred in the region during the summers of 2002 and 2003. From the regional computerized archive of hospital discharge records, we extracted the daily count of hospital admissions of people aged ≥75 years from June 1 through August 31 in 2002 and 2003. Daily hospital admissions for disorders of fluid and electrolyte balance, acute renal failure, and heat stroke (grouped into a single nosologic entity, heat diseases, HD), respiratory diseases (RD), circulatory diseases (CD), and a reference category chosen a priori (fractures of the femur, FF) were independently analyzed by Generalized Estimating Equations.
Results
Heatwave duration, not intensity, increased the risk of hospital admission for HD and RD by 16% (p < .0001) and 5% (p < .0001), respectively, with each additional day. At least four consecutive hot humid days were required to observe a major increase in hospital admissions, the excesses being more than twofold for HD (p < .0001) and about 50% for RD (p < .0001). Hospital admissions for HD, as well as for RD, peaked equally at the first heatwave (early June) and the last heatwave (August) of 2003. No correlation was found for FF or CD admissions.
Conclusion
The first four days of a heatwave had only minor effects, thus supporting heat-health warning systems whose alerts are based on the duration of hot humid days. Although the finding is based on a single late-summer heatwave, adaptation to extreme temperatures in late summer seems unlikely.
doi:10.1186/1471-2458-7-200
PMCID: PMC1988820  PMID: 17688689
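Daily admission counts in the study above are analysed with Generalized Estimating Equations, which accommodate the correlation of counts within a summer. A minimal sketch of a Poisson GEE with an exchangeable working correlation follows; the data, variable names, grouping and correlation structure are illustrative assumptions, not the study's specification.

```python
# Poisson GEE for daily counts of heat-related admissions as a function of
# consecutive heatwave days, with an exchangeable working correlation within
# each summer. Data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
days = 92                                        # June 1 - August 31
frames = []
for summer in (2002, 2003):
    heat_days = rng.integers(0, 7, days)         # consecutive hot humid days so far
    mu = np.exp(2.0 + 0.15 * heat_days)          # ~16% increase per extra day
    frames.append(pd.DataFrame({"summer": summer,
                                "heatwave_days": heat_days,
                                "admissions": rng.poisson(mu)}))
df = pd.concat(frames, ignore_index=True)

fit = smf.gee("admissions ~ heatwave_days", groups="summer", data=df,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(f"admissions increase per extra heatwave day: "
      f"{np.exp(fit.params['heatwave_days']) - 1:+.1%}")
```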
17.  Monitoring the occurrence of diabetes mellitus and its major complications: the combined use of different administrative databases 
Objective
Diabetes mellitus is a growing public health problem, for which efficient and timely surveillance is a key policy. Administrative databases offer relevant opportunities for this purpose. We aim to monitor the incidence of diabetes and its major complications using administrative data.
Study design and methods
We study a population of about 850,000 inhabitants in the Veneto Region (Italy) from the end of 2001 to the end of 2004, using four administrative databases with record linkage. Databases of drug prescriptions and of exemptions from medical charges were linked to identify diabetic subjects; hospital discharge records and mortality data were used to assess macrovascular and renal complications and vital status.
Results
We identified 30,230 and 34,620 diabetic subjects at the start and at the end of the study, respectively. The raw prevalence increased from 38.3/1000 (95% CI 37.2–39.5) to 43.2/1000 (95% CI 42.3–44) for males and from 34.7/1000 (95% CI 33.9–35.5) to 38.1/1000 (95% CI 37.4–39) for females. The mean raw incidence is 5.3/1000 (95% CI 5–5.6) person-years for males and 4.8/1000 (95% CI 4.4–5.2) person-years for females. The rate of hospitalisation for cardiovascular or kidney diseases is greatly increased in diabetic people compared with non-diabetics in both genders. The mortality relative risk is particularly high in younger age classes: diabetic males and females aged 45–64 years have relative risks for death of 1.7 (95% CI 1.58–1.88) and 2.6 (95% CI 2.29–2.97), respectively.
Conclusion
This study provides a feasible and efficient method to determine and monitor the incidence and prevalence of diabetes and the occurrence of its complications along with indexes of morbidity and mortality.
doi:10.1186/1475-2840-6-5
PMCID: PMC1804263  PMID: 17302977
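The prevalence and incidence figures above are, at their core, proportions and person-time rates with confidence intervals. A minimal sketch of both computations follows; the prevalence inputs use the rounded totals quoted in the abstract, while the incidence inputs are illustrative values chosen only to give a rate of a similar magnitude.

```python
# Raw prevalence (per 1,000) with a binomial CI, and incidence (per 1,000
# person-years) with an exact Poisson CI. The incidence inputs are illustrative.
from scipy.stats import chi2
from statsmodels.stats.proportion import proportion_confint

# Prevalence at the end of follow-up (both sexes combined)
diabetics, population = 34_620, 850_000
prev = diabetics / population * 1_000
lo, hi = proportion_confint(diabetics, population, method="wilson")
print(f"prevalence: {prev:.1f}/1000 (95% CI {lo*1000:.1f} - {hi*1000:.1f})")

# Incidence over the observation period (illustrative counts)
new_cases, person_years = 12_000, 2_400_000
rate = new_cases / person_years * 1_000
rl = chi2.ppf(0.025, 2 * new_cases) / 2 / person_years * 1_000
ru = chi2.ppf(0.975, 2 * (new_cases + 1)) / 2 / person_years * 1_000
print(f"incidence: {rate:.1f}/1000 person-years (95% CI {rl:.1f} - {ru:.1f})")
```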
18.  Increased Risk of Hepatocellular Carcinoma and Liver Cirrhosis in Vinyl Chloride Workers: Synergistic Effect of Occupational Exposure with Alcohol Intake 
Environmental Health Perspectives  2004;112(11):1188-1192.
Hepatocellular carcinoma (HCC) and liver cirrhosis (LC) are not well-established vinyl chloride monomer (VCM)–induced diseases. Our aim was to appraise the role of VCM, alcohol intake, and viral hepatitis infection, and their interactions, in the etiology of HCC and LC. Thirteen cases of HCC and 40 cases of LC were separately compared with 139 referents without chronic liver diseases or cancer in a case–referent study nested in a cohort of 1,658 VCM workers. The odds ratios (ORs) and the 95% confidence intervals (CIs) were estimated by common methods and by fitting models of logistic regression. We used Rothman’s synergy index (S) to evaluate interactions. By holding the confounding factors constant at logistic regression analysis, each extra increase of 1,000 ppm × years of VCM cumulative exposure was found to increase the risk of HCC by 71% (OR = 1.71; 95% CI, 1.28–2.44) and the risk of LC by 37% (OR = 1.37; 95% CI, 1.13–1.69). The joint effect of VCM exposure above 2,500 ppm × years and alcohol intake above 60 g/day resulted in ORs of 409 (95% CI, 19.6–8,553) for HCC and 752 (95% CI, 55.3–10,248) for LC; both S indexes suggested a synergistic effect. The joint effect of VCM exposure above 2,500 ppm × years and viral hepatitis infection was 210 (95% CI, 7.13–6,203) for HCC and 80.5 (95% CI, 3.67–1,763) for LC; both S indexes suggested an additive effect. In conclusion, according to our findings, VCM exposure appears to be an independent risk factor for HCC and LC interacting synergistically with alcohol consumption and additively with viral hepatitis infection.
doi:10.1289/ehp.6972
PMCID: PMC1247480  PMID: 15289165
alcohol; case–referent studies; cirrhosis; hepatocellular carcinoma; occupational diseases; vinyl chloride
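Rothman's synergy index S, used above to assess interaction, compares the joint effect of two exposures with the sum of their separate effects on the additive scale; S > 1 suggests more-than-additive (synergistic) action. A minimal sketch of the standard formula follows; the single-exposure odds ratios plugged in are hypothetical, since the abstract reports only the joint ORs.

```python
# Rothman's synergy index: S = (OR11 - 1) / [(OR10 - 1) + (OR01 - 1)],
# where OR11 is the joint-exposure odds ratio and OR10, OR01 are the
# single-exposure odds ratios (each vs the doubly unexposed group).
# The single-exposure ORs below are hypothetical illustrations.

def synergy_index(or_joint: float, or_exp1: float, or_exp2: float) -> float:
    return (or_joint - 1) / ((or_exp1 - 1) + (or_exp2 - 1))

# HCC: VCM > 2,500 ppm-years combined with alcohol > 60 g/day (joint OR = 409)
s = synergy_index(or_joint=409, or_exp1=25, or_exp2=20)
print(f"S (HCC, VCM x alcohol): {s:.1f}",
      "-> more than additive (synergy)" if s > 1 else "-> at most additive")
```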
