Assessment of the gluten-induced small-intestinal mucosal injury remains the cornerstone of celiac disease diagnosis. Usually the injury is evaluated using grouped classifications (e.g. Marsh groups), but these are often too imprecise and ignore minor but significant mucosal changes. Consequently, there is a need for validated continuous variables in everyday practice and in academic and pharmacological research.
We studied the performance of our standard operating procedure (SOP) on 93 selected biopsy specimens from adult celiac disease patients and non-celiac disease controls. The specimens, which comprised different grades of gluten-induced mucosal injury, were evaluated by morphometric measurements. Specimens with tangential cutting resulting from poorly oriented biopsies were included. Two accredited evaluators performed the measurements in a blinded fashion. The intraobserver and interobserver variations for villus height and crypt depth ratio (VH:CrD) and densities of intraepithelial lymphocytes (IELs) were analyzed by the Bland-Altman method and intraclass correlation.
Unevaluable biopsies according to our SOP were correctly identified. The intraobserver analysis of VH:CrD showed a mean difference of −0.087 with limits of agreement from −0.398 to 0.224; the standard deviation (SD) was 0.159. The mean difference in the interobserver analysis was −0.070, with limits of agreement from −0.516 to 0.375 and an SD of 0.227. The intraclass correlation coefficient was 0.983 for intraobserver variation and 0.978 for interobserver variation. CD3+ IEL density counts in the paraffin-embedded and frozen biopsies showed SDs of 17.1% and 16.5%; the intraclass correlation coefficients were 0.961 and 0.956, respectively.
Using our SOP, quantitative, reliable and reproducible morphometric results can be obtained on duodenal biopsy specimens with different grades of gluten-induced injury. Clinically significant changes were defined according to the error margins (2SD) of the analyses as 0.4 in VH:CrD and as 30% in CD3+-stained IEL density.
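The Bland-Altman quantities reported above (mean difference, SD of the paired differences, 95% limits of agreement as mean ± 1.96 SD, and the 2 SD margin for a clinically significant change) can be sketched as follows. The paired VH:CrD readings below are hypothetical illustration data, not the study measurements.

```python
import statistics

def bland_altman(reader1, reader2):
    """Mean difference, SD of differences, and 95% limits of
    agreement (mean +/- 1.96 SD) for paired measurements."""
    diffs = [a - b for a, b in zip(reader1, reader2)]
    mean_diff = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    loa = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
    return mean_diff, sd, loa

# Hypothetical paired VH:CrD readings from two evaluators
r1 = [2.1, 1.4, 0.6, 3.0, 2.5, 0.9, 1.8, 2.2]
r2 = [2.0, 1.6, 0.5, 3.2, 2.4, 1.0, 1.7, 2.4]
mean_diff, sd, loa = bland_altman(r1, r2)

# A change exceeding twice the method's SD would be read as a
# true (clinically significant) change rather than measurement error
significant_change = 2 * sd
```

Applied to the study's interobserver SD of 0.227, this 2 SD rule yields the ~0.4 VH:CrD margin quoted in the conclusion.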
Urokinase-type plasminogen activator receptor is a multifunctional glycoprotein whose expression is increased during inflammation. It is known to bind β3-integrins, which are essential for the cellular entry of hantaviruses. Levels of the soluble form of the receptor in plasma (suPAR) were evaluated as a predictor of severe Puumala hantavirus (PUUV) infection and as a possible factor involved in the pathogenesis of the disease.
A single-centre prospective cohort study.
Subjects and Methods
Plasma suPAR levels were measured twice during the acute phase and once during convalescence in 97 patients with serologically confirmed acute PUUV infection, using a commercial enzyme-linked immunosorbent assay (ELISA).
The plasma suPAR levels were significantly higher during the acute phase than after hospitalization (median 8.7 ng/ml, range 4.0–18.2 ng/ml vs. median 4.7 ng/ml, range 2.4–12.2 ng/ml; P < 0.001). The maximum suPAR levels correlated with several variables reflecting the severity of the disease. There was a positive correlation with maximum leukocyte count (r = 0.475, P < 0.001), maximum plasma creatinine concentration (r = 0.378, P < 0.001), change in weight during hospitalization (r = 0.406, P < 0.001) and the length of hospitalization (r = 0.325, P = 0.001), and an inverse correlation with minimum platelet count (r = −0.325, P = 0.001) and minimum hematocrit (r = −0.369, P < 0.001).
Plasma suPAR values are markedly increased during acute PUUV infection and associate with the severity of the disease. The overexpression of suPAR possibly activates β3-integrin in PUUV infection, and thus might be involved in the pathogenesis of the disease.
A 2-year-old boy found in cardiac arrest secondary to drowning received standard CPR for 35 minutes and was transported to a tertiary hospital for rewarming from hypothermia.
Chest compressions in hospital were started using the two-thumb encircling hands technique, followed by the two-thumb direct sternal compression technique and, after placement of a sternal force/depth sensor, the classic one-hand technique. Using a CPR recording/feedback defibrillator, quantitative CPR quality data and invasive arterial pressures were available for analysis for 5 hours and 35 minutes.
A total of 316 compressions with the two-thumb encircling hands technique provided a mean (SD) systolic arterial pressure (SAP) of 24 (4) mmHg, mean arterial pressure (MAP) of 18 (3) mmHg and diastolic arterial pressure (DAP) of 15 (3) mmHg. Approximately 6000 compressions with the two-thumb direct compression technique created a mean SAP of 45 (7) mmHg, MAP of 35 (4) mmHg and DAP of 30 (3) mmHg. Approximately 20,000 compressions with the sternal accelerometer in place produced a SAP of 50 (10) mmHg, MAP of 32 (5) mmHg and DAP of 24 (4) mmHg.
Restoration of spontaneous circulation (ROSC) was achieved once the child reached normothermia, accomplished by peritoneal dialysis. Unfortunately, the child died ten hours after ROSC without any signs of neurological recovery.
This case demonstrates improved hemodynamic parameters with the classic one-hand technique with real-time quantitative CPR quality feedback compared with either the two-thumb encircling hands or the two-thumb direct sternal compression technique. We speculate that the improved arterial pressures were related to improved chest compression depth when a real-time CPR recording/feedback device was deployed.
Cardiac Arrest; Child; Quality; CPR
To evaluate the quality of cardiopulmonary resuscitation (CPR) in a physician staffed helicopter emergency medical service (HEMS) using a monitor-defibrillator with a quality analysis feature. As a post hoc analysis, the potential barriers to implementation were surveyed.
The quality of CPR performed by the HEMS from November 2008 to April 2010 was analysed. To evaluate the implementation rate of quality analysis, the HEMS database was screened for all cardiac arrest missions during the study period. As a consequence of the observed low implementation rate, a survey was sent to physicians working in the HEMS to evaluate the possible reasons for not utilizing the automated quality analysis feature.
During the study period, the quality analysis was used for 52 out of 187 patients (28%). In these cases the mean compression depth was < 40 mm in 46% and < 50 mm in 96% of the 1-min analysis intervals, but CPR quality otherwise corresponded with the 2005 resuscitation guidelines. In particular, the no-flow fraction was remarkably low at 0.10 (0.07–0.16). The most common reasons for not using quality-controlled CPR were that the device itself was not taken to the scene or not applied to the patient because another EMS unit was already treating the patient with another defibrillator.
When quality-controlled CPR technology was used, the indicators of good-quality CPR described in the 2005 resuscitation guidelines were mostly achieved, albeit with insufficient compression depth. However, the implementation rate of this well-described technology for improving patient care was low. Wider implementation of the automated quality control and feedback feature in defibrillators could further improve the quality of CPR in the field.
CPR; Quality; Resuscitation; Cardiac arrest; Pre-hospital; HEMS
Maternal enterovirus infections during pregnancy have been linked to an increased risk of type 1 diabetes in the offspring. The aim of this study was to evaluate this association in a unique series of pregnant mothers whose child progressed to clinical type 1 diabetes.
RESEARCH DESIGN AND METHODS
Maternal and in utero enterovirus infections were studied in 171 offspring who presented with type 1 diabetes before the age of 11 years and in 316 control subjects matched for date and place of birth, sex, and HLA-DQ risk alleles for diabetes. Acute enterovirus infections were diagnosed by increases in enterovirus IgG and IgM in samples taken from the mother at the end of the first trimester of pregnancy and cord blood samples taken at delivery.
Signs of maternal enterovirus infection were observed in 19.3% of the mothers of affected children and in 12.0% of the mothers of control children (P = 0.038). This difference was seen across HLA risk groups and in both sexes of the offspring, and it was unrelated to the age of the child at the diagnosis of diabetes or to the age of the mother at delivery.
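A comparison of infection frequencies like the one above (19.3% of 171 case mothers vs 12.0% of 316 control mothers) can be illustrated with a Pearson chi-square test on the 2×2 table. The counts below are reconstructed from the reported percentages (≈33/171 and ≈38/316) purely for illustration; the study itself used a matched design, so its P value came from a matched analysis.

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (no continuity correction)
    for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Counts reconstructed from the reported percentages (illustrative):
# infected vs non-infected mothers among cases and controls
chi2 = chi2_2x2(33, 171 - 33, 38, 316 - 38)

# With 1 degree of freedom, chi2 > 3.84 corresponds to P < 0.05
significant = chi2 > 3.84
```

The resulting statistic is consistent in direction with the reported P = 0.038, but the matched-pair analysis in the study is the authoritative test.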
These results suggest that an enterovirus infection during pregnancy is not a major risk factor for type 1 diabetes in childhood but may play a role in some susceptible subjects.
Evidence suggests that many coeliac disease patients suffer from persistent clinical symptoms and reduced health-related quality of life despite a strict gluten-free diet. We aimed to find predictors for these continuous health concerns in long-term treated adult coeliac patients.
In a nationwide study, 596 patients completed the validated Gastrointestinal Symptom Rating Scale and Psychological General Well-Being questionnaires and were interviewed regarding demographic data, clinical presentation and treatment of coeliac disease, time and place of diagnosis, and the presence of coeliac disease-associated or other co-morbidities. Dietary adherence was assessed by a combination of self-reported adherence and serological tests. Odds ratios and 95% confidence intervals were calculated by binary logistic regression.
Diagnosis at working age, long duration and severity of symptoms before diagnosis, and the presence of thyroid disease, non-coeliac food intolerance or gastrointestinal co-morbidity increased the risk of persistent symptoms. Patients with extraintestinal presentation at diagnosis had fewer current symptoms than subjects with gastrointestinal manifestations. Impaired quality of life was seen in patients with a long duration of symptoms before diagnosis and in those with psychiatric, neurologic or gastrointestinal co-morbidities. Patients with persistent symptoms were more likely to have reduced quality of life.
A variety of factors predisposed to increased symptoms and impaired quality of life in coeliac disease. Based on our results, early diagnosis of the condition and consideration of co-morbidities may help resolve long-lasting health problems in coeliac disease.
Coeliac disease; Symptoms; Quality of life; Gluten-free diet; Adults
Chronic rhinosinusitis (CRS) is an inflammation of the nose and paranasal sinuses lasting ≥12 weeks. Endoscopic sinus surgery (ESS) is considered in difficult-to-treat CRS. The minimally invasive technique focuses on the transition areas rather than on the ostia. The aim of this study was to evaluate symptoms, the number of acute sinusitis episodes, and satisfaction after ESS with either preservation or enlargement of the maxillary sinus ostium. Thirty patients with moderate nonpolypous CRS were enrolled. Uncinectomy only and uncinectomy with additional middle meatal antrostomy were randomized to the two sides of each patient and performed in a single-blinded fashion. Symptom questionnaires were completed at four time points. Significant symptom reduction was achieved independently of the operative technique. The number of acute sinusitis episodes, indicating the exacerbation rate, had decreased significantly at 9 months and, on average, 68 months postoperatively, although the exacerbation rate began to rise again after 9 months. Three revisions were performed on sides with uncinectomy only and one on a side with additional antrostomy. Most patients reported good satisfaction with both procedures. There was a nonsignificant trend for patients with asthma and/or occupational exposure to report dissatisfaction with surgery more frequently, especially with the uncinectomy-only procedure. Both procedures seem to be efficient in providing symptom relief and satisfaction. More studies are needed to evaluate whether patients with risk factors benefit more from an ostium-enlarging procedure.
Antrostomy; chronic sinusitis; endoscopic sinus surgery; ethmoidectomy; maxillary sinus; minimally invasive technique; mucosa; outcomes; quality of life; symptoms
Currently, human adipose stem cells (hASCs) are differentiated towards osteogenic lineages using culture medium supplemented with L-ascorbic acid 2-phosphate (AsA2-P), dexamethasone (Dex) and beta-glycerophosphate (β-GP). Because this osteogenic medium (OM1) was initially generated for the differentiation of bone marrow-derived mesenchymal stem cells, the component concentrations may not be optimal for the differentiation of hASCs. After preliminary screening, two efficient osteogenic media (OM2 and OM3) were chosen to be compared with the commonly used osteogenic medium (OM1). To further develop the culture conditions towards clinical usage, the osteo-inductive efficiencies of OM1, OM2 and OM3 were compared using human serum (HS)-based medium and a defined, xeno-free medium (RegES), with fetal bovine serum (FBS)-based medium serving as a control.
To compare the osteo-inductive efficiency of OM1, OM2 and OM3 in FBS-, HS- and RegES-based medium, the osteogenic differentiation was assessed by alkaline phosphatase (ALP) activity, mineralization, and expression of osteogenic marker genes (runx2A, DLX5, collagen type I, osteocalcin, and ALP).
In HS-based medium, ALP activity was increased significantly by OM3, and mineralization was enhanced by both OM2 and OM3, which have high AsA2-P and low Dex concentrations. ALP activity and mineralization of hASCs were weakest in FBS-based medium, with no significant differences between the OM compositions due to donor variation. However, the qRT-PCR data demonstrated significant upregulation of runx2A mRNA under osteogenic differentiation in FBS- and HS-based medium, particularly by OM3 under FBS conditions. Further, the expression of DLX5 was greatly stimulated by OM1 to OM3 on day 7 compared with the control. The regulation of collagen type I, ALP, and osteocalcin mRNA was modest under induction by OM1 to OM3. The RegES medium was found to support the proliferation and osteogenic differentiation of hASCs, but its composition hindered the comparison of OM1, OM2 and OM3.
Serum conditions affect hASC proliferation and differentiation significantly. ALP activity and mineralization were weakest in FBS-based medium, although osteogenic markers were upregulated at the mRNA level. When comparing the OM compositions, the commonly used OM1 was the least effective. Accordingly, a higher concentration of AsA2-P and a lower concentration of Dex, as in OM2 and OM3, should be used for the osteogenic differentiation of hASCs in vitro.
Human pluripotent stem cells (hPSCs), including human embryonic stem cells (hESCs) and human induced pluripotent stem cells (hiPSCs), are capable of differentiating into any cell type in the human body and thus can be used in studies of early human development, as cell models for different diseases and eventually also in regenerative medicine applications. Since the first derivation of hESCs in 1998, a variety of culture conditions have been described for the undifferentiated growth of hPSCs. In this study, we cultured both hESCs and hiPSCs in three different culture conditions: on mouse embryonic fibroblast (MEF) and SNL feeder cell layers together with conventional stem cell culture medium containing knockout serum replacement and basic fibroblast growth factor (bFGF), as well as on a Matrigel matrix in mTeSR1 medium. hPSC lines were subjected to cardiac differentiation in mouse visceral endodermal-like (END-2) co-cultures and the cardiac differentiation efficiency was determined by counting both the beating areas and Troponin T positive cells, as well as studying the expression of OCT-3/4, mesodermal Brachyury T and NKX2.5 and endodermal SOX-17 at various time points during END-2 differentiation by q-RT-PCR analysis. The most efficient cardiac differentiation was observed with hPSCs cultured on MEF or SNL feeder cell layers in stem cell culture medium and the least efficient cardiac differentiation was observed on a Matrigel matrix in mTeSR1 medium. Further, hPSCs cultured on a Matrigel matrix in mTeSR1 medium were found to be more committed to neural lineage than hPSCs cultured on MEF or SNL feeder cell layers. In conclusion, culture conditions have a major impact on the propensity of the hPSCs to differentiate into a cardiac lineage.
Approximately 1% of the population suffers from coeliac disease. However, the disease is heavily underdiagnosed. Unexplained symptoms may lead to increased medical consultations and productivity losses. The aim here was to estimate the possible concealed burden of untreated coeliac disease and the effects of a gluten-free diet.
A nationwide cohort of 700 newly detected adult coeliac patients was prospectively evaluated. Health care service use and sickness absence from work during the year before diagnosis were compared with those in the general population; the comparison data were obtained from an earlier study. Additionally, the effect of one year of dietary treatment on the aforementioned parameters and on the consumption of pharmaceutical agents was assessed.
Untreated coeliac patients used primary health care services more frequently than the general population. On a gluten-free diet, visits to primary care decreased significantly, from a mean of 3.6 to 2.3. The consumption of medicines for dyspepsia (from 3.7 to 2.4 pills/month) and painkillers (from 6.8 to 5.5 pills/month) and the number of antibiotic courses (from 0.6 to 0.5 prescriptions/year) were reduced. There were no changes in hospitalizations, outpatient visits to secondary and tertiary care, use of other medical services, or sickness absence, but the consumption of nutritional supplements increased on treatment.
Coeliac disease was associated with excessive health care service use and consumption of drugs before diagnosis. Dietary treatment resulted in a diminished burden to the health care system and lower use of on-demand medicines and antibiotic treatment. The results support an augmented diagnostic approach to reduce underdiagnosis of coeliac disease.
Coeliac disease; Gluten-free diet; Burden of illness; Health care service use; Sickness absence
AIM: To investigate the association between serum antibody levels and a subsequent celiac disease diagnosis in a large series of children and adults.
METHODS: Besides subjects with a classical gastrointestinal presentation of celiac disease, the study cohort included a substantial number of individuals with extraintestinal symptoms and individuals found by screening in at-risk groups. Altogether, 405 patients underwent clinical, serological and histological evaluations. After data collection, the antibody values were further graded as low [endomysial antibodies (EmA) 1:5–1:200, transglutaminase 2 antibodies (TG2-ab) 5.0–30.0 U/L] or high (EmA ≥ 1:500, TG2-ab ≥ 30.0 U/L), and the serological results were compared with the small-intestinal mucosal histology and the clinical presentation.
RESULTS: In total, 79% of the subjects with low and 94% of those with high serum EmA titers showed small-bowel mucosal villous atrophy. Furthermore, 96% of the 47 EmA positive subjects who had normal mucosal villi and remained on follow-up either subsequently developed mucosal atrophy while on a gluten-containing diet, or responded positively to a gluten-free diet.
CONCLUSION: Irrespective of the initial serum titers or clinical presentation, EmA positivity as such is a very strong predictor of a subsequent celiac disease diagnosis.
Celiac disease; Diagnosis; Endomysial antibodies; Transglutaminase 2 antibodies; Clinical presentations
Norovirus (NoV) GII-4 has emerged as the predominant NoV genotype in outbreaks of gastroenteritis worldwide. We determined clinical features of NoV GII-4 associated acute gastroenteritis (AGE) in comparison with AGE associated with other NoV types in infants during seasons 2001 and 2002. During the prospective follow-up period, 128 primary infections of AGE due to NoV were identified in 405 infants; of these, GII-4 was found in 40 cases (31%). NoV GII-4 was associated with longer duration of diarrhea and vomiting than other NoV genotypes, suggesting greater virulence of NoV GII-4.
To assess whether the detection of enterovirus RNA in blood predicts the development of clinical type 1 diabetes in a prospective birth cohort study. Further, to study the role of enteroviruses in both the initiation of the process and the progression to type 1 diabetes.
RESEARCH DESIGN AND METHODS
This was a nested case-control study where all case children (N = 38) have progressed to clinical type 1 diabetes. Nondiabetic control children (N = 140) were pairwise matched for sex, date of birth, hospital district, and HLA-DQ–conferred genetic susceptibility to type 1 diabetes. Serum samples, drawn at 3- to 12-month intervals, were screened for enterovirus RNA using RT-PCR.
Enterovirus RNA–positive samples were more frequent among the case subjects than among the control subjects. A total of 5.1% of the samples (17 of 333) in the case group were enterovirus RNA–positive compared with 1.9% of the samples (19 of 993) in the control group (P < 0.01). The strongest risk for type 1 diabetes was related to enterovirus RNA positivity during the 6-month period preceding the first autoantibody-positive sample (odds ratio 7.7 [95% CI 1.9–31.5]). This risk effect was stronger in boys than in girls.
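The sample-level comparison above (17/333 enterovirus RNA-positive samples in cases vs 19/993 in controls) can be illustrated with a crude odds ratio and a Woolf (log-scale) confidence interval. This is an unmatched illustration only; the study's reported OR of 7.7 refers to the 6-month window before seroconversion and comes from a matched analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf 95% CI for a 2x2 table:
    exposed/unexposed counts in cases (a, b) and controls (c, d)."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) by Woolf's method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# RNA-positive vs RNA-negative samples in cases and controls
or_, (lo, hi) = odds_ratio_ci(17, 333 - 17, 19, 993 - 19)
```

The crude OR is about 2.8 with a CI excluding 1, consistent in direction with the reported P < 0.01.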
The present study supports the hypothesis that enteroviruses play a role in the pathogenesis of type 1 diabetes, especially in the initiation of the β-cell damaging process. The enterovirus-associated risk for type 1 diabetes may be stronger in boys than in girls.
Due to the restrictive nature of a gluten-free diet, celiac patients are looking for alternative therapies. While drug-development programs include gluten challenges, knowledge regarding the duration of gluten challenge and gluten dosage is insufficient.
We challenged adult celiac patients with gluten in order to assess the amount needed to cause measurable small-bowel mucosal deterioration.
Twenty-five adults with celiac disease were challenged with low (1–3 g) or moderate (3–5 g) doses of gluten daily for 12 weeks. Symptoms, small-bowel morphology, densities of CD3+ intraepithelial lymphocytes (IELs) and celiac serology were determined.
Both moderate and low amounts of gluten induced small-bowel morphological damage in 67% of the celiac patients. Moderate gluten doses also triggered mucosal inflammation and more gastrointestinal symptoms, leading to premature withdrawal in seven cases. In 22% of those who developed significant small-intestinal damage, symptoms remained absent. Celiac antibodies seroconverted in 43% of the patients.
Even low amounts of gluten can thus cause significant mucosal deterioration in the majority of patients. As there are always some celiac disease patients who will not respond under these conditions, sample sizes must be sufficiently large to attain statistical power in the analysis.
A poly-70L/30DL-lactide (PLA70)–β-tricalcium phosphate (β-TCP) composite implant reinforced with continuous PLA-96L/4D-lactide (PLA96) fibers was designed for in vivo spinal fusion. The pilot study was performed on four sheep, using titanium cage implants as controls. The composite implants failed to direct bone growth as desired, whereas bone contact and proper integration were evident with the controls 6 months after implantation. Therefore, the PLA70/β-TCP composite matrix material was further analyzed in vitro with human and ovine adipose stem cells (hASCs and oASCs). The composites proved to be biocompatible, as confirmed by live/dead assay. The proliferation rate of oASCs was higher than that of hASCs at all times during the 28-day culture period. Furthermore, the composites had only a minor osteogenic effect on oASCs, whereas hASC osteogenesis on the PLA70/β-TCP composites was evident. In conclusion, the composite implant material can be applied with hASCs for tissue engineering but cannot be evaluated in vivo in sheep.
In preclinical studies, human adipose stem cells (ASCs) have shown therapeutic applicability, but standard expansion methods for clinical applications remain to be established. ASCs are typically expanded in medium containing fetal bovine serum (FBS). However, sera and other animal-derived culture reagents pose safety issues in clinical therapy, including possible infections and severe immune reactions. By expanding ASCs in medium containing human serum (HS), this problem can be eliminated. To define how allogeneic HS (alloHS) performs in ASC expansion compared with FBS, a comparative in vitro study of both serum supplements was performed. The choice of serum had a significant effect on ASCs. First, to reach cell proliferation levels comparable with 10% FBS, at least 15% alloHS was required. Second, while genes of the cell cycle pathway were overexpressed in alloHS, genes of the bone morphogenetic protein receptor-mediated arm of the transforming growth factor beta signaling pathway, regulating, for example, osteoblast differentiation, were overexpressed in FBS. This result was further supported by differentiation analysis, in which early osteogenic differentiation was significantly enhanced in FBS. The data presented here underscore the importance of thorough investigation of ASCs before their utilization in cell therapies. This study is a step forward in the understanding of these promising cells.
Recent studies have shown that apoptosis plays a critical role in the pathogenesis of sepsis. High plasma cell free DNA (cf-DNA) concentrations have been shown to be associated with sepsis outcome. The origin of cf-DNA is unclear.
Total plasma cf-DNA was quantified directly in plasma, and amplifiable cf-DNA was assessed using quantitative PCR, in 132 patients with bacteremia caused by Staphylococcus aureus, Streptococcus pneumoniae, β-hemolytic streptococci or Escherichia coli. The quality of cf-DNA was analyzed with a DNA chip assay performed on 8 survivors and 8 nonsurvivors. Values were measured on days 1–4 after positive blood culture, on days 5–17, and on recovery.
The maximum cf-DNA values on days 1–4 (n = 132) were markedly higher in nonsurvivors than in survivors (2.03 vs 1.26 µg/ml, P < 0.001), and the area under the ROC curve for the prediction of case fatality was 0.81 (95% CI 0.69–0.94). cf-DNA at a cut-off level of 1.52 µg/ml showed 83% sensitivity and 79% specificity for fatal disease. High cf-DNA (> 1.52 µg/ml) remained an independent risk factor for case fatality in a logistic regression model. Qualitative analysis showed that cf-DNA displayed a predominant low-molecular-weight band (150–200 bp) in nonsurvivors, corresponding to the size of apoptotic nucleosomal DNA. The cf-DNA concentration showed a significant positive correlation with visually graded apoptotic band intensity (R = 0.822, P < 0.001).
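The sensitivity/specificity figures at the 1.52 µg/ml cut-off above follow directly from a 2×2 classification of patients by that threshold; a minimal sketch is below. The cf-DNA values and fatality flags are hypothetical illustration data, not the study cohort.

```python
def sens_spec(values, outcomes, cutoff):
    """Sensitivity and specificity of the rule 'value > cutoff'
    for predicting outcome == True (here: fatal disease)."""
    tp = sum(1 for v, o in zip(values, outcomes) if o and v > cutoff)
    fn = sum(1 for v, o in zip(values, outcomes) if o and v <= cutoff)
    tn = sum(1 for v, o in zip(values, outcomes) if not o and v <= cutoff)
    fp = sum(1 for v, o in zip(values, outcomes) if not o and v > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical maximum cf-DNA values (ug/ml) and case-fatality flags
cfdna = [2.1, 1.8, 1.3, 0.9, 2.4, 1.7, 1.2, 1.4]
fatal = [True, True, False, False, True, False, True, False]
sensitivity, specificity = sens_spec(cfdna, fatal, cutoff=1.52)
```

Sweeping the cutoff over all observed values and plotting sensitivity against 1 − specificity would trace the ROC curve whose area is reported above.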
Plasma cf-DNA concentration proved to be a specific independent prognostic biomarker in bacteremia. cf-DNA displayed a predominating low-molecular-weight cf-DNA band in nonsurvivors corresponding to the size of apoptotic nucleosomal DNA.
Long pentraxin 3 (PTX3) is an acute-phase protein secreted by various cells, including leukocytes and endothelial cells. Like C-reactive protein (CRP), it belongs to the pentraxin superfamily. Recent studies indicate that high levels of PTX3 may be associated with mortality in sepsis. The prognostic value of plasma PTX3 in bacteremic patients is unknown.
Plasma PTX3 levels were measured in 132 patients with bacteremia caused by Staphylococcus aureus, Streptococcus pneumoniae, β-hemolytic streptococci or Escherichia coli, using a commercial solid-phase enzyme-linked immunosorbent assay (ELISA). Values were measured on days 1–4 after positive blood culture, on days 13–18, and on recovery.
The maximum PTX3 values on days 1–4 were markedly higher in nonsurvivors than in survivors (44.8 vs 6.4 ng/ml, P < 0.001), and the area under the ROC curve for the prediction of case fatality was 0.82 (95% CI 0.73–0.91). PTX3 at a cut-off level of 15 ng/ml showed 72% sensitivity and 81% specificity for fatal disease. High PTX3 (> 15 ng/ml) was associated with hypotension (MAP < 70 mmHg; OR 7.9, 95% CI 3.3–19.0) and a high SOFA score (≥ 4; OR 13.2, 95% CI 4.9–35.4). The CRP level (maximum value on days 1–4) did not predict case fatality at any cut-off level on the ROC curve (P = 0.132). High PTX3 (> 15 ng/ml) remained an independent risk factor for case fatality in a logistic regression model adjusted for potential confounders.
PTX3 proved to be a specific independent prognostic biomarker in bacteremia. PTX3 during the first days after diagnosis showed better prognostic value than CRP, a biomarker widely used in clinical settings. PTX3 measurement offers a novel opportunity for the prognostic stratification of bacteremic patients.
Background and purpose
The use of hip arthroplasties is evidently increasing, but there are few published data on the incidence in young patients.
We used data on total and resurfacing hip arthroplasties (THAs and RHAs) from the Finnish Arthroplasty Register and population data from Statistics Finland to analyze the incidences of THA and RHA in patients aged 30–59 years in Finland, for the period 1980 through 2007.
The combined incidence of THAs and RHAs among 30- to 59-year-old inhabitants increased from 9.5 per 10⁵ inhabitants in 1980 to 61 per 10⁵ inhabitants in 2007. Initially, the incidence of THA was higher in women than in men, but since the mid-1990s the incidences have been similar. The incidence increased in all age groups studied (30–39, 40–49, and 50–59 years), but the increase was 6-fold and 36-fold greater in the two older groups than in the youngest. The incidence of THA was constant; the increased incidence of overall hip arthroplasty was due to the increasing number of RHAs performed.
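Incidence figures like those above are simply annual operation counts normalized to population size (per 100,000, i.e. 10⁵, inhabitants). A quick arithmetic sketch with hypothetical numbers:

```python
def incidence_per_100k(events, population):
    """Annual incidence per 100,000 (10^5) inhabitants."""
    return events / population * 100_000

# Hypothetical example: 1,220 arthroplasties in one year among
# 2,000,000 inhabitants aged 30-59 gives 61 per 10^5 per year
rate = incidence_per_100k(1_220, 2_000_000)
```

The event count and population here are invented for illustration; the study's rates come from the Finnish Arthroplasty Register and Statistics Finland population data.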
We have found a steady increase in the incidence of hip arthroplasty in patients with primary hip osteoarthritis in Finland, with an accelerating trend in the past decade, due to an increase in the incidence of RHA. As the incidence of hip osteoarthritis has not increased, the indications for hip arthroplasty appear to have become broader.
Nephropathia epidemica (NE) is a Scandinavian type of hemorrhagic fever with renal syndrome caused by Puumala hantavirus. The clinical course of the disease varies greatly in severity. The aim of the present study was to evaluate whether plasma C-reactive protein (CRP) and interleukin (IL)-6 levels associate with the severity of NE.
A prospectively collected cohort of 118 consecutive hospital-treated patients with acute serologically confirmed NE was examined. Plasma IL-6, CRP, and creatinine, as well as blood cell count and daily urinary protein excretion were measured on three consecutive days after admission. Plasma IL-6 and CRP levels higher than the median were considered high.
We found that high IL-6 was associated with most variables reflecting the severity of the disease. Compared with patients with low IL-6, patients with high IL-6 had a higher maximum blood leukocyte count (11.9 vs 9.0 × 10⁹/l, P = 0.001) and urinary protein excretion (2.51 vs 1.68 g/day, P = 0.017), as well as a lower minimum blood platelet count (55 vs 80 × 10⁹/l, P < 0.001), hematocrit (0.34 vs 0.38, P = 0.001), and urinary output (1040 vs 2180 ml/day, P < 0.001). They also stayed in hospital longer than patients with low IL-6 (8 vs 6 days, P < 0.001). In contrast, high CRP did not associate with severe disease.
High plasma IL-6 concentrations associate with a clinically severe acute Puumala hantavirus infection, whereas high plasma CRP as such does not reflect the severity of the disease.
To study changes in adolescent snus use from 1981 to 2003, the effects of the total snus sales ban (1995), and the sources of snus acquisition.
Biennial postal surveys in 1981–2003.
Setting and participants
Entire Finland; 12-, 14-, 16-, and 18-year-olds (n = 73,946; 3,105–8,390 per year).
Main outcome measures: Snus use (experimental, daily/occasional), snus acquisition (2001, 2003).
Snus experimentation grew in popularity among 16- and 18-year-old boys before the total sales ban, and in all age and sex groups after it. A decrease was seen between 2001 and 2003, except among 18-year-old boys. Daily/occasional use largely followed the same pattern in boys, whereas in girls daily/occasional use was rare and no significant changes were observed. In 2003, boys experimented with snus more often than girls (12-year-olds 1% v 0%, 14-year-olds 9% v 4%, 16-year-olds 30% v 12%, 18-year-olds 44% v 18%). Hardly any girls used snus daily/occasionally, but 1% of 14-year-old boys, 7% of 16-year-old boys, and 9% of 18-year-old boys did. Of daily/occasional users, 84% acquired snus from friends or acquaintances, 55% from tourist trips to neighbouring countries (Estonia, Sweden), and 7% through sport teams; 24% obtained it from under-the-counter sources. For experimenters, the corresponding figures were 79%, 18%, 0.3%, and 5%.
The total sales ban did not stop snus use; instead, use continued to increase after the ban. Friends who travel to neighbouring countries act as go-betweens, reselling snus. Snus is used even by the youngest adolescents, thus contributing to the nicotine dependence process.
adolescents; snus; sales ban; snus acquisition
Background and purpose Specialist hospitals have reported an incidence of early deep infections of < 1% following primary knee replacement. The purpose of this study was to estimate the infection rate in a nationwide series using register-based data.
Methods The Finnish Arthroplasty Register (FAR) was searched for primary unicompartmental (UKA), total (TKA), and revision knee arthroplasties performed from 1997 through 2003, and for any subsequent revision arthroplasties. The FAR data on revision arthroplasties were supplemented by a search of the national Hospital Discharge Register (HDR) for debridements, partial and total revision knee replacements, resection arthroplasties, arthrodeses, and amputations.
Results During the first postoperative year, 0.33% (95% CI: 0.13–0.84), 0.52% (0.45–0.60) and 1.91% (1.40–2.61) of the primary UKAs, primary TKAs, and revision TKAs, respectively, were reoperated due to infection. The 1-year rate of reoperations due to infection remained constant in all arthroplasty groups over the observation period.
The overall infection rate calculated using FAR data alone was 0.77% (95% CI: 0.69–0.86), which was lower than, but not statistically significantly different from, the overall infection rate calculated using endpoint data combined from FAR and HDR records (0.89%; 95% CI: 0.80–0.99). FAR registered revision arthroplasties and patellar resurfacing arthroplasties reliably but missed a considerable proportion of other reoperations.
Interpretation More reoperations performed due to infection can be expected as the numbers of knee arthroplasties increase, since there has been no improvement in the early infection rate. Finnish Arthroplasty Register data appear to underestimate the incidence of reoperations performed due to infection.
Myocardial diastolic tissue velocities are already reduced in newly diagnosed Type 2 diabetes mellitus (T2D). Poor disease control may lead to left ventricular (LV) systolic dysfunction and heart failure. The aim of this study was to assess the effects of exercise training on myocardial diastolic function in T2D patients without ischemic heart disease.
48 men (52.3 ± 5.6 years) with T2D were randomized to supervised training four times a week plus standard therapy (E), or standard therapy alone (C), for 12 months. Glycated hemoglobin (HbA1c), maximal oxygen consumption (VO2max), and muscle strength (sit-ups) were measured. Tissue Doppler imaging (TDI) was used to determine the average maximal mitral annular early (Ea) and late (Aa) diastolic as well as systolic (Sa) velocities, systolic strain (ε) and strain rate (ε̇) from the septum, and an estimate of left ventricular end-diastolic pressure (E/Ea).
Exercise capacity (VO2max, E 32.0 to 34.7 vs. C 32.6 to 31.5 ml/kg/min, p = .001), muscle strength (E 12.7 to 18.3 repetitions vs. C 14.6 to 14.7 repetitions, p < .001), and HbA1c (E 8.2 to 7.5% vs. C 8.0 to 8.4%, p = .006) improved significantly in the exercise group compared to the controls (ANOVA). Systolic blood pressure decreased in the E group (E 144 to 138 mmHg vs. C 146 to 144 mmHg, p = .04). In contrast to these risk factor changes, diastolic long-axis relaxation did not improve significantly: the early diastolic velocity Ea changed from 8.1 to 7.9 cm/s in the E group vs. 7.4 to 7.8 cm/s in the C group (p = .85, ANOVA). Likewise, after 12 months the mitral annular systolic velocity, systolic strain and strain rate, as well as E/Ea, were unchanged.
Exercise training improves endurance and muscle fitness in T2D, resulting in better glycemic control and reduced blood pressure. However, myocardial diastolic tissue velocities did not change significantly. Our data suggest that a much longer exercise intervention may be needed to reverse diastolic impairment in diabetic patients, if it is reversible at all.
In earlier studies, one in six adults had overactive bladder, which may impair quality of life. However, earlier studies have either not been population-based or have suffered from methodological limitations. Our aim was to assess the prevalence of overactive bladder symptoms, based on a representative study population and using consistent definitions and exclusions.
The aim of the study was to assess the age-standardized prevalence of overactive bladder defined as urinary urgency, with or without urgency incontinence, usually with urinary frequency and nocturia in the absence of urinary tract infection or other obvious pathology. In 2003–2004, a questionnaire was mailed to 6,000 randomly selected Finns aged 18–79 years who were identified from the Finnish Population Register Centre. Information on voiding symptoms was collected using the validated Danish Prostatic Symptom Score, with additional frequency and nocturia questions. Corrected prevalence was calculated with adjustment for selection bias due to non-response. The questionnaire also elicited co-morbidity and socio-demographic information. Of the 6,000 subjects, 62.4% participated. The prevalence of overactive bladder was 6.5% (95% CI, 5.5% to 7.6%) for men and 9.3% (CI, 7.9% to 10.6%) for women. Exclusion of men with benign prostatic hyperplasia reduced prevalence among men by approximately one percentage point (to 5.6% [CI, 4.5% to 6.6%]). Among subjects with overactive bladder, urgency incontinence, frequency, and nocturia were reported by 11%, 23%, and 56% of men and 27%, 38%, and 40% of women, respectively. However, only 31% of men and 35% of women with frequency, and 31% of subjects of both sexes with nocturia reported overactive bladder.
Our results indicate a prevalence of overactive bladder of less than 8%, suggesting that previous studies have overestimated its occurrence owing to vague criteria, study populations unrepresentative in age distribution, and low participation rates.
This study was based on data from the Finnish Arthroplasty Register. From 1990 to 1999, 33,154 primary hip arthroplasties were performed in Finland. Only periprosthetic fractures treated by revision arthroplasty were registered. The six most commonly used femoral components were compared using survival analysis and Cox's regression model. The incidence of periprosthetic fractures was calculated separately for the years 1990–1994 and 1995–1999, and was greater in the first period than in the second. Survival analysis and Cox's regression model showed that gender, prosthesis type, and age were not significant risk factors for periprosthetic fractures.