PMC

Results 1-12 (12)
 

1.  Argyria: permanent skin discoloration following protracted colloid silver ingestion 
BMJ Case Reports  2009;2009:bcr08.2008.0606.
doi:10.1136/bcr.08.2008.0606
PMCID: PMC3029119  PMID: 21686727
2.  Key Issues in Conducting a Meta-Analysis of Gene Expression Microarray Datasets 
PLoS Medicine  2008;5(9):e184.
Adaikalavan Ramasamy and colleagues outline seven key issues and suggest a stepwise approach in conducting a meta-analysis of microarray datasets.
doi:10.1371/journal.pmed.0050184
PMCID: PMC2528050  PMID: 18767902
3.  The "impact factor" revisited 
The number of scientific journals has become so large that individuals, institutions and institutional libraries cannot completely store their physical content. In order to prioritize the choice of quality information sources, librarians and scientists are in need of reliable decision aids. The "impact factor" (IF) is the most commonly used assessment aid for deciding which journals should receive a scholarly submission or attention from research readership. It is also an often misunderstood tool. This narrative review explains how the IF is calculated, how bias is introduced into the calculation, which questions the IF can or cannot answer, and how different professional groups can benefit from IF use.
doi:10.1186/1742-5581-2-7
PMCID: PMC1315333  PMID: 16324222
4.  Relevance similarity: an alternative means to monitor information retrieval systems 
Background
Relevance assessment is a major problem in the evaluation of information retrieval systems. The work presented here introduces a new parameter, "Relevance Similarity", for the measurement of the variation of relevance assessment. In a situation where individual assessment can be compared with a gold standard, this parameter is used to study the effect of such variation on the performance of a medical information retrieval system. In such a setting, Relevance Similarity is the ratio of assessors who rank a given document the same as the gold standard over the total number of assessors in the group.
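The "Relevance Similarity" ratio defined above can be sketched in a few lines. This is a minimal illustration, not the study's code; the assessor judgements below are invented for demonstration.

```python
def relevance_similarity(assessments, gold):
    """Fraction of assessors whose relevance judgement matches the gold standard."""
    if not assessments:
        raise ValueError("need at least one assessor")
    agree = sum(1 for a in assessments if a == gold)
    return agree / len(assessments)

# Six assessors judge one document (True = relevant); the gold standard says relevant.
# Four of six agree, so the Relevance Similarity for this document is 4/6.
print(relevance_similarity([True, True, False, True, True, False], True))
```

Computed per topic, this ratio lets assessor groups (e.g. by domain knowledge) be compared directly, as done in the Methods below.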
Methods
The study was carried out on a collection of Critically Appraised Topics (CATs). Twelve volunteers were divided into two groups of people according to their domain knowledge. They assessed the relevance of retrieved topics obtained by querying a meta-search engine with ten keywords related to medical science. Their assessments were compared to the gold standard assessment, and Relevance Similarities were calculated as the ratio of positive concordance with the gold standard for each topic.
Results
The similarity comparison among groups showed that a higher degree of agreement exists among evaluators with more subject knowledge. The performance of the retrieval system was not significantly different as a result of the variations in relevance assessment in this particular query set.
Conclusion
In assessment situations where evaluators can be compared to a gold standard, Relevance Similarity provides an alternative evaluation technique to the commonly used kappa scores, which may give paradoxically low scores in highly biased situations such as document repositories containing large quantities of relevant data.
doi:10.1186/1742-5581-2-6
PMCID: PMC1181804  PMID: 16029513
5.  An Entropy-based gene selection method for cancer classification using microarray data 
BMC Bioinformatics  2005;6:76.
Background
Accurate diagnosis of cancer subtypes remains a challenging problem. Building classifiers based on gene expression data is a promising approach; yet the selection of non-redundant but relevant genes is difficult.
The selected gene set should be small enough to allow diagnosis even in regular clinical laboratories and ideally identify genes involved in cancer-specific regulatory pathways. Here an entropy-based method is proposed that selects genes related to the different cancer classes while at the same time reducing the redundancy among the genes.
Results
The present study identifies a subset of features by maximizing the relevance and minimizing the redundancy of the selected genes. A measure called normalized mutual information is employed to quantify both the relevance and the redundancy of the genes. In order to find a more representative subset of features, an iterative procedure is adopted that incorporates an initial clustering followed by data partitioning and the application of the algorithm to each of the partitions. A leave-one-out approach then selects the most commonly selected genes across all the different runs, and the gene selection algorithm is applied again to pare down the list of selected genes until a minimal subset is obtained that gives a satisfactory classification accuracy.
The algorithm was applied to three different data sets, and the results obtained were compared to work done by others using the same data sets.
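The relevance/redundancy trade-off described above can be sketched with a greedy selection loop over normalized mutual information on discretized expression values. This is a generic sketch of the technique, not the authors' algorithm (it omits the clustering, partitioning, and leave-one-out stages); the toy genes and labels are invented.

```python
import math
from collections import Counter

def entropy(xs):
    """Shannon entropy (bits) of a discrete sequence."""
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in Counter(xs).values())

def mutual_info(xs, ys):
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

def normalized_mi(xs, ys):
    denom = max(entropy(xs), entropy(ys))
    return mutual_info(xs, ys) / denom if denom else 0.0

def select_genes(genes, labels, k):
    """Greedy selection: maximize NMI with class labels (relevance),
    penalize mean NMI with already-chosen genes (redundancy)."""
    chosen = []
    while len(chosen) < k:
        best, best_score = None, -1.0
        for name, vals in genes.items():
            if name in chosen:
                continue
            relevance = normalized_mi(vals, labels)
            redundancy = (sum(normalized_mi(vals, genes[c]) for c in chosen)
                          / len(chosen)) if chosen else 0.0
            score = relevance - redundancy
            if score > best_score:
                best, best_score = name, score
        chosen.append(best)
    return chosen

# Toy data: "g1" tracks the class labels perfectly; "g2" is a redundant copy.
labels = [0, 0, 0, 1, 1, 1]
genes = {"g1": [0, 0, 0, 1, 1, 1],
         "g2": [0, 0, 0, 1, 1, 1],
         "g3": [0, 1, 0, 1, 0, 1]}
print(select_genes(genes, labels, 2))
```

With this data the perfectly relevant "g1" is selected first; the redundancy penalty then cancels the relevance of its copy "g2", illustrating why the combined score yields a compact, non-redundant gene set.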
Conclusion
This study presents an entropy-based iterative algorithm for selecting genes from microarray data that are able to classify various cancer sub-types with high accuracy. In addition, the feature set obtained is very compact, that is, the redundancy between genes is reduced to a large extent. This implies that classifiers can be built with a smaller subset of genes.
doi:10.1186/1471-2105-6-76
PMCID: PMC1087831  PMID: 15790388
6.  Volume-based non-continuum modeling of bone functional adaptation 
Background
Bone adapts to mechanical strain by rearranging the trabecular geometry and bone density. The common finite element methods used to simulate this adaptation have inconsistencies regarding material properties at each node and are computationally demanding. Here, a volume-based, non-continuum formulation is proposed as an alternative. Adaptive processes corresponding to various external mechanical loading conditions are simulated for the femur.
Results
Bone adaptations were modeled for one-legged stance, abduction and adduction. One-legged stance generally results in higher bone densities than the other two loading cases. The femoral head and neck are the regions where densities change most drastically under different loading conditions, while the distal area always contains the lowest densities regardless of the loading conditions. In the proposed formulation, the inconsistency of material densities or strain energy densities, which is a common problem of finite element based approaches, is eliminated. The computational task is alleviated through introduction of the quasi-binary connectivity matrix and linearization operations in the Jacobian matrix, making the approach computationally less demanding.
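The kind of adaptation law being simulated can be sketched with a generic strain-energy-driven density update rule (in the spirit of classic bone-remodeling models). This is only an illustration of the class of model; it is not the paper's volume-based, non-continuum formulation, and all constants below are invented.

```python
def update_density(rho, strain_energy, k=0.004, rate=1.0, dt=1.0,
                   rho_min=0.01, rho_max=1.74):
    """Density grows where strain energy per unit density exceeds a set point k,
    and resorbs where it falls below; values are clamped to physiological bounds."""
    stimulus = strain_energy / rho - k
    rho_new = rho + rate * stimulus * dt
    return min(max(rho_new, rho_min), rho_max)

# An overloaded element densifies; an underloaded element resorbs.
print(update_density(0.5, 0.01))   # stimulus positive -> density increases
print(update_density(0.5, 0.001))  # stimulus negative -> density decreases
```

Iterating such a rule over all elements under a given loading case is what produces the density distributions compared above for stance, abduction, and adduction.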
Conclusion
The results demonstrated the viability of the proposed formulation to study bone functional adaptation under mechanical loading.
doi:10.1186/1742-4682-2-6
PMCID: PMC553991  PMID: 15733328
7.  Antiglucocorticoid RU38486 reduces net protein catabolism in experimental acute renal failure 
BMC Nephrology  2005;6:2.
Background
In acute renal failure, a pronounced net protein catabolism occurs that has long been associated with corticoid action. By competitively blocking the glucocorticoid receptor with the potent antiglucocorticoid RU 38486, the present study addressed two questions: to what extent does corticoid action specific to uremia cause the observed muscle degradation, and does inhibition of glucocorticoid action reduce the protein wasting?
Methods
RU 38486 was administered in a dose of 50 mg/kg/24 h for 48 h after operation to fasted bilaterally nephrectomized (BNX) male adult Wistar rats and sham operated (SHAM) controls. Protein turnover was evaluated by high performance liquid chromatography (HPLC) of amino acid efflux in sera from isolated perfused hindquarters of animals treated with RU 38486 versus untreated controls.
Results
Administration of RU 38486 reduces the total amino acid efflux (TAAE) by 18.6% in SHAM and 15.6% in BNX and efflux of the indicator of net protein turnover, phenylalanine (Phe), by 33.3% in SHAM and 13% in BNX animals as compared to the equally operated, but untreated animals. However, the significantly higher protein degradation observed in BNX (0.6 ± 0.2 nmol/min/g muscle) versus SHAM (0.2 ± 0.1 nmol/min/g muscle) rats, as demonstrated by the marker of myofibrillar proteolytic rate, 3-Methylhistidine (3 MH), remains unaffected by administration of RU 38486 (0.5 ± 0.1 v. 0.2 ± 0.1 nmol/min/g muscle in BNX v. SHAM).
Conclusion
RU 38486 does not act on changes of muscular protein turnover specific to uremia but reduces the effect of stress-stimulated elevated corticosterone secretion arising from surgery and fasting. A potentially beneficial effect against stress-induced catabolism in severe illness can be postulated that merits further study.
doi:10.1186/1471-2369-6-2
PMCID: PMC550647  PMID: 15715918
8.  Polymorphisms of the insertion / deletion ACE and M235T AGT genes and hypertension: surprising new findings and meta-analysis of data 
BMC Nephrology  2005;6:1.
Background
Essential hypertension is a common, polygenic, complex disorder resulting from interaction of several genes with each other and with environmental factors such as obesity, dietary salt intake, and alcohol consumption. Since the underlying genetic pathways remain elusive, currently most studies focus on the genes coding for proteins that regulate blood pressure as their physiological role makes them prime suspects.
The present study examines how polymorphisms of the insertion/deletion (I/D) ACE and M235T AGT genes account for presence and severity of hypertension, and embeds the data in a meta-analysis of relevant studies.
Methods
The I/D polymorphisms of the ACE and M235T polymorphisms of the AGT genes were determined by RFLP (restriction fragment length polymorphism) and restriction analysis in 638 hypertensive patients and 720 normotensive local blood donors in Weisswasser, Germany. Severity of hypertension was estimated by the number of antihypertensive drugs used.
Results
No difference was observed in the allele frequencies and genotype distributions of ACE gene polymorphisms between the two groups, whereas AGT TT homozygotes were more frequent in controls (4.6% vs. 2.7%, p = 0.08). This became significant (p = 0.035) in women only. AGT TT genotype was associated with a 48% decrease in the risk of having hypertension (odds ratio: 0.52; 95% CI, 0.28 to 0.96), and this risk decreased more significantly in women (odds ratio: 0.28; 95% CI, 0.1 to 0.78). The meta-analysis showed a pooled odds ratio for hypertension of 1.21 (TT vs. MM, 95% CI: 1.11 to 1.32) in Caucasians. No correlation was found between severity of hypertension and a specific genotype.
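The odds ratios and confidence intervals reported above are conventionally derived from a 2x2 genotype table using Woolf's logit method, which can be sketched as follows. The counts below are invented for illustration and are not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls (Woolf's logit method)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Invented counts: 17 TT cases, 621 non-TT cases, 33 TT controls, 687 non-TT controls.
or_, lo, hi = odds_ratio_ci(17, 621, 33, 687)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

An OR below 1 with a CI excluding 1, as in the women-only analysis above, indicates a significantly reduced risk in carriers of the genotype.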
Conclusion
The ACE I/D polymorphism does not contribute to the presence and severity of essential hypertension, while the AGT M235T TT genotype confers a significantly decreased risk for the development of hypertension in the population studied here. This contrasts with the findings of meta-analyses, in which the T allele is associated with an increased risk for hypertension.
doi:10.1186/1471-2369-6-1
PMCID: PMC546009  PMID: 15642127
9.  Quantitative evaluation of recall and precision of CAT Crawler, a search engine specialized on retrieval of Critically Appraised Topics 
Background
Critically Appraised Topics (CATs) are a useful tool that helps physicians make clinical decisions as healthcare moves towards the practice of Evidence-Based Medicine (EBM). The fast-growing World Wide Web has provided a place for physicians to share their appraised topics online, but an increasing amount of time is needed to find a particular topic within such a rich repository.
Methods
A web-based application, namely the CAT Crawler, was developed by Singapore's Bioinformatics Institute to allow physicians to adequately access available appraised topics on the Internet. A meta-search engine, as the core component of the application, finds relevant topics following keyword input. The primary objective of the work presented here is to evaluate the quantity and quality of search results obtained from the meta-search engine of the CAT Crawler by comparing them with those obtained from two individual CAT search engines. From the CAT libraries at these two sites, all possible keywords were extracted using a keyword extractor. Of those common to both libraries, ten were randomly chosen for evaluation. All ten were submitted to the two search engines individually, and through the meta-search engine of the CAT Crawler. Search results were evaluated for relevance both by medical amateurs and professionals, and the respective recall and precision were calculated.
Results
While achieving an identical recall, the meta-search engine showed a precision of 77.26% (±14.45) compared to the individual search engines' 52.65% (±12.0) (p < 0.001).
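The recall and precision figures above follow the standard information-retrieval definitions, which can be sketched as follows; the document identifiers are invented for illustration.

```python
def recall_precision(retrieved, relevant):
    """retrieved: ordered list of returned document ids;
    relevant: set of ids judged relevant (the gold standard)."""
    hits = [d for d in retrieved if d in relevant]
    recall = len(hits) / len(relevant) if relevant else 0.0
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    return recall, precision

# The engine returns 4 documents; 2 of the 3 truly relevant ones are among them.
r, p = recall_precision(["d1", "d2", "d3", "d4"], {"d1", "d3", "d5"})
print(f"recall={r:.2f}, precision={p:.2f}")  # recall=0.67, precision=0.50
```

A meta-search engine with built-in filters, as evaluated here, can raise precision (fewer irrelevant results returned) while leaving recall unchanged if it discards only irrelevant documents.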
Conclusion
The results demonstrate the validity of the CAT Crawler meta-search engine approach. The improved precision due to inherent filters underlines the practical usefulness of this tool for clinicians.
doi:10.1186/1472-6947-4-21
PMCID: PMC539260  PMID: 15588311
11.  Active collaboration with primary care providers increases specialist referral in chronic renal disease 
BMC Nephrology  2004;5:16.
Background
Late referral to specialist nephrological care is associated with increased morbidity, mortality, and cost. Consequently, nephrologists' associations recommend early referral. The recommendations' effectiveness remains questionable: 22–51% of referrals need renal replacement therapy (RRT) within 3–4 months. This may be due to these recommendations addressing the specialist, rather than the primary care providers (PCP).
The potential of specialist intervention aiming at slowing progression of chronic renal failure was introduced individually to some 250 local PCPs, and referral strategies were discussed. To overcome the PCPs' most often expressed fears, every referred patient was asked to report back to his or her PCP immediately after the initial specialist examination, and new medications were prescribed directly by the specialist and thus charged to the nephrologist's budget.
Methods
In retrospective analysis, the stage of renal disease in patients referred within three months before the introductory round (group A, n = 18), was compared to referrals two years later (group B, n = 50).
Results
The relative number of patients referred in mild/moderate chronic kidney disease (MMCKD) remained stable (28%), while there was a noticeable shift from patients referred in severe chronic kidney disease (SCKD) (group A: 44%; group B: 20%) to patients referred in moderate chronic kidney disease (MCKD) (group A: 28%; group B: 52%).
Conclusion
Individually addressing PCPs' ignorance and concerns noticeably decreased late referral. This stresses the importance of enhancing the PCPs' problem awareness and knowledge of available resources in order to ensure timely specialist referral.
doi:10.1186/1471-2369-5-16
PMCID: PMC529261  PMID: 15498108
12.  Facilitating arrhythmia simulation: the method of quantitative cellular automata modeling and parallel running 
Background
Many arrhythmias are triggered by abnormal electrical activity at the ionic channel and cell level, and then evolve spatio-temporally within the heart. To understand arrhythmias better and to diagnose them more precisely by their ECG waveforms, a whole-heart model is required to explore the association between the massively parallel activities at the channel/cell level and the integrative electrophysiological phenomena at organ level.
Methods
We have developed a method to build large-scale electrophysiological models by using extended cellular automata, and to run such models on a cluster of shared memory machines. We describe here the method, including the extension of a language-based cellular automaton to implement quantitative computing, the building of a whole-heart model with Visible Human Project data, the parallelization of the model on a cluster of shared memory computers with OpenMP and MPI hybrid programming, and a simulation algorithm that links cellular activity with the ECG.
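The class of model described above (an excitable medium simulated with a cellular automaton) can be illustrated with a minimal Greenberg-Hastings-style sketch. This is a generic 1-D toy, not the authors' quantitative whole-heart implementation; the state encoding and grid are invented.

```python
# States: 0 = resting, 1 = excited, 2 = refractory.
def step(grid):
    """One synchronous update of a 1-D excitable-medium cellular automaton."""
    n = len(grid)
    new = grid[:]
    for i in range(n):
        if grid[i] == 1:
            new[i] = 2            # excited -> refractory
        elif grid[i] == 2:
            new[i] = 0            # refractory -> resting
        else:
            # a resting cell fires if a neighbouring cell is excited
            left = grid[i - 1] if i > 0 else 0
            right = grid[i + 1] if i < n - 1 else 0
            if left == 1 or right == 1:
                new[i] = 1
    return new

# A single stimulus at the left end propagates as a travelling wave,
# with a refractory tail that prevents re-excitation behind the front.
g = [1] + [0] * 7
for _ in range(3):
    g = step(g)
print(g)  # [0, 0, 2, 1, 0, 0, 0, 0]
```

Extending such rules to quantitative state variables on a 3-D anatomical grid, and partitioning the grid across shared-memory nodes, is the essence of the method described in this paper.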
Results
We demonstrate that electrical activities at channel, cell, and organ levels can be traced and captured conveniently in our extended cellular automaton system. Examples of some ECG waveforms simulated with a 2-D slice are given to support the ECG simulation algorithm. A performance evaluation of the 3-D model on a four-node cluster is also given.
Conclusions
Quantitative multicellular modeling with extended cellular automata is a highly efficient and widely applicable method to weave experimental data at different levels into computational models. This process can be used to investigate complex and collective biological activities that can be described neither by their governing differential equations nor by discrete parallel computation. Transparent cluster computing is a convenient and effective method to make time-consuming simulation feasible. Arrhythmias, as a typical case, can be effectively simulated with the methods described.
doi:10.1186/1475-925X-3-29
PMCID: PMC517726  PMID: 15339335
