Background: Biospecimens are essential resources for advancing basic and translational research. However, little data are available regarding the costs associated with operating a biobank, and there are few resources to support biobanks' long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp.
Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs.
Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks.
Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data were then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products, and services; establish pricing; allocate costs for biospecimens based on percent cost recovered; and perform project-specific cost analyses and financial forecasting.
A brief overview of biorepository sustainability from the perspective of a federated biorepository system at the University of Iowa Carver College of Medicine is presented. The ongoing evolution of the federation and the efforts to improve efficacy and efficiency are described. The key sustainability factors identified are adaptability, focus, collaboration/networking, and service improvement.
Sustainability in the biobanking community has recently become an important and oft-discussed issue as biorepositories struggle to balance limited external funding and complex cost recovery models with high operating costs and the desire to provide the highest quality materials and services to the research community. A multi-faceted view of biobanking sustainability requires consideration of operational and social sustainability in addition to the historical focus exclusively on financial sustainability. Planning and implementing this three-pillar model creates a well-rounded biorepository that meets the needs of all the major stakeholders: the funders, the patients/depositors, and the researcher recipients. Often, creating a detailed business plan is the first step in developing goals and objectives that lead toward sustainability. The definition of sustainability and the complexity of a sustainable business plan may differ for each biorepository. The DNASU Plasmid Repository at Arizona State University stores and distributes DNA plasmids to researchers worldwide, and the Biobank Core Facility at St. Joseph's Hospital and Barrow Neurological Institute consents patients and collects, stores, and distributes human tissue and blood samples. We will discuss these two biorepositories, their similar and different approaches to sustainability and business planning, their challenges in creating and implementing their sustainability plans, and their responses to some of these challenges. From these experiences, the biobanks share lessons learned about planning for sustainability that are applicable to all biorepositories.
Cryopreservation of biological materials such as cells, tissues, and organs is a prevailing topic of high importance. It is employed not only in many research fields but also in the clinical area. Cryopreservation is of great importance for reproductive medicine and clinical studies, as well as for the development of vaccines. Peripheral blood mononuclear cells (PBMCs) are commonly used in vaccine research, where comparable and reliable results between different research institutions and laboratories are of high importance. Whereas freezing and thawing processes are well studied, controlled, and standardized, storage conditions are often disregarded. To close this gap, we investigated the influence of suboptimal storage conditions during low-temperature storage on PBMC viability, recovery, and T cell functionality. For this purpose, PBMCs were isolated and exposed, with the help of a robotic system in a low-temperature environment, to 0 to 350 temperature fluctuation cycles in steps of 50 cycles to simulate storage conditions in large biorepositories with sample storage, removal, and sorting functions. After the simulation, the viability, recovery, and T cell functionality were analyzed to determine the number of temperature rises that ultimately leads to significant cell damage. All studied parameters decreased with increasing number of temperature cycles. In some cases, a significant effect was observed after as few as 50 temperature cycles. These results are very important for all fields in which cell cryopreservation is employed, particularly for clinical and multicenter studies wherein the comparability and reproducibility of results play a crucial role. To obtain reliable results and to maintain the quality of the cells, not only the freezing and thawing processes but also the storage conditions should be controlled and standardized, and any deviations should be documented.
PBMC; cryopreservation; temperature fluctuations; T-cell functionality; viability; recovery
Sharing data in biomedical contexts has become increasingly relevant, but privacy concerns set constraints for free sharing of individual-level data. Data protection law protects only data relating to an identifiable individual, whereas “anonymous” data are free to be used by everybody. Usage of many terms related to anonymization is often not consistent among different domains such as statistics and law. The crucial term “identification” seems especially hard to define, since its definition presupposes the existence of identifying characteristics, leading to some circularity. In this article, we present a discussion of important terms based on a legal perspective that is outlined before we present issues related to the usage of terms such as unique “identifiers,” “quasi-identifiers,” and “sensitive attributes.” Based on these terms, we have tried to circumvent a circular definition for the term “identification” by making two decisions: first, deciding which (natural) identifier should stand for the individual; second, deciding how to recognize the individual. In addition, we provide an overview of anonymization techniques/methods for preventing re-identification. The discussion of basic notions related to anonymization shows that there is some work to be done in order to achieve a mutual understanding between legal and technical experts concerning some of these notions. Using a dialectical definition process in order to merge technical and legal perspectives on terms seems important for enhancing mutual understanding.
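As a purely illustrative sketch of the kind of anonymization techniques surveyed in the article above, the snippet below checks k-anonymity over a set of quasi-identifiers and applies a simple generalization step. The records, attribute names, and generalization rules are hypothetical examples, not drawn from the article:

```python
from collections import Counter

# Hypothetical records and quasi-identifiers, for illustration only.
records = [
    {"zip": "12345", "age": 34, "diagnosis": "A"},
    {"zip": "12345", "age": 34, "diagnosis": "B"},
    {"zip": "12345", "age": 35, "diagnosis": "A"},
    {"zip": "54321", "age": 35, "diagnosis": "C"},
]
quasi_identifiers = ("zip", "age")

def is_k_anonymous(records, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs at least k times."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

def generalize(record):
    """Coarsen quasi-identifiers (age bucketing, ZIP suppression) to enlarge
    equivalence classes; the specific rules here are arbitrary examples."""
    out = dict(record)
    out["age"] = record["age"] // 10 * 10
    out["zip"] = "*"
    return out

print(is_k_anonymous(records, quasi_identifiers, 2))                           # False
print(is_k_anonymous([generalize(r) for r in records], quasi_identifiers, 2))  # True
```

The example also hints at the circularity problem discussed above: deciding which attributes count as quasi-identifiers is itself a judgment call that precedes any formal check.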
anonymization; data protection; identity; re-identification
High-quality human DNA samples and associated information of individuals are necessary for biomedical research. Biobanks act as a support infrastructure for the scientific community by providing a large number of high-quality biological samples for specific downstream applications. For this purpose, biobank methods for sample preparation must ensure the usefulness and long-term functionality of the products obtained. Quality indicators are the tool to measure these parameters, with purity and integrity determination being those specifically used for DNA. This study analyzes the quality indicators in DNA samples derived from 118 frozen human tissues in optimal cutting temperature (OCT) reactive, 68 formalin-fixed paraffin-embedded (FFPE) tissues, 119 frozen blood samples, and 26 saliva samples. The results obtained for DNA quality are discussed in association with the usefulness for downstream applications and availability of the DNA source in the target study. In brief, if any material is acceptable, blood is the most practical option for prospective collection of samples that provide high-quality DNA. However, if diseased tissue is a requisite or such samples are already available, the recommended source of DNA would be frozen tissue. These conclusions will determine the best source of DNA, according to the planned downstream application. Furthermore, our results support the conclusion that a complete procedure of DNA quantification and qualification is necessary to guarantee the appropriate management of the samples, avoiding low confidence results, high costs, and a waste of samples.
Even though an increasing portion of biomedical research today relies on the use of bioresources, at present biobankers are not able to trace this use in the scientific literature and measure its impact with a variety of citation metrics. The “BRIF (Bioresource Research Impact Factor) and journal editors” subgroup was created precisely with the aim of studying this issue and building a standardized system to cite bioresources in journal articles. This report presents a guideline for Citation of BioResources in journal Articles (CoBRA). The guideline offers for the first time a standard for citing bioresources (including biobanks) within journal articles. It will increase their visibility and promote their sharing.
The ability to compact and inject the cat germinal vesicle (GV) into a recipient cytoplast allows exploration of a new fertility preservation strategy that avoids whole oocyte freezing. The objective of the present study was to understand the impact of water loss and storage time on GV DNA integrity. Immature cat oocytes were exposed to 1.5 M trehalose for 10 min before microwave-assisted dehydration for 0, 5, 10, 15, 20, 25, 30, or 40 min. Oocytes then were rehydrated to assess chromatin configuration and the incidence of DNA fragmentation (TUNEL assay). The moisture content progressively decreased (p<0.05) from 1.7 to 0.1 gH2O/gDW over the first 30 min, but did not decrease further (p>0.05) after 40 min. Chromatin configuration was unaffected (p>0.05) over time. The percentage of GVs with DNA fragmentation was unaltered (p>0.05) from 0 to 30 min of treatment (range, 6.1%–12%), but increased (p<0.05) to 32.5% after 40 min. Next, the influence of storage at two different supra-zero temperatures after 30 min of drying was investigated. Oocyte-loaded, microwave-treated filters were individually sealed in Dri-Shield moisture barrier bags and stored at 4°C or ambient temperature for 0 to 8 weeks. Moisture contents gradually decreased (p<0.05) from 0.12 to 0.10 gH2O/gDW over 8 weeks of storage at 4°C or ambient temperature. The percentage of GVs with DNA fragmentation more than doubled (p<0.05) from day 0 (14.3%) to day 2 (30.0%–33.0%), but remained stable (p>0.05) thereafter (1 through 4 weeks, 25.0%–35.0%). Collective results demonstrate the feasibility of using microwave processing to dehydrate the mammalian GV to a moisture content that is nonlethal and enables nonfrozen storage, an alternative approach for preserving the maternal genome at cool or ambient temperature.
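The moisture contents above are reported gravimetrically, in grams of water per gram of dry weight (gH2O/gDW). Assuming the standard gravimetric definition (the helper and the masses below are our illustration, not values or code from the study), the conversion is:

```python
def moisture_content(wet_mass_g, dry_mass_g):
    """Gravimetric moisture content in g H2O per g dry weight (gH2O/gDW):
    mass of water (wet minus dry) divided by the residual dry mass."""
    return (wet_mass_g - dry_mass_g) / dry_mass_g

# Illustrative masses only (not measurements from the study):
print(round(moisture_content(2.7, 1.0), 2))  # 1.7 gH2O/gDW, like the starting value above
print(round(moisture_content(1.1, 1.0), 2))  # 0.1 gH2O/gDW, like the dried endpoint
```

On this scale a sample can hold more than 1 gH2O/gDW, which is why the initial value of 1.7 is possible even though it exceeds 1.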
Biomedical investigators require high quality human tissue to support their research; thus, an important aspect of the provision of tissues by biorepositories is the assurance of high quality and consistency of processing specimens. This is best accomplished by a quality management system (QMS). This article describes the basis of a QMS program designed to aid biorepositories that want to improve their operations. In 1983, the UAB Tissue Collection and Biobanking Facility (TCBF) introduced a QMS program focused on providing solid tissues to support a wide range of research; this QMS included a quality control examination of the specific specimens provided for research. Similarly, the Division of Laboratory Sciences at the Centers for Disease Control and Prevention (CDC) introduced a QMS program for their laboratory analyses, focused primarily on bodily fluids. The authors of this article bring together the experience of the QMS programs at these two sites to facilitate the development or improvement of quality management systems of a wide range of biorepositories.
Background: There is growing consensus that individual genetic research results that are scientifically robust, analytically valid, and clinically actionable should be offered to research participants. However, the general practice in European research projects is that results are usually not provided to research participants for many reasons. This article reports on the views of European experts and scholars who are members of the European COST Action CHIP ME IS1303 (Citizen's Health through public-private Initiatives: Public health, Market and Ethical perspectives) regarding challenges to the feedback of individual genetic results to research participants in Europe and potential strategies to address these challenges.
Materials and Methods: A consultation of the COST Action members was conducted through an email survey and a workshop. The results from the consultation were analyzed following a conventional content analysis approach.
Results: Legal frameworks, professional guidelines, and financial, organizational, and human resources to support the feedback of results are largely missing in Europe. Necessary steps to facilitate the feedback process include clarifying legal requirements to the feedback of results, developing harmonized European best practices, promoting interdisciplinary and cross-institutional collaboration, designing educational programs and cost-efficient IT-based platforms, involving research ethics committees, and documenting the health benefits and risks of the feedback process.
Conclusions: Coordinated efforts at pan-European level are needed to enable equitable, scientifically sound, and socially robust feedback of results to research participants.
Storage of labile RNA in laboratories is accomplished through ultra-low freezing of the nucleic acids. This, however, requires expensive freezers, convenient storage, reliable electrical power, and increased shipping costs, thereby making it a less viable option. Biomatrica (San Diego, CA) has created RNAstable®, a stabilization reagent that is used to store RNA in a dehydrated state at room temperature (RT) and protects the RNA from degradation. Our objective was to investigate the sequence integrity and suitability of RNA when stored in RNAstable for extended time periods and at varying temperatures through use of Illumina and Agilent RNA expression microarrays. We observed in Bioanalyzer electropherograms that total RNA extracted from 293 cells stored at RT in RNAstable for 4.5 and 11.5 months is similar in quality to RNA stored at −80°C. Illumina mRNA expression array QC metrics and gene expression patterns from RNAstable-protected RNA, in contrast to RNA stored without RNAstable, correlated well with those of freezer controls. Significantly, when RNA was stored in RNAstable at 45°C for 4.5 months, equivalent to 22 months RT storage, RNA quality, microarray probe signal intensities, probe detection rates, and expression profiles remained similar between RNAstable-protected RNA at RT and the −80°C controls. At 10.5 months, miRNA levels were compared among the storage conditions using miRNA expression arrays. Here too we found strong concordance between miRNA expression patterns when total RNA was stored in RNAstable or at −80°C. Further, Bioanalyzer electrophoresis of RNAstable-protected samples stored at RT for a relative total of 33 months or 50.5 months showed comparable integrity scores to those of −80°C controls. We conclude that use of RNAstable holds promise as an effective stabilization reagent for total RNA and should be useful in situations where shipping and storage options are limited resources.
Objective: Biorepositories have been key resources in examining genetically linked diseases, particularly cancer. Asian Americans contribute to biorepositories at lower rates than other racial groups, but the reasons for this are unclear. We hypothesized that attitudes toward biospecimen research mediate the relationship between demographic and healthcare access factors and willingness to donate blood for research purposes among individuals of Korean heritage.
Methods: Descriptive statistics and bivariate analyses were utilized to characterize the sample with respect to demographic, psychosocial, and behavioral variables. Structural equation modeling with 5000-resample bootstrapping was used to assess each component of the proposed simple mediation models.
Results: Attitudes towards biospecimen research fully mediate associations between age, income, number of years lived in the United States, and having a regular physician and willingness to donate blood for the purpose of research.
Conclusion: Participants were willing to donate blood for the purpose of research despite having neutral feelings towards biospecimen research as a whole. Participants reported higher willingness to donate blood for research purposes when they were older, had lived in the United States longer, had higher income, and had a regular doctor that they visited. Many of the significant relationships between demographic and health care access factors, attitudes towards biospecimen research, and willingness to donate blood for the purpose of research may be explained by the extent of acculturation of the participants in the United States.
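As a schematic of the analysis described in the Methods above (a simple mediation model assessed with 5000 bootstrap resamples), the sketch below estimates an indirect effect on synthetic data. The variable names, effect sizes, and sample size are hypothetical, and two least-squares regressions stand in for the study's full structural equation model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data for a simple mediation: X (e.g., years in the US) -> M
# (attitudes toward biospecimen research) -> Y (willingness to donate).
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)                # path a: mediator depends on X
y = 0.4 * m + 0.1 * x + rng.normal(size=n)      # path b plus a small direct effect

def indirect_effect(x, m, y):
    """Estimate the indirect effect a*b from two regressions."""
    a = np.polyfit(x, m, 1)[0]                  # slope of M on X
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # slope of Y on M given X
    return a * b

# Percentile bootstrap with 5000 resamples, as in the abstract's Methods.
boot = np.empty(5000)
for i in range(5000):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
ci = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect ~ {indirect_effect(x, m, y):.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

Full mediation, as reported in the Results, corresponds to a bootstrap confidence interval for the indirect effect that excludes zero while the remaining direct effect is nonsignificant.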
Enzymatic degradation is a major concern in peptide analysis. Postmortem metabolism in biological samples entails considerable risk for measurements misrepresentative of true in vivo concentrations. It is therefore vital to find reliable, reproducible, and easy-to-use procedures to inhibit enzymatic activity in fresh tissues before subjecting them to qualitative and quantitative analyses. The aim of this study was to test a benchtop thermal stabilization method to optimize measurement of endogenous opioids in brain tissue. Endogenous opioid peptides are generated from precursor proteins through multiple enzymatic steps that include conversion of one bioactive peptide to another, often with a different function. Ex vivo metabolism may, therefore, lead to erroneous functional interpretations. The efficacy of heat stabilization was systematically evaluated in a number of postmortem handling procedures. Dynorphin B (DYNB), Leu-enkephalin-Arg6 (LARG), and Met-enkephalin-Arg6-Phe7 (MEAP) were measured by radioimmunoassay in rat hypothalamus, striatum (STR), and cingulate cortex (CCX). Also, simplified extraction protocols for stabilized tissue were tested. Stabilization affected all peptide levels to varying degrees compared to those prepared by standard dissection and tissue handling procedures. Stabilization increased DYNB in hypothalamus, but not STR or CCX, whereas LARG generally decreased. MEAP increased in hypothalamus after all stabilization procedures, whereas for STR and CCX, the effect was dependent on the time point for stabilization. The efficacy of stabilization allowed samples to be left for 2 hours at room temperature (20°C) without changes in peptide levels. This study shows that conductive heat transfer is an easy-to-use and efficient procedure for the preservation of the molecular composition in biological samples.
Region- and peptide-specific critical steps were identified and stabilization enabled the optimization of tissue handling and opioid peptide analysis. The result is improved diagnostic and research value of the samples with great benefits for basic research and clinical work.
Introduction: Clinical, biodiversity, and environmental biobanks share many data standards, but there is a lack of harmonization on how data are defined and used among biobank fields. This article reports the outcome of an interactive, multidisciplinary session at a meeting of the European, Middle Eastern, and African Society for Biopreservation and Biobanking (ESBB) designed to encourage a ‘learning-from-each-other’ approach to achieve consensus on data needs and data management across biobank communities.
Materials, Methods, and Results: The Enviro-Bio and ESBBperanto Working Groups of the ESBB co-organized an interactive session at the 2013 conference (Verona, Italy), presenting data associated with biobanking processes, using examples from across different fields. One hundred sixty (160) diverse biobank participants were provided electronic voting devices with real-time screen display of responses to questions posed during the session. The importance of data standards and robust data management was recognized across the conference cohort, along with the need to raise awareness about these issues within and across different biobank sectors.
Discussion and Conclusion: While interactive sessions require a commitment of time and resources, and must be carefully coordinated for consistency and continuity, they stimulate the audience to be pro-active and direct the course of the session. This effective method was used to gauge opinions about significant topics across different biobanking communities. The votes revealed the need to: (a) educate biobanks in the use of data management tools and standards, and (b) encourage a more cohesive approach for how data and samples are tracked, exchanged, and standardized across biobanking communities. Recommendations for future interactive sessions are presented based on lessons learned.
Data from a recent ovarian cancer biomarker study using serum aliquots from the Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial Biorepository showed that CA125II concentrations in these aliquots were significantly lower than those previously measured in the same subjects from the same blood draw. We designed an experiment to investigate whether samples used in the study (reference study) were compromised during the aliquoting process. We measured CA125II in the “sister” vials created during the same aliquoting process as the reference study aliquot, and in “cousin” vials newly aliquoted from another parent vial from the same blood draw, from 15 healthy controls in the study. Because the sister vials were created in a specific order, we also assessed whether there was a CA125II concentration gradient among the sisters. The Wilcoxon signed-rank test was used to test the statistical significance of the observed differences. Mean CA125II concentration (volume-averaged) was greater in the sisters than the cousins in all 15 subjects (p<0.001). The mean coefficient of variation was 0.25 (range: 0.12–0.43) in the sisters and 0.11 (range: 0.–1.1) in the cousins (p<0.008). The mean ratio of CA125II in the 5th aliquoted versus the 3rd aliquoted sister vial was 1.66 (1.25–2.5, p<0.001). These data suggest that the parent vials were not adequately mixed before they were aliquoted. CA125II in serum can partially precipitate to form a concentration gradient in long-term storage. Rigorous vortexing after thawing and before aliquoting is thus critical.
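The paired comparison above (sister versus cousin vials for 15 subjects, tested with the Wilcoxon signed-rank test) can be illustrated with a short sketch. The concentrations below are synthetic stand-ins, not PLCO data; scipy's `wilcoxon` provides the signed-rank test:

```python
import numpy as np
from scipy.stats import wilcoxon

# Synthetic CA125II concentrations (arbitrary units) for 15 subjects; sister
# vials are made systematically higher, mimicking the pattern reported above.
rng = np.random.default_rng(1)
cousins = rng.uniform(10.0, 30.0, size=15)            # newly aliquoted vials
sisters = cousins * rng.uniform(1.1, 1.6, size=15)    # original-batch vials

# Wilcoxon signed-rank test on the paired per-subject values: with all 15
# differences in the same direction, the p-value is very small.
stat, p = wilcoxon(sisters, cousins)
print(f"statistic={stat}, p={p:.4g}")
```

A nonparametric paired test is a sensible choice here because the per-subject differences need not be normally distributed and the sample is small.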
Background: Stable dry-state storage of DNA is desirable to minimize required storage space and to reduce electrical and shipping costs. DNA purified from various commercially available dry-state stabilization matrices has been used successfully in downstream molecular applications (e.g., quantitative polymerase chain reaction [qPCR], microarray, and sequence-based genotyping). However, standard DNA storage conditions still include freezing of DNA eluted in aqueous buffers or nuclease-free water. Broad implementation of dry-state, long-term DNA storage requires enhancement of such dry-state DNA stabilization products to control for temperature fluctuations at specimen collection, transit, and storage. This study tested the integrity of genomic DNA subjected to long-term storage on GenTegra™ DNA stabilization matrices (GenTegra LLC, Pleasanton, CA) at extreme conditions, as defined by a 4-year storage period at ambient temperature with an initial incubation for 7 months at 37°C, 56°C, or ambient temperature. Subsequently, purified DNA performance and integrity were measured by qPCR and next-generation sequencing (NGS)-based human leukocyte antigen (HLA) genotyping.
Results: High molecular weight genomic DNA samples were recovered from the GenTegra product matrix and exhibited integrity comparable to a highly characterized commercial standard under assessment by qPCR. Samples were genotyped for classical HLA loci using next-generation sequencing-based methodology on the Roche 454 GS Junior instrument. Amplification efficiency, sequence coverage, and sequence quality were all comparable with those produced from cell line DNA sequenced as a control. No significant differences were observed in the mean, median, or mode quality scores between samples and controls (p≥0.4).
Conclusions: Next-generation HLA genotyping was chosen to test the integrity of GenTegra-treated genomic DNA due to the requirement for long sequence reads to genotype the highly polymorphic classical HLA genes. Experimental results demonstrate the efficacy of the GenTegra product as a suitable genomic DNA preservation tool for collection and long-term biobanking of DNA at fluctuating and high temperatures.
The Collaborative (formerly the Cooperative) Human Tissue Network (CHTN) is a federally funded, service-oriented grant that provides high-quality biospecimens and services to the research community. The CHTN consists of six institutions located throughout the United States to assist investigators in obtaining research specimens required for basic research. The CHTN divisions have similar operating goals; however, each division is responsible for maintaining operations at its local institution. This requires the divisions to identify ways to maintain and sustain operations in a challenging federally funded environment, especially when the number of investigators requesting services drives the operation. Sustainability plans and goals are often patched together out of necessity rather than developed through a thoughtful approach that clearly defines and aligns activities with business strategy and priorities. The CHTN Western Division at Vanderbilt University Medical Center (CHTN-WD) has responded to this challenge of biospecimen resource sustainability in the face of diminished funding by continually identifying ways to innovate our processes through IT enhancements and requiring that the innovation produce measurable and relevant criteria for credibly reporting our operations progress and performance issues. With these overarching goals in mind, CHTN-WD underwent a Lean Six Sigma (LSS) series to identify operational inefficiencies that could be addressed by redesigning workflow and innovating the processes using IT solutions.
The result of this internal collaborative innovation process was the implementation of an error-reporting module (ERM) hosted within our biorepository donor IT application, which allowed staff to report errors immediately; determine the operational area responsible; assess the severity of the error; determine the course of action; determine if standard operating procedure (SOP) revisions were required; and, through automated e-mails, alert the area personnel responsible. The module provides a data-reporting feature by date range and area of operation for management and analysis.
The challenges facing biobanks are changing as expectations shift from simple collections of materials to quality-assured, fit-for-purpose, clinically annotated samples. As a result, the informatics awareness and capabilities of a biobank are now intrinsically related to quality. A biobank may be considered a data repository, in the form of raw data (the unprocessed samples), data surrounding the samples (processing and storage conditions), supplementary data (such as clinical annotations), and an increasing ethical requirement for biobanks to have a mechanism for researchers to return their data. The informatics capabilities of a biobank are no longer simply knowing sample locations; instead, these capabilities will become a distinguishing factor in the ability of a biobank to provide appropriate samples. There is an increasing requirement for biobanking systems (whether in-house or commercially sourced) to ensure the informatics systems stay apace with the changes being experienced by the biobanking community. In turn, there is a requirement for biobanks to have a clear informatics policy and directive that is embedded into the wider decision-making process. As an example, the Breast Cancer Campaign Tissue Bank was a collaboration between four individual and diverse biobanks in the UK, and an informatics platform has been developed to address the challenges of running a distributed network. From developing such a system, key observations emerged about what can and cannot be achieved by informatics in isolation. This article will highlight some of the lessons learned during this development process.
The Genotype-Tissue Expression (GTEx) project, sponsored by the NIH Common Fund, was established to study the correlation between human genetic variation and tissue-specific gene expression in non-diseased individuals. A significant challenge was the collection of high-quality biospecimens for extensive genomic analyses. Here we describe how a successful infrastructure for biospecimen procurement was developed and implemented by multiple research partners to support the prospective collection, annotation, and distribution of blood, tissues, and cell lines for the GTEx project. Other research projects can follow this model and form beneficial partnerships with rapid autopsy and organ procurement organizations to collect high quality biospecimens and associated clinical data for genomic studies. Biospecimens, clinical and genomic data, and Standard Operating Procedures guiding biospecimen collection for the GTEx project are available to the research community.
Availability of and access to data and biosamples are essential in medical and translational research, where their reuse and repurposing by the wider research community can maximize their value and accelerate discovery. However, sharing human-related data or samples is complicated by ethical, legal, and social sensitivities. The specific ethical and legal requirements linked to sensitive data are often unfamiliar to life science researchers who, faced with vast amounts of complex, fragmented, and sometimes even contradictory information, may not feel competent to navigate through it. In this case, the impulse may be not to share the data in order to safeguard against unintentional misuse. Consequently, helping data providers to identify relevant ethical and legal requirements and how they might address them is an essential and frequently neglected step in removing possible hurdles to data and sample sharing in the life sciences. Here, we describe the complex regulatory context and discuss relevant online tools—one of which the authors co-developed—targeted at assisting providers of sensitive data or biosamples with ethical and legal questions. The main results are (1) that the different approaches of the tools assume different user needs and prior knowledge of ethical and legal requirements, affecting how a service is designed and its usefulness, (2) that there is much potential for collaboration between tool providers, and (3) that enriched annotations of services (e.g., update status, completeness of information, and disclaimers) would increase their value and facilitate quick assessment by users. Further, there is still work to do with respect to providing researchers using sensitive data or samples with truly ‘useful’ tools that do not require pre-existing, in-depth knowledge of legal and ethical requirements or time to delve into the details.
Ultimately, separate resources, maintained by experts familiar with the respective fields of research, may be needed, while, in the longer term, harmonization and increased ease of use will be very desirable.
The National Heart, Lung, and Blood Institute (NHLBI), within the United States' National Institutes of Health (NIH), established a Biorepository in 1976 that initially archived biospecimens from population-based blood product safety surveys. It was later expanded to biospecimens from clinical and epidemiological studies in heart, lung, and blood disorders. The NHLBI also established a Data Repository in 2000 to store and distribute study data from NHLBI-sponsored research. The NHLBI Biologic Specimen and Data Repository Information Coordinating Center (BioLINCC) was established in 2008 to develop the infrastructure needed to link the contents of these two related NHLBI Repositories, facilitate access to repository resources, and streamline request processes.
Three key program subcomponents were developed simultaneously: 1) the linkage of biospecimen electronic inventory records with their clinical or characterization data; 2) the development and implementation of a website with both public-facing information and private processing workspaces; and 3) the development of processes to maximize efficiency via a web-based system while maintaining workflow control, document tracking, and secure processes.
The BioLINCC website was launched on October 1, 2009 with eight biospecimen collections and data from 72 research studies. By the end of the fourth online year, 38 biospecimen collections were linked and posted, and data from 108 research studies had been made available for request. The number of registered users by the end of the fourth online year approached 2600, and continues to show a trend towards an increasing rate of new users per year. BioLINCC has fulfilled 381 requests comprising 851 data collections, as well as 600 teaching dataset requests and 75 data renewal agreements. A total of 154 biospecimen requests comprising 147,388 biospecimens were fulfilled or actively in process. We conclude that the BioLINCC program has been successful in its goal to increase the visibility and utilization of NHLBI biospecimen and data repository resources.