Blood Transfus. 2010 July; 8(3): 212–215.
PMCID: PMC2906189

Press review

Blood supply and usage

Ali A, Auvinen M-K, Rautonen J.

The aging population poses a global challenge for blood services.

Transfusion 2010; 50:584–8.

A permanent national register of blood component use was established in Finland in 2002 (Palo R, Ali-Melkkilä T, Hanhela R, et al. Vox Sang 2006; 91:140–7). This national register was cleverly assembled simply by collecting data from pre-existing electronic medical registers, which provided information on hospital admissions, diagnoses, surgical operations, test results, blood components, and transfusions. Data were linked through the personal identification number that the Finnish authorities assign to all permanent residents. The original purpose of the register was to improve the appropriateness of transfusion by comparing practices (benchmarking).

The authors of the present study analysed this huge amount of data from another angle. They were struck by how skewed the probability of blood transfusion is across age groups: in Finland, patients in the eighth decade of life consume eight times more red cell units than those in the third or fourth decade. Similar observations have previously been reported for other Western countries and can be considered representative of the whole developed world. The authors calculated the red cell usage per capita by age between the years 2002 and 2006 and combined this information with population statistics and projections on the size and structure of the resident population from 1950 to 2050. They also considered several other countries (all developed), besides Finland, applying the same age distribution of blood usage. The results were impressive: from 1990 onwards, red cell usage increases in all countries, with roughly a 10% increment projected for every decade from now until 2050. The authors also ventured to state that a large part of the variation in red cell usage between the different countries could be explained by differences in age distribution rather than by treatment policies and protocols. Their own graphs do, however, partially contradict them, as the estimates of the present annual consumption for France, Italy and Spain, for example, appear substantially higher than the actual figures.

The expected increase in the proportion of elderly people will also make it more difficult to recruit blood donors. In this regard, the authors analysed another interesting parameter, “blood dependency”, defined as the ratio between the number of people outside the donor age group (18–65 years) and the number inside it. This analysis, too, brings bad news: on average, the ratio is currently at a minimum in all countries, but is destined to increase soon.
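
As an aside for readers who want to see the mechanics, the sketch below illustrates the kind of arithmetic behind both the age-structured demand projection and the blood dependency ratio. The age groups, per-capita usage rates and population figures are invented for illustration; they are not the Finnish data used by the authors.

```python
# Illustrative only: age-specific usage rates and population counts are invented,
# not the Finnish figures used by Ali and colleagues.
age_groups = ["0-17", "18-44", "45-64", "65-74", "75+"]

# Red cell units used per 1,000 inhabitants per year, by age group (assumed).
usage_per_1000 = {"0-17": 5, "18-44": 15, "45-64": 40, "65-74": 90, "75+": 120}

# Population (thousands) by age group for two years (assumed projection).
population = {
    2010: {"0-17": 1100, "18-44": 1900, "45-64": 1500, "65-74": 500, "75+": 400},
    2050: {"0-17": 1000, "18-44": 1700, "45-64": 1400, "65-74": 700, "75+": 800},
}

for year, pop in population.items():
    # Projected demand: sum of (age-specific rate x population) over age groups.
    demand = sum(usage_per_1000[g] * pop[g] for g in age_groups)
    # "Blood dependency": people outside the donor age range (taken here as 18-64,
    # approximating the 18-65 window used in the article) divided by those inside it.
    outside = pop["0-17"] + pop["65-74"] + pop["75+"]
    inside = pop["18-44"] + pop["45-64"]
    print(f"{year}: projected red cell demand = {demand:,.0f} units, "
          f"blood dependency = {outside / inside:.2f}")
```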

Although the estimates may be imprecise, there is little doubt that the overall scenario is correct. In the near future, the proportion of the donor-age population who actually give blood will have to be considerably greater than at present in order to meet needs. Alternatively, the current limits on the number of blood donations per year or on donor age will have to be relaxed.

Heitmiller ES, Hill RB, Marshall CE, et al.

Blood wastage reduction using Lean Sigma methodology.

Transfusion 2010; ahead of print (doi: 10.1111/j.1537-2995.2010.02679.x).

“Lean manufacturing” is a management practice that focuses on eliminating wasteful expenditure of resources. “Six Sigma” is a production improvement method. Its name derives from statistical jargon (sigma = standard deviation): the original meaning was that the improvement process should achieve defect-free production within the range of the mean plus or minus six standard deviations. The two approaches can be combined. They lend themselves well to any production environment, but the authors of this article applied them to hospital wastage of red blood cell (RBC) concentrates.
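
For readers unfamiliar with the statistical origin of the name, the following short calculation (which assumes SciPy is available) shows how small the defect probability is for a normally distributed process whose specification limits lie six standard deviations from the mean; the widely quoted figure of 3.4 defects per million corresponds to the conventional allowance of a 1.5-sigma drift of the process mean.

```python
from scipy.stats import norm

# Probability of a normally distributed measurement falling outside +/- 6 sigma.
p_centred = 2 * norm.sf(6)
print(f"Defects with a perfectly centred process: {p_centred * 1e6:.4f} per million")

# The conventional Six Sigma target allows the process mean to drift by 1.5 sigma,
# leaving 4.5 sigma to the nearer specification limit.
p_shifted = norm.sf(4.5)
print(f"Defects allowing a 1.5-sigma shift: {p_shifted * 1e6:.1f} per million")
```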

According to the tenets of their approach, the authors:

  • defined the problem (blood wastage), the goal (to reduce overall hospital RBC wastage by 50%), and the measurement method;
  • measured the baseline wastage;
  • analysed the source of wastage;
  • improved the situation by means of several interventions;
  • controlled the process, verifying that the relevant parameters remained within the specified limits.

The baseline RBC wastage rate was 4.4%. Outdated units accounted for only 4.4% of this wastage: the main sources of wastage (87.4%) were units returned more than 30 minutes after issue or with the temperature indicator out of range.

Single units were issued without temperature-validated containers and without temperature indicators, and the 30-minute limit applied to them. The interventions targeting this source of wastage were: raising awareness of the problem; educational presentations, particularly targeting the staff of the clinical services most often involved (operating rooms); e-mail notifications and personal phone calls to the personnel responsible for the wastage; and notification of each wasted blood unit through the web-based tool used by the hospital for reporting adverse events and safety issues. The blood request form was also modified to include a checklist of factors that might delay transfusion, such as venous access, informed consent and fever. The 30-minute limit had been introduced to satisfy the requirement that the internal temperature of the blood component should not exceed 10°C. During the course of the study, this policy was changed and temperature-sensitive labels were used instead.

The second main source of wastage concerned blood units issued in temperature-validated containers, with temperature-sensitive labels affixed to each of them. The problem was primarily tackled by improving the packing of the blood units inside the container. The temperature-sensitive labels were initially changed because the device in use caused interpretation uncertainties, but, in the end, they were abandoned completely and substituted by inspection of the returned container. These measures were so successful that this source of wastage was completely eliminated.

In the control phase, the wastage rate remained constantly below 2%. Many interventions were applied simultaneously, so the authors were not able to quantify their respective contributions. However, this study suggests that an insightful analysis, followed by such simple and low-cost measures as raising awareness and education on best practices, may have a large impact.

As exemplified by this study, the Lean Sigma methodology is a combination of root cause analysis and continuous quality improvement. Another characteristic of this methodology is the habit of giving fancy names, such as “champion” or “black belt”, to the people mainly responsible for its implementation, in order to motivate them. The authors of the present paper avoided emphasising this aspect, and rightly so, as it is not likely to earn much support in the sober and sceptical world of hospitals.

In view of the prospects put forward in the previous article of this Press Review, the issue of blood wastage should be given more attention. Perhaps it is time to reconsider whether blood units should really be discarded if they exceed the 30-minute or the 10°C limit. Red cell survival in vivo is minimally affected. Limited data from the literature suggest that the expiry date of a unit of blood kept at room temperature for 24 hours should be brought forward by one week. The most feared consequence, however, is the exponential growth of contaminating bacteria. In this regard, it has been convincingly argued that a 2-hour limit would be equally safe (Hamill TR. Transfusion 1990; 30:58–62). A reasonable policy could, therefore, be to abandon the use of temperature-sensitive labels, extend the limit to 2 hours, and perform test cultures on blood units exceeding that limit.

Transfusion-related acute lung injury

Tynell E, Andersson TML, Norda R, et al.

Should plasma from female donors be avoided? A population-based cohort study of plasma recipients in Sweden from 1990 through 2002.

Transfusion 2010; 50:1249–1256.

Transfusion-related acute lung injury (TRALI) is one of the most severe complications of transfusion. Most cases are associated with the passive transfer of donor leucocyte antibodies and with plasma from female donors. TRALI is believed to be considerably underreported, because its clinical features are not specific. The authors of the present study decided to verify whether receiving plasma from female donors conferred a survival disadvantage on the general population of transfused patients.

Data were extracted retrospectively from the Scandinavian donations and transfusions database (SCANDAT). Like the Finnish database cited above (see the comment on the first article in this Press Review), it was obtained by linking existing donation, transfusion, health and population registers from Sweden and Denmark.

The study included subjects aged 18 years or over who received their first allogeneic plasma transfusion between 1990 and 2002. Transplanted patients were excluded, as were those who had received any blood component (except autologous plasma or compatible allogeneic red cell concentrates) in the 30 days before the plasma transfusion. If these conditions occurred after inclusion, the follow-up was censored. Eligible patients were divided into ‘exposed’ (those who received at least one plasma unit from a female donor) and ‘non-exposed’ (those who received plasma exclusively from male donors). Patients were followed for a maximum of 14 days. The primary outcome was death from any cause. The secondary outcome was death associated with a discharge diagnosis involving the respiratory or the cardiovascular system or an adverse reaction. Patients who died on the first day of transfusion were analysed separately.

The relative risk of death was calculated according to the level of exposure (0, 1, 2, 3–4, 5+ units of female donor plasma), adjusted for the total number of plasma units transfused, number of red cell units, duration of follow-up, hospital, calendar year, sex, and age. Hospitals with fewer than 1000 recipients were excluded.
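
The article does not report the model specification in detail here; the sketch below shows one way such an adjusted analysis can be set up, using a Poisson regression with a logarithmic follow-up offset so that the exponentiated coefficients can be read as relative risks. The column names and data are invented, not taken from SCANDAT.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Invented cohort: one row per plasma recipient, with hypothetical column names.
rng = np.random.default_rng(0)
n = 5000
df = pd.DataFrame({
    "exposure": rng.choice(["0", "1", "2", "3-4", "5+"], size=n),  # female-plasma units
    "total_plasma_units": rng.integers(1, 12, size=n),
    "rbc_units": rng.integers(0, 10, size=n),
    "age": rng.integers(18, 90, size=n),
    "sex": rng.choice(["M", "F"], size=n),
    "followup_days": rng.integers(1, 15, size=n),
})
# Synthetic 14-day mortality, loosely dependent on the number of units transfused.
df["died_14d"] = rng.binomial(1, 0.02 + 0.002 * df["total_plasma_units"])

# "Modified Poisson" regression: a log person-time offset and exponentiated
# coefficients give rate ratios interpretable as relative risks.
model = smf.glm(
    "died_14d ~ C(exposure, Treatment('0')) + total_plasma_units + rbc_units"
    " + age + C(sex)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["followup_days"]),
).fit()
print(np.exp(model.params))
```

A real analysis would also adjust for hospital and calendar year, as the authors did, and would normally use robust standard errors.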

At the end of the selection process, 92,565 recipients of allogeneic plasma in 30 Swedish hospitals were included, and 68% of them had received at least one unit of plasma from a female donor. The 14-day mortality increased with the total number of plasma units transfused. The transfusion of up to two plasma units from female donors did not confer any significant survival disadvantage, but the patients with the highest exposure levels (3–4 and 5+ units) had relative risks of 1.16 (95% CI: 1.06–1.27; p=0.002) and 1.32 (95% CI: 1.17–1.49; p<0.0001), respectively. The analysis of the secondary outcome yielded similar results, with higher relative risks: 1.47 (1.19–1.82; p=0.0003) and 1.72 (1.29–2.29; p=0.0002), respectively, for the highest exposure levels. The authors calculated that excluding female donor plasma would avoid 8.8 deaths per 10,000 patients transfused. Comparing this value with the current estimates of the incidence and mortality of TRALI (1:5000 transfusions, with 6% mortality) shows that TRALI is actually grossly underreported.
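
A rough back-of-the-envelope comparison makes the size of the gap explicit, assuming for simplicity that the reported incidence can be read as being of the order of one transfusion episode per patient:

```python
# Deaths expected if the reported TRALI figures were complete:
# incidence 1 in 5,000 transfusions, 6% mortality.
expected_per_10000 = (1 / 5000) * 0.06 * 10000   # ~0.12 deaths per 10,000
observed_per_10000 = 8.8                         # excess deaths estimated in the study
print(f"Expected from reported TRALI figures: {expected_per_10000:.2f} per 10,000")
print(f"Excess mortality observed in the study: {observed_per_10000:.1f} per 10,000")
print(f"Gap: roughly {observed_per_10000 / expected_per_10000:.0f}-fold")
```

The comparison is crude, since patients typically receive more than one unit, but the discrepancy is of the order of tens of times, not a few per cent.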

This large and accurate study is an important contribution and confirms, at the epidemiological level, that the transfusion of plasma from female donors is associated with significantly increased mortality. It is a retrospective study, but the authors correctly pointed out that, within any hospital at a given time, exposure to female plasma can reasonably be considered a random event. The results are compatible with the known distribution of leucocyte antibodies in the female population and, therefore, lend support to the above-mentioned pathogenic mechanism. The authors were not able to include donor parity among the analysed variables because the data were insufficient. It is a pity that the donors’ transfusion history was also not considered.

Eder AF, Herron RM Jr, Strupp A, et al.

Effective reduction of transfusion-related acute lung injury risk with male-predominant plasma strategy in the American Red Cross (2006–2008).

Transfusion 2010; ahead of print (doi: 10.1111/j.1537-2995.2010.02652.x).

As an ideal companion to the previous article, this paper reports on the effects of the use of male-predominant plasma for transfusion. This policy was introduced by the American Red Cross in 2006, after their Haemovigilance Program had found that female donors with leucocyte antibodies were associated with 75% of fatalities attributed to TRALI after plasma transfusion between 2003 and 2005. Each year, approximately 1,700,000 plasma components were transfused. The percentage of male donor plasma was 55% in 2006, 79% in 2007, and 95% in 2008. The authors expected a decrease of TRALI cases and tried to document this by analysing their haemovigilance data. The diagnosis of TRALI is not easy, as the clinical features are not specific and the clinical records are often incomplete. The reported cases were, therefore, classified according to their probability and also to the type of blood component transfused.

There were 19 reported fatalities associated with probable TRALI in 2006, 10 in 2007, and 9 in 2008. The numbers of non-fatal probable TRALI cases were 50 in 2006, 36 in 2007, and 42 in 2008. If only cases attributed to the transfusion of plasma are considered, there were 6, 5, and 0 fatal cases and 26, 12, and 7 non-fatal cases, respectively, in the three years. These results are difficult to interpret because reporting had been increasing in the preceding years. However, the number of cases attributed to red cells or apheresis platelets did not change over the same period. The authors calculated the statistical significance of the differences between 2006 and 2008, obtaining many “significant” results (p<0.05). However, if the tests are repeated considering all three years (more correctly, in my opinion), a p<0.05 is obtained only for the cases attributed to the transfusion of plasma.
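
The article’s own tests are pairwise comparisons of 2006 with 2008; one plausible way to test all three years jointly for the plasma-attributed cases (whether or not it is the test the reviewer had in mind) is a chi-square test on the yearly case counts, using the approximate denominator of 1,700,000 plasma components per year quoted above.

```python
from scipy.stats import chi2_contingency

# Probable TRALI cases attributed to plasma transfusion (fatal + non-fatal),
# 2006-2008, taken from the figures quoted above.
cases = [6 + 26, 5 + 12, 0 + 7]
components_per_year = 1_700_000  # approximate yearly denominator given in the article
table = [cases, [components_per_year - c for c in cases]]

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.4f}")
```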

The authors are probably correct in concluding that the male-predominant plasma policy decreased the frequency of TRALI after transfusion of plasma. Their experience, however, highlights a number of difficulties, besides the known interpretation problems of observational studies. TRALI also occurs after the transfusion of red cells and platelets, with the highest frequency now associated with apheresis platelets. The American Red Cross tries to recruit male donors for the donation of platelets by apheresis, but males still account for only 65%. In fact, even the implementation of the male-predominant plasma strategy was incomplete: only 57% of AB group plasma was from male donors, because of the disproportionate clinical demand.

Tailored transfusion

Reikvam H, Prowse C, Roddie H et al.

A pilot study of the possibility and the feasibility of haemoglobin dosing with red blood cells transfusion.

Vox Sang 2010; 99:71–76.

A major concern of the late Prof. Högman was the lack of standardisation of the haemoglobin content of red cell units: he estimated that, ignoring non-viable erythrocytes, a “unit” of red cells could contain between 35 and 64 grams of haemoglobin (Högman CF, Meryman HT. Transfusion 2006; 46:137–42). As a solution, he proposed collecting blood donations with the same amount of haemoglobin, instead of the same volume of blood.

A Turkish group approached the problem in the opposite way: they tried to exploit the existing variability to match units to the clinical needs of individual patients, and claimed that they achieved the clinical objective while sparing 30% of the requested blood units (Arslan O, Toprak S, Arat M, Kayalak Y. Transfusion 2004; 44:485–88).

The Turkish study included only 51 haematological patients and was more a proof of concept than a trial of real-world implementation. The whole idea rests on the (quite natural) presupposition that the more haemoglobin is transfused, the greater the increase in the patient’s haemoglobin concentration. The present article describes a pilot study designed to quantify to what extent this is true.

Patients were haemodynamically stable, without haemolysis or a positive direct antiglobulin test. The total haemoglobin content of the transfused red cell units was measured by direct sampling. The patient’s blood volume was estimated from height and weight by means of a known formula. The patient’s haemoglobin concentration was measured before and after transfusion. The post-transfusion sample was taken after at least 15 minutes, to allow for the necessary volume equilibration; in this regard, the authors cited two studies supporting the claim that 15 minutes are sufficient.
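
The summary above does not specify which formula was used; a common choice is Nadler’s formula, and a minimal sketch (with an invented patient and invented unit contents) of the prediction it enables looks like this:

```python
def nadler_blood_volume_litres(height_m: float, weight_kg: float, male: bool) -> float:
    """Estimate total blood volume with Nadler's formula (height in metres, weight in kg)."""
    if male:
        return 0.3669 * height_m ** 3 + 0.03219 * weight_kg + 0.6041
    return 0.3561 * height_m ** 3 + 0.03308 * weight_kg + 0.1833

# Invented example: a 70 kg, 1.75 m man given two units containing 55 g and 48 g of Hb.
blood_volume_l = nadler_blood_volume_litres(1.75, 70.0, male=True)
hb_dose_g = 55 + 48
# Naive prediction: the transfused haemoglobin is simply diluted into the blood volume
# (this ignores red cell losses, which is why observed increments are usually lower).
predicted_rise_g_per_l = hb_dose_g / blood_volume_l
print(f"Blood volume ~ {blood_volume_l:.2f} L, predicted Hb rise ~ {predicted_rise_g_per_l:.0f} g/L")
```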

Fifty-two transfusion episodes in 50 patients were studied. A minimum of two and a maximum of four units were given per transfusion. The haemoglobin content of the red cell units varied from 39 to 69 grams. The dose and the observed increment of haemoglobin were significantly correlated (r=0.41) and, as expected, the correlation improved considerably after adjusting for the blood volume (r=0.67). The value of the correlation coefficient shows, however, that the dose explains less than 50% of the variance of the increment. Part of the residual variance could reasonably be due to the survival properties of the red cells, which are known to vary considerably among blood donors. Moreover, the authors did not include any correction for the duration of storage of the blood units. I also suspect that 15 minutes might sometimes have been insufficient for complete equilibration: in the two studies cited in support of this time span, only two units were transfused per episode, and only one of the two mentioned how long the transfusion lasted, and it was certainly not fast (240 minutes for two units).

The haemoglobin increment per unit transfused and the patient’s weight were not significantly correlated. The authors concluded that haemoglobin dosing is feasible but that its clinical usefulness remains to be demonstrated.

The concept of haemoglobin dosing is intriguing, as it promises a sort of tailored transfusion. In fact, it is a bit surprising that the brilliant idea of the Turkish colleagues has so far been so timidly embraced. Its implementation requires that the haemoglobin content of blood units be known. Direct measurement, as applied in this study, is not practical on a routine basis. Many Blood Banks now record the donor’s haemoglobin concentration and the volume of whole blood collected. Buffy-coat removal and/or leucoreduction certainly decrease the haemoglobin content significantly, but their effect could conceivably be estimated with good approximation. That could be the topic of a further preliminary study.
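
A sketch of the kind of approximation this would involve, with an assumed processing loss that such a preliminary study would have to calibrate, might look as follows:

```python
def estimated_unit_hb_grams(donor_hb_g_per_l: float, collected_volume_ml: float,
                            processing_loss_fraction: float = 0.15) -> float:
    """Estimate a unit's haemoglobin content from routinely recorded donation data.

    The processing loss fraction (buffy-coat removal, leucoreduction, sampling)
    is a placeholder assumption, not a measured value.
    """
    collected_hb_g = donor_hb_g_per_l * collected_volume_ml / 1000.0
    return collected_hb_g * (1.0 - processing_loss_fraction)

# Invented example: donor haemoglobin 145 g/L, 470 mL of whole blood collected.
print(f"Estimated unit content: {estimated_unit_hb_grams(145, 470):.0f} g")
```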

Other requirements are that the patient’s height and weight, as well as the current and the desired haemoglobin concentrations, be communicated to the Transfusion Service. Finally, software should be available for processing these data and for choosing the most appropriate blood units. Obviously, haemoglobin dosing would only make sense for patients without active bleeding or haemolysis.
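
A minimal sketch of the selection step such software would have to perform, choosing from the available stock the combination of compatible units whose total haemoglobin content comes closest to the required dose (all names and figures invented):

```python
from itertools import combinations

def required_hb_dose_g(current_hb_g_per_l: float, target_hb_g_per_l: float,
                       blood_volume_l: float) -> float:
    """Grams of haemoglobin needed to raise the concentration to the target."""
    return (target_hb_g_per_l - current_hb_g_per_l) * blood_volume_l

def choose_units(stock_hb_g: list, dose_g: float, max_units: int = 4):
    """Brute-force search for the subset of units whose summed content best matches the dose."""
    best = None
    for n in range(1, max_units + 1):
        for combo in combinations(range(len(stock_hb_g)), n):
            total = sum(stock_hb_g[i] for i in combo)
            if best is None or abs(total - dose_g) < abs(best[1] - dose_g):
                best = (combo, total)
    return best

# Invented example: raise a patient from 75 to 95 g/L with a 4.8 L blood volume.
dose = required_hb_dose_g(75, 95, 4.8)       # 96 g of haemoglobin needed
stock = [41, 48, 52, 55, 60, 66]             # estimated Hb content (g) of units in stock
chosen, total = choose_units(stock, dose)
print(f"Required dose ~ {dose:.0f} g; chosen units {chosen} totalling {total} g")
```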

