The last 4 decades of this century have witnessed the evolution of liver and other organ transplantation to a high standard of care for patients with end-stage organ disease. In this chapter, we will focus on what has been accomplished at our center in: 1) defining the meaning of organ allograft acceptance and acquired tolerance; 2) improving survival after liver transplantation, with particular reference to tacrolimus, drug weaning, and disease recurrence; 3) applying lessons from liver and other organ transplantation to hepatointestinal and multivisceral transplantation; and 4) assessing the feasibility of clinical xenotransplantation.
The patient and graft survival curves for all of the clinical studies were generated using the Kaplan-Meier method; group comparisons were done using the log-rank test. The cumulative risks of rejection and graft loss from rejection were also estimated using the Kaplan-Meier method. Risk factors for mortality, graft loss, and morbidity were analyzed using Cox's proportional hazards and artificial neural network models. Continuous variables were presented as mean ± standard deviation and categorical data as proportions. Differences between group means were tested using the standard two-tailed Student's t-test and differences in proportions by Fisher's exact test.
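The survival figures cited throughout this chapter rest on the product-limit (Kaplan-Meier) estimator, which steps the survival probability down at each observed event time while removing censored patients from the risk set. As a minimal illustrative sketch in Python with invented follow-up data (not the authors' actual analysis code), the calculation is:

```python
from collections import Counter

def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) estimate of the survival function.

    times  -- follow-up time for each subject (event or censoring time)
    events -- 1 if the endpoint (e.g., graft loss) occurred, 0 if censored
    Returns [(time, survival_probability)] at each distinct event time.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    at_risk = len(times)  # everyone is at risk before the first time point
    surv, curve = 1.0, []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:  # the curve steps down only at event times
            surv *= 1.0 - d / at_risk
            curve.append((t, surv))
        # all subjects whose follow-up ends at t (event or censored) leave the risk set
        at_risk -= sum(1 for x in times if x == t)
    return curve
```

For example, five hypothetical subjects followed for 1, 2, 2, 3, and 4 years, with the subjects at 2 and 4 years censored alive, yield survival estimates of 0.8, 0.6, and 0.3 at years 1, 2, and 3.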
Organ allograft survival has improved stepwise over the last 37 years with the successive availability of azathioprine, cyclosporine (CYA) and tacrolimus (TAC), each of which has been combined with adjustable doses of prednisone or incorporated into more complex multiple drug cocktails. However, the prototypical events after transplantation (1,2) have remained the same (Fig. 1). In successful cases, the host response against the graft, which may or may not be strong enough to cause clinical and/or histopathologic findings of rejection, is followed by a reduced need for immunosuppression.
It was generally assumed for more than 3 decades that the foregoing events leading to organ allograft acceptance involved different mechanisms than the neonatal tolerance of Billingham, Brent, and Medawar or the acquired tolerance of cytoablated recipients of bone marrow allografts. However, studies following the discovery of small numbers of donor leukocytes (microchimerism) in the tissues and blood of human organ recipients many years after transplantation (3,4) have shown that the essential mechanisms of organ engraftment are the same under all 3 circumstances (5–7).
After organ transplantation, most of the donor leukocytes of bone marrow origin that are contained in the allograft migrate primarily to host lymphoid organs. This results not only in progressive loss of the allograft's antigenicity, but also in the initiation by the peripheralized donor cells of reciprocal acute clonal activation of the usually dominant host-versus-graft (HVG) response and the graft-versus-host (GVH) response of the passenger leukocytes (Fig. 1). With successful engraftment, these proceed to mutual clonal exhaustion-deletion (3,7).
Maintenance of the exhaustion-deletion is dependent on persistence of the donor leukocytes, small numbers of which escape primary destruction in the lymphoid organs and secondarily migrate to non-lymphoid sites, from which they may periodically return to host organized lymphoid organs or ectopic collections. The delayed leakage of donor leukocytes from the non-lymphoid to the lymphoid compartment is thought to represent a balance between destructive and non-destructive immunity, as has been demonstrated experimentally in autoimmune diabetes models (8). The events after successful bone marrow transplantation to a cytoablated recipient are the same, but in mirror image, i.e., the dominant immune system is the donor's (5–7).
The 2 mechanisms of clonal exhaustion-deletion and immune indifference are governed by antigen migration and localization (7) and proceed to permanent engraftment without therapy in a limited number of animal models, especially when the transplanted organ is the leukocyte-rich liver (6). Under most circumstances, however, immunosuppression is required to prevent one cell population from destroying the other before the tolerogenic events shown in Figure 1 can occur.
Although this principle was not understood at the time, its exploitation with azathioprine, combined with adjustable doses of prednisone (1), established an empirical basis for the field of clinical organ transplantation, which has since been refined with progressively greater efficiency using baseline therapy with CYA and TAC for all organs, including the liver (Fig. 2) (9). In addition to explaining transplantation enigmas of the past, this concept has dominated our plans for future research and development, as will be evident in the succeeding sections.
In the 1995 edition of this book, we discussed the high therapeutic index and relatively short-term survival benefits of TAC as the primary baseline immunosuppressive agent (Fig. 2) for liver allograft recipients based on our own experience (9–11), as well as the data of both the American (12) and European (13) randomized studies. The results unequivocally proved that TAC is a more potent and satisfactory immunosuppressant than CYA. In the multicenter European and American randomized trials, as in our single-center trial, the ability of TAC to rescue intractably rejecting grafts in patients treated primarily with CYA (9–15) allowed equalization of patient and graft survival in both arms of the randomized trials when the intent-to-treat analytic methodology was applied. However, the ability of TAC to systematically rescue the treatment failures of CYA and then to be continued permanently suggested that it was the preferred baseline drug for hepatic transplantation (and all organs). This conclusion was supported by analysis of secondary end points.
To further document the safety and long-term efficacy of TAC, we recently analyzed the results of the first 1,000 primary liver allograft recipients who were transplanted between August 1989 and December 1992 and were followed until August 1997 (16). Of these 1,000 consecutive patients, 7.5% were infants, 9.1% were children, 63% were adults (aged 19–60), and 20.4% were aged 60 or older (Table 1). Eighty-four percent of these recipients were hospital-bound at the time of transplantation. Livers from older donors (over age 50) were used in 141 cases.
With a mean follow-up of 6.4 years (range: 4.7–8), the overall patient and graft survival rates at 7 years were 65% and 60%, respectively (Fig. 3). Patient and graft survival rates were best among recipients aged 2–18 with a 7-year patient survival rate of 91% and graft survival of 85%. The differences in both patient (Fig. 4A) and graft (Fig. 4B) survival among the 4 age cohorts were statistically significant (p<0.001). The common causes of long-term patient/graft loss (particularly in the adult and senior population) were primary disease recurrence, cardiopulmonary failure, cerebrovascular accidents, noncompliance, and de novo malignancy.
The level of TAC-steroid maintenance immunosuppression is depicted in Figure 5. It is important to note that the daily TAC dose was gradually reduced and that the whole blood drug trough level was maintained below 10 ng/ml with long-term follow-up. Equally important, complete weaning from steroids was achieved in 53% and 67% of the recipients at 3 months and 5 years after transplantation, respectively. Despite the lightened maintenance immunosuppression, only 1.6% of the grafts were lost due to chronic rejection. In most of these exceptional cases, TAC had been reduced or stopped because of life-threatening infections (e.g., hepatitis C or B, cytomegaloviral disease) or patient noncompliance (15).
The early and long-term overall estimated risks of hypertension requiring anti-hypertensive medication, insulin-dependent diabetes, and hyperkalemia are shown in Table 2. There was a slight increase in the mean serum cholesterol levels with long-term follow-up. An initial increase in serum creatinine during the first 3 postoperative months was maintained without further deterioration in kidney function in most recipients. Only 2% of the pediatric population and 4% of the adult patients, including senior recipients, developed end-stage renal failure and required dialysis and/or kidney transplantation.
Posttransplant lymphoproliferative disease (PTLD) developed in a total of 35 (3.5%) patients. The incidence was higher among the pediatric (11.4%) than the adult (1.9%) population and was associated with a 21% and 31% mortality, respectively. De novo non-lymphoid malignancy was observed in 57 patients, all adults, for an overall incidence of 5.7% (6.8% for adults) (17). The malignancies were lethal in 58% of these cases. The type and/or site of malignancy and its frequency compared with the general population are shown in Figure 6. Skin cancers were the most common (2.2%) type of de novo non-lymphoid malignancy but had the best long-term survival. The relative risk of developing oropharyngeal cancer was 7.6 times higher and that of lung cancer 1.7 times higher than in the general population matched for age, gender, and length of follow-up. The cumulative risks of PTLD and de novo malignancy are comparable with those of azathioprine- and CYA-based regimens (18,19).
Childbearing, which was an unusual event after liver transplantation under previous immunosuppressive regimens, has been common in women treated with TAC. The maternal and fetal risk of pregnancy after liver transplantation and under TAC was prospectively studied in 27 pregnancies among 21 female recipients (20). The 21 mothers had surprisingly few serious complications of pregnancy and no mortality, in contrast to previous experience with azathioprine and CYA-based immunosuppression. Pregnancy was possible with a very low incidence of hypertension, pre-eclampsia, abnormal allograft function, and other maternal complications historically associated with such gestations.
No allograft loss was attributed to pregnancy. As in previous experience with other immunosuppressive regimens, preterm deliveries and spontaneous renal impairment of the infants were common (20). However, prenatal growth for gestational age and postnatal infant growth for postpartum age were normal. The 2 infant losses in our series occurred in mothers whose conceptions were temporally close to liver transplantation. Therefore, it is our practice to advise women of childbearing age to avoid pregnancy for 2 years after hepatic transplantation, and for longer when there have been transplant-related management problems including steroid dependence.
Addition of adjunct maintenance immunosuppressive agents has not added survival benefits. Over the past 3 years, a prospective randomized trial of TAC and prednisone versus TAC, prednisone, and mycophenolate mofetil (MMF) in primary adult liver transplant recipients was conducted at our center (21). A total of 300 patients were enrolled; 151 in double-drug therapy and 149 in triple-drug therapy. During the study period and as of June 1998, 35 (23.2%) of the double-drug therapy patients had MMF added to control ongoing acute rejection, nephrotoxicity, and/or neurotoxicity. On the other hand, 83 (55.7%) patients in the triple-drug therapy cohort discontinued MMF for infection (18.5%), myelosuppression (15.9%), gastrointestinal disturbances (13.9%), and other reasons (6.6%).
With a mean follow-up of 24.7 months (range: 3.7–29) and by an “intent-to-treat analysis”, the actuarial 2-year patient survival rate was 83% with double-drug therapy and 85% with triple-drug therapy (Fig. 7A) with a graft survival rate of 76% and 78%, respectively (Fig. 7B). The difference between the 2-year rejection-free rate of each cohort was also not statistically significant with an incidence of 67% (triple-drug) and 58% (double-drug). The mean maintenance steroid dose, however, was slightly lower with triple-drug compared with double-drug therapy. The rates of postoperative infection, perioperative hematological changes, and gastrointestinal side effects were similar in both groups but with better preservation of renal function among the triple-drug therapy group (21).
Despite the increase in the cumulative survival rate of liver allograft recipients over the last 15 years at our center (Fig. 8) and elsewhere (particularly since 1989), the maintenance immunosuppression, which has been assumed to be a permanent requirement, is associated with complications that range from trivial to lethal. When evidence of chimerism in long-surviving kidney, liver, and other organ recipients was discovered in 1992, a linkage was established between organ allograft acceptance and the neonatal tolerance of Billingham, Brent, and Medawar as well as the acquired tolerance of cytoablated, drug-free recipients of bone marrow allografts (3–7) (see section on “Organ Engraftment and Acquired Tolerance”).
Because the engraftment of organs involves the transplantation of a small fragment of the donor immune system (i.e., the passenger leukocytes), it was considered possible that the resulting chimerism-dependent tolerogenic process no longer required the assistance of immunosuppression. In fact, it was learned in the course of the chimerism studies that several of our patients, who included the longest-surviving liver and kidney recipients in the world, had secretly stopped their medications — some for as long as 30 years (4). Consequently, a prospective weaning trial was begun in liver recipients, as extensively discussed in the 1995 edition of this book and recently updated elsewhere (22,23).
A total of 95 liver recipients was entered into the prospective weaning series between June 1992 and April 1996 (23). The criteria for admission were: 1) ≥ 5 years posttransplantation, 2) ≥ 2 years rejection free, 3) compliance of both patient and local physician, 4) absence of acute/chronic rejection and primary disease recurrence in a baseline liver biopsy, and 5) exclusion by imaging or interventional radiologic techniques of vascular or biliary tract complications. Thirty-one (33%) patients were children (≤ age 20), and 64 (67%) were adults 21–68 years old. The leading causes of pretransplant liver failure were biliary atresia in children and viral/autoimmune hepatitis, cholestatic liver disease, and alcoholic/cryptogenic cirrhosis in adults.
The immunosuppression at the start of weaning was azathioprine-based in 13 (13.5%), CYA-based in 71 (75%), and TAC-based in 11 (11.5%) cases. Except for the TAC era, initial triple-drug therapy had been used throughout the 30-year period of our program (Colorado/Pittsburgh), always involving prednisone as the secondary drug, but with the variable use of azathioprine and ALG/OKT3 as the third or fourth agent (9). Prednisone was the only adjunct drug in most of the patients treated from the outset with TAC. With a mean time from liver replacement to the start of weaning of 8.4 ± 4.4 years, complications of long-term immunosuppression prior to weaning included hypertension (n=32), renal insufficiency (n=27), skin lesions/malignancy (n=12), neurologic disorders (n=4), infection (n=6), steroid-related complications (n=9), and diabetes (n=9). Thirty-seven (39%) patients had multiple immunosuppression-related complications. The details of the weaning protocol, monitoring of graft function, and rescue treatment in the event of rejection are described elsewhere (22,23).
No grafts were lost to weaning, although 2 patients died with normally functioning organs. The first death was from septic pulmonary complications of cystic fibrosis, while still weaning. The second patient, who had been returned to baseline immunosuppression following an easily controlled mild acute cellular rejection, died of a pulmonary embolus following complications of a severe foot infection. These deaths were directly attributable to the morbidity of chronic immunosuppression, which the weaning protocol was designed to ameliorate.
Eighteen (19%) of the 95 patients have been drug free with normal liver function tests for 10–58 months. The success rate was higher in children than adults (Fig. 9). There was no significant difference in the success rate when the baseline immunosuppressant was TAC or azathioprine. However, there was a significant difference between these groups and all of the CYA-based regimens (p≤0.003). Azathioprine/prednisone-based patients were more likely to be off drugs at one year as compared with CYA-based patients (p<0.001). TAC-based patients enjoyed a similar advantage (p=0.003). Figure 10 illustrates the cumulative percent of patients off immunosuppression at one year.
Another 37 (39%) patients are currently in the uninterrupted process of drug weaning with a mean percentage decrease in baseline immunosuppression (mg/day) of 44 ± 31 for TAC, 30 ± 35 for CYA, 14 ± 88 for prednisone, and 13 ± 33 for azathioprine.
Interruption of weaning occurred in the remaining 40 (42%) patients; 28 because of rejection and 12 because of withdrawal from the protocol. The rejection was histologically documented in 21 of these recipients (18 acute and 3 incipient chronic) and clinically suspected in the other 7. Withdrawal from the weaning protocol was due to noncompliance (n=8), pregnancy (n=1), kidney transplantation (n=1), and recurrent primary biliary cirrhosis (n=2). The non-compliant cases were maintained at whatever level of treatment had been reached at the time the decision to stop weaning was made. All 8 of these recipients were well at the time and remain so.
The achievement of a drug-free state (n=18) did not significantly improve either renal function or pre-existing hypertension. The most common benefits were resolution of gingival hyperplasia in 2 children, resolution of infections in 3 recipients, resolution of PTLD in 3, and relief of neuropsychiatric complaints in 2.
During the study period, significant elevation in AST, ALT, and GGTP occurred in 44 (46%) patients, followed by liver biopsy in 37. The biopsy findings were acute cellular rejection (n=18), minor duct injury (n=3), hepatitis (n=3), normal histology (n=7), evidence of biliary tract obstruction (n=3), nonspecific portal inflammation (n=2), and steatosis (n=1). Rejection, which was diagnosed by biopsy in 18 (19%) of the 95 patients, occurred at a mean time of 13.2 months after the start of weaning and was graded as mild or moderate by histopathologic criteria described in detail elsewhere (11). Treatment consisted of pulse steroids with resumption of the baseline immunosuppressive therapy in 15 patients and conversion from CYA to TAC in 3 patients with moderate rejection. None of the patients required OKT3.
Seven additional patients were diagnosed without biopsy as having acute rejection. Because the liver injury tests normalized after the increase in immunosuppression, they were arbitrarily included in the acute rejection category. There were no infectious complications following treatment for rejection except for one example of herpes stomatitis that was successfully managed with antiviral therapy.
Chronic rejection was suspected in 3 (3%) cases based upon the development of de novo histopathologic findings during weaning. The previous level of immunosuppression was restored in 2 patients and TAC was started in the third, who had been weaned from azathioprine. None of the patients developed further evidence of chronic rejection.
Three (3%) patients developed biopsy evidence, during weaning, of recurrent (n=2) or de novo (n=1) hepatitis C. Weaning has continued in these patients with stabilization of liver injury tests.
Of the 13 recipients who entered the study after transplantation for primary biliary cirrhosis, 2 (15%) developed recurrent disease; these 2 patients had entered the study 7 and 8 years after transplantation. The diagnosis was histopathologically documented 7 and 24 months after the initiation of weaning. Liver function tests normalized after return to the pre-existing combined CYA/steroid-based immunosuppression. There has been no recurrence of other autoimmune hepatic diseases.
In 3 patients (3%), abnormal liver function tests were incorrectly attributed to rejection and briefly treated as such. After reconstruction of the bile duct, liver enzyme values normalized in all 3 recipients and weaning was resumed.
The disappointing results with the addition of azathioprine, MMF, and other adjunct agents to CYA/prednisone-based regimens (see earlier) have also been reported with kidney transplantation (24). A prime objective of these increasingly complex multiple-agent cocktails is the reduction of the incidence of acute rejection toward zero, with the assumption that chronic rejection would be commensurately reduced. Instead, the 3–5-year survival of kidney and liver allografts has had no correlation with the incidence of readily controlled acute rejection. Only steroid-resistant (incompletely reversible) rejection has proved to be an early poor prognostic sign. Meanwhile, chronic rejection has emerged as the greatest unsolved problem in clinical transplantation (25).
A plausible explanation for this disappointing observation may be that the immunosuppression that permits a natural tolerogenic succession of the adaptive immune response (i.e., clonal activation → exhaustion → deletion) also carries the penalty of systematically undermining this process (see section on “allograft acceptance and acquired tolerance”). This explanation is consistent with the finding, described in the drug weaning trials, that complete discontinuance of therapy in patients bearing long-surviving grafts has been consistently possible only when the original treatment was with “weak” azathioprine-based regimens, or when weaning was from simple monotherapy with TAC (22,23).
The potential anti-tolerogenic role of immunosuppression described in the section on drug weaning is evident in all transplantation treatment protocols, including those used for both conventional and adjunct bone marrow (BM) transplantation. This reservation notwithstanding, a persuasive argument can be made for administering adjunct donor BM to organ recipients on the grounds that it merely augments the natural mechanisms set in motion by the organ's migratory “passenger leukocytes”. These are the same chimerism-driven mechanisms that are responsible for acceptance of organ allografts in continuously immunosuppressed patients and in bone marrow transplant recipients preconditioned with cytoablation (see section on “allograft acceptance and acquired tolerance”). If the BM is given to non-cytoablated organ recipients at about the same time as the organ transplantation, when the immunocompetent donor and recipient cells are subject to the same immunosuppression, the mutual “nullification” of the 2 cell populations makes the infusions safe (3–7). Violation of this principle may have caused the death of one of our liver recipients.
The BM augmentations were carried out between December 1992 and September 1998 and involved a total of 271 organ allograft recipients (26). Of these, 229 received a single dose of 3–6 × 10⁸ unmodified donor BM cells/kg body weight, and the remaining 42 received multiple infusions (2 × 10⁸ donor BM cells/kg body weight/dose) of donor BM during the first 48 hours after transplantation in a concurrent protocol. The types of organ transplantation in each study cohort are summarized in Table 3. As contemporaneous controls, 133 recipients of liver (n=33), kidney (n=47), heart (n=23), lung (n=12), small bowel (n=16), and multiple organ (n=2) allografts, for whom BM cells were not available, were also accrued. Patient characteristics and clinical features were similar for both the study (BM-augmented) and control cohorts. BM cells were harvested from the vertebral bodies of the cadaveric donors (27), and immunosuppression was with TAC and prednisone. MMF, but not OKT3, was also given to 53 of the study patients and 17 of the control group. For steroid-resistant allograft rejection, an anti-CD3 monoclonal antibody (OKT3) was used. The methods utilized for the in vitro monitoring of donor cell chimerism and the mixed leukocyte reaction were fully described elsewhere (26–28).
Although both single and multiple infusions of unmodified donor BM cells were free of complications with transplantation of the other organs, multiple infusions of BM were followed by the development of fatal graft-versus-host disease in one (17%) of 6 liver transplant recipients; the multiple (but not the single) infusion protocol was therefore discontinued. With a median follow-up of 30 ± 13 months, the overall patient and graft survival as well as graft function were similar in both the study and control cohorts, with successful engraftment rates of 82% and 78%, respectively. The actuarial 5-year graft and patient survival for the BM-augmented (single and multiple infusions) and control recipients are shown in Figure 11. With a minimum follow-up of one year, the maintenance immunosuppression required to maintain a rejection-free state has been lower in the BM-augmented group: daily TAC and steroid doses of 6.4±3.9 and 5.8±3.3 mg for the study group versus 7.1±5 and 6.2±3.7 mg for the control group. A steroid-free state has been achieved with a slightly higher frequency in the BM-augmented (64%) than in the control (59%) group. However, the differences were not statistically significant.
In the total study population, the cumulative risk of acute cellular rejection in the first year was reduced from 78% in the control population to 62% (p=0.037) (Fig. 12A). This effect was more pronounced with multiple BM infusions than with single BM infusions (Fig. 12B) (29). Of interest, the beneficial effect of BM augmentation was most obvious in heart allograft recipients, with a rejection-free rate of 62% and a steroid-resistant rejection rate of only 6.7%, compared with 18% and 8.6%, respectively, for the control group. There also was a significant reduction in the development of obliterative bronchiolitis with BM augmentation (3.8%) in lung allograft recipients, compared with contemporaneously acquired controls (31%) (26,29).
Adult recipients who were EBV positive and received grafts from EBV-negative donors developed PTLD with similar incidences in the study (22%) and control (20%) groups. The disease was reversible, however, with withdrawal or reduction of immunosuppression in all but 2 of the study patients. One of these patients recovered following immunotherapy with autologous lymphokine-activated killer (LAK) cells (30), while the other succumbed to disseminated disease. In the pediatric population, none of the study cohort (n=17) developed PTLD whereas both of the control patients developed the disease, which was fatal in one.
BM-augmented patients had a 92% incidence of peripheral blood chimerism compared with 50% in controls. These findings were consistent throughout the duration of follow-up. Semi-quantitative PCR studies in cases of male donor-to-female recipient showed that the levels of donor cell chimerism were at least 2 logs higher after BM augmentation, with the greatest difference following multiple BM infusions.
Serial one-way mixed leukocyte reaction (MLR) determinations were performed, using donor and third-party stimulator cells, when the recipient pretransplant blood samples and donor splenocytes were available (31). A higher incidence of donor-specific hyporeactivity was found in the BM-augmented liver (47%), kidney (65%), and lung (50%) recipients compared with the control cohorts, in which the hyporeactivity incidences were 21%, 38%, and 40%, respectively. This increased incidence of hyporeactivity was not seen, however, in heart, small bowel, kidney/pancreas, and multiorgan allograft recipients.
Historically, the recurrence of hepatitis B virus (HBV) infection after liver transplantation has been associated with unsatisfactory survival, especially in patients with active viral replication at the time of transplantation and in those who were not treated with long-term B-virus prophylaxis (32,33). The recent use of antiviral agents (e.g., lamivudine) and long-term use of passive immunoprophylactic therapy have improved early outcome and reduced the incidence of disease recurrence. However, long-term follow-up is required to assess the eventual efficacy of the current protocols.
In a recent study at our center, the clinical outcome of 183 liver transplant recipients with end-stage liver disease secondary to hepatitis C infection was compared with a contemporary control group of 556 patients who underwent transplantation for non-viral, non-malignant end-stage hepatic diseases (34). All patients were prospectively screened for anti-HCV antibodies and HCV RNA by reverse-transcriptase polymerase chain reaction. All recipients were receiving low-dose TAC-based immunosuppression.
Fifty-eight (32%) patients in the HCV group received interferon-alpha therapy at some point in their clinical course with no or partial response and without increasing the risk of rejection during the course of treatment (35). The incidence, time to recurrence, and response to interferon-alpha therapy did not differ significantly between the various genotypes in our liver transplant population, but there was a trend towards higher infectious morbidity and overall mortality in patients with genotype 1b (36,37).
The cumulative survival rates for the HCV group were 80% after one year and 75% after 3 years compared with rates of 84% and 78%, respectively, in the control group (p=0.452; Fig. 13A). Primary graft survival rates at the same time intervals for the HCV and control group were 72% and 77.5% at one year and 67% and 72% at 3 years, respectively (p=0.144; Fig. 13B). The incidence of retransplantation in the HCV group and the control group was 12.6% and 10.4%, respectively (p=0.42). The cumulative risk for retransplantation over time for both groups is shown in Figure 14. None of the retransplantations in the HCV recipient group during the study period were because of recurrent disease.
The actual risk and natural history of recurrent PBC after liver transplantation has been studied in a total of 421 consecutive adult recipients of 513 grafts who underwent hepatic replacement at our center for PBC (38). The diagnosis was confirmed in all patients by histologic examination of the hepatectomy specimen. The immunologic markers were typical for PBC in 79.7%. The immunosuppressive regimen was based on CYA for 339 grafts and on TAC for the other 174.
With a median follow-up of 5.8 years, the actuarial patient survival rate was 82% at one year, 74% at 5 years, and 63% at 10 years, with graft survival rates of 70%, 63%, and 54%, respectively. Disease recurrence was histologically documented in 54 grafts (10.5%). The criteria for recurrence included mononuclear cell portal inflammatory infiltrate, portal epithelioid granulomas, lymphoid aggregates in the portal tracts, bile duct damage, and/or florid duct lesions. At the time of diagnosis, these histologic changes in the allografts were early in all cases. The median time to diagnosis was 67 months from the time of liver transplantation. The estimated cumulative risk of disease recurrence using the Kaplan-Meier method was 1.1% at one year, 7.9% at 5 years, and 21.6% at 10 years (38).
With a median follow-up of 91 months from the time of transplant and 16 months from the onset of diagnosis, only one patient developed hepatic dysfunction serious enough to require graft replacement. The risk factors analyzed for disease recurrence included occurrence of acute/chronic rejection, recipient/donor age and sex, HLA mismatch, cold ischemia time, and immunosuppressive regimen. Using multivariate Cox regression, the independent risk factors were recipient age (p=0.02), cold ischemia time (p=0.02), and use of TAC-based immunosuppression (p=0.01). None of the current survivors with disease recurrence had clinical or biochemical evidence of significant graft dysfunction or extrahepatic disease recurrence during the time of observation.
A total of 303 consecutive recipients of 380 hepatic allografts treated for PSC during the last 16 years at our institute were studied (39). Follow-up evaluation included serial liver biopsies and cholangiographic studies in most of the recipients. The diagnosis of PSC recurrence was based upon the documentation of typical cholangiographic non-anastomotic multiple biliary strictures and obstructive cholangiopathy on biopsy in a graft without vascular compromise.
With a mean follow-up of 5.8 years (range: 0.0–15.4), the actuarial patient survival was 88% at one year, 73% at 5 years, 65% at 10 years, and 61% at 15 years with graft survival of 74%, 58%, 51%, and 48%, respectively. Recurrence of PSC was documented in 52 (13.7%) allografts with a mean onset of 3.9 years (range: 0.3–9.3). The cumulative risk, estimated by the Kaplan-Meier method, was 3.6% at one year, 12% at 5 years, 25.3% at 10 years, and 25.3% at 15 years (39).
With a median follow-up of 7.6 years from liver transplantation and 3.7 years from diagnosis, none of the patients died of recurrent disease, although 8 (15.4%) of the 43 survivors with recurrent PSC have already undergone retransplantation and 2 more are awaiting graft replacement. Older age (p<0.03) and the presence of inflammatory bowel disease (IBD) (p<0.04) were significant risk factors for recurrent hepatic disease. Donor age and sex, duration of PSC and IBD, HLA mismatch, cold ischemia time, and baseline immunosuppression were not significant risk factors for disease recurrence.
Only 2 of 91 consecutive liver recipients treated at our institution for end-stage AIH without other concomitant diseases (40) have developed histopathologic changes highly suggestive of recurrent AIH. The diagnoses of disease recurrence 5.0 and 5.5 years after transplantation were based upon histopathologic changes characteristic of the original disease, particularly the presence of hepatitic activity with plasmacytic infiltrates in the absence of viral infection. Neither graft was lost due to disease recurrence (40).
Primary hepatic tumors that commonly have been treated with hepatic replacement are hepatocellular carcinoma (HCC), cholangiocarcinoma (hilar, peripheral), fibrolamellar hepatoma, and hepatic epithelioid hemangioendothelioma.
To estimate the recurrence rate of HCC after liver transplantation and the associated risk factors, 214 liver recipients with this diagnosis were analyzed. Case accrual was from January 1981 to December 1992 (41). Of 178 patients who survived more than 150 days, 71 (40%) have suffered HCC recurrence. Based on 5 risk factors (gender, tumor number, lobar tumor distribution, tumor size, grade of vascular invasion), we developed artificial neural network models predicting the likelihood of HCC recurrence within one, 2, and 3 years after transplantation.
The following inferences were made. 1) In addition to the previously known risk factors, gender exerts a powerful effect on HCC recurrence. With identical tumor characteristics, female patients have a lower risk of recurrence than males. 2) The grade of vascular invasion is the single most influential risk factor for HCC recurrence. Conversely, the absence of vascular invasion with a maximum tumor diameter <4 cm ensures no recurrence, except in male patients with multiple tumors and bilobar distribution. 3) A bilobar tumor distribution has a strong impact on HCC recurrence. 4) In general, the larger the diameter of the largest tumor, the greater the risk of HCC recurrence. 5) Essentially all patients with positive margins die from recurrent disease within one year after transplantation.
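Read as a coarse triage rule, the inferences above might be sketched as follows. This is a hypothetical illustration built only from the statements in the text; the published models were artificial neural networks, and the function name and return labels are invented:

```python
def hcc_recurrence_triage(male, n_tumors, bilobar, max_diam_cm,
                          vascular_invasion, positive_margins):
    """Coarse risk triage from the five factors discussed in the text.

    Illustrative only; the actual predictions came from artificial
    neural network models, not a hand-written rule set.
    """
    if positive_margins:
        return "high"  # essentially all such patients recur within a year
    if vascular_invasion:
        return "high"  # the single most influential risk factor
    if max_diam_cm < 4 and not (male and n_tumors > 1 and bilobar):
        return "low"   # no vascular invasion, largest tumor < 4 cm
    return "intermediate"

print(hcc_recurrence_triage(male=False, n_tumors=1, bilobar=False,
                            max_diam_cm=3.0, vascular_invasion=False,
                            positive_margins=False))  # low
```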
The influence of postoperative chemotherapy on patient survival and recurrence was studied using Kaplan-Meier survival analysis. For the group predicted not to suffer HCC recurrence, postoperative chemotherapy did not improve patient survival. For the group with intermediate risk of recurrence, a definite but not statistically significant trend toward prolonged tumor-free survival was observed. The survival benefit of postoperative chemotherapy was equivocal among patients with a high risk of recurrence (Fig. 15).
This uncommon variant of hepatocellular carcinoma is distinguished by histopathologic features suggesting greater differentiation than conventional HCC. Over a period of 27 years (1968–1995), 13 patients with fibrolamellar hepatoma (FH) were treated with total hepatectomy and liver transplantation, after it was determined that curative partial hepatectomy was not feasible (42). In 6 of these 13 patients total hepatectomy was extended to adjacent organs; 2 of the 6 had upper abdominal exenteration and organ cluster transplantation. In one patient, liver transplantation was combined with pancreaticoduodenectomy (Whipple procedure). Cumulative survival rates at one, 3, 5, and 10 years were 95%, 45%, 35%, and 20%, respectively. Tumor recurrence was confirmed in 9 (69%) of the 13 patients 12–52 months after transplantation. The liver and lung were the most frequent sites of recurrence. Survival thereafter was up to 43 months.
The TNM stage at the time of transplantation was significantly associated with tumor-free survival. Although tumor-positive lymph nodes were associated with a shortened tumor-free survival, the most adverse factor was the presence of vascular invasion. As expected, total hepatectomy was a viable option when a subtotal hepatectomy could not be performed in patients with no vascular invasion and without extra-hepatic disease.
Cholangiocarcinoma is an uncommon malignant neoplasm arising from the bile duct epithelium. The tumor location is either hilar (Klatskin tumor) or peripheral (intrahepatic). Because of the rarity of this malignancy, its prognostic risk factors have not been analyzed completely enough in past reports to permit prospective assessment of the risk of recurrence and its impact on survival. Our 15-year (1981–1996) experience with a total of 38 patients who had hilar cholangiocarcinoma and underwent liver transplantation with curative intent was analyzed (43).
In 27 of these patients, subtotal hepatectomy could not be performed either because of the extent of the tumor (n=14) or because of concomitant advanced cirrhosis (n=13). The remaining 11 patients underwent upper abdominal exenteration and cluster organ transplantation (liver with various combinations of pancreas, duodenum, and short segments of proximal jejunum) under even less favorable circumstances: lymph node metastases, direct tumor extension into adjacent organs, or regional metastasis (n=10). Dense fibrotic reaction caused by preoperative radiation therapy made the intraoperative assessment of tumor impossible in one of these patients with sclerosing cholangitis. In about half of the 38 patients, adjunct external radiation therapy was given with or without 5-FU sensitization before or after the operation.
Survival rates for one, 3, and 5 years after the hepatic or more complex transplantation procedures were 60%, 32%, and 25%, respectively. Tumor recurrence was diagnosed in about 50% of the cases. Although the exact timing of recurrence was often difficult to determine, most of the recurrences were diagnosed within the first 3 years. The most common sites of recurrence were the hepatic hilum, liver, lung, bone, and skin. Univariate analysis revealed that T-3, positive lymph nodes, positive surgical margins, and pTNM stage III and IV were statistically significant adverse factors. With multivariate analysis, pTNM stage 0, I and II, negative lymph nodes, and negative surgical margins were statistically significant good prognostic variables. It was concluded that a 5-year survival rate of 50% could be achieved by total hepatectomy and liver transplantation for hilar CC when lymph nodes and surgical margins are free of tumor and in the absence of distant metastases (43).
This conclusion was validated in an independent analysis of a subset of 20 patients with the diagnosis of peripheral CC treated during a slightly different period (1981–1994) and followed longer (44). With a median follow-up of 6.8 years, the overall tumor-free patient survival rates were 57% at one year, 34% at 3 years, and 27% at 5 years. However, when patients with the adverse prognostic factors of positive margins, multiple tumors, or lymph node involvement were excluded, patient survival rates at one, 3, and 5 years were 74%, 64%, and 62%.
This multifocal, low-grade malignant neoplasm is characterized by its epithelial-like appearance and vascular endothelial histogenesis. The outcome of 16 patients treated with liver transplantation was studied to assess the potential risk of recurrence (45). With a median follow-up of 4.5 years (range: 1–15) after transplantation, the actuarial patient survival was 100% at one year, 88% at 3 years, and 73% at 5 years, with disease-free survival rates of 81%, 69%, and 60%, respectively. Involvement of the hilar lymph nodes or vascular invasion did not affect survival. The most common sites of recurrence were the liver, lung, and bone.
The potential value of composite abdominal visceral transplantation for patients with combined liver/intestinal and other abdominal visceral organ failure has been established (46–48). These procedures also may find a place in the treatment of intra-abdominal neoplasms (49,50, also see preceding section). To test the immunoprotective effect of the concomitantly transplanted liver on the intestine, we have included the isolated intestinal grafts in the present analysis to serve as a control group.
Between May 2, 1990 and August 11, 1997, a total of 63 consecutive patients (43 children, 20 adults) received either combined liver-intestinal (n=48) or multivisceral grafts (n=15), and 35 patients (16 children, 19 adults) underwent intestinal transplantation alone. All of the 48 liver-intestine recipients and the 13 multivisceral recipients whose grafts included the liver (in 2 others it was excluded) had advanced hepatic disease that was TPN-induced in most cases. Ischemia of the upper abdominal organs (adults) and diffuse irreversible gastrointestinal diseases involving the foregut (mostly children) were the usual reasons for choosing multivisceral in preference to liver-intestine replacement. Retransplantation became necessary in 6 patients: 4 with composite grafts and 2 with isolated intestinal grafts.
The clinical features of the allograft recipients are summarized in Table 4. The causes of intestinal failure were short gut syndrome (80%), dysmotility syndrome (11%), intestinal neoplasm (6%), and enterocyte dysfunction (3%), with equal distribution among the composite visceral and isolated intestinal allograft recipients. The causes of the short gut syndrome were variable: predominantly thrombotic disorders, Crohn's disease, and trauma in adults, and mostly volvulus, gastroschisis, necrotizing enterocolitis, and intestinal atresia in children. Interestingly, 3 of the pediatric patients had undergone liver replacement 4–10 years prior to intestinal transplantation (46). Forty-six (73%) of the 63 composite allograft recipients were United Network for Organ Sharing status I (intensive care unit bound) or II (permanently hospital bound).
Donor characteristics and retrieval operations have been described previously (46,49,51). Over the past 2 years, the standard harvesting technique for combined liver-intestinal grafts has been modified to include the duodenum in continuity with the jejunum and the allograft biliary system (Fig. 16). The technique reduces the recipient operative time and avoids the potential risks of biliary reconstruction. To avoid humoral GVH complications (i.e., from isoagglutinins), every effort is made to find donors whose ABO blood types are the same as the recipients'.
The principles and various modifications of the generic liver-intestine and multivisceral procedures have been reported elsewhere (49,50,52–54). When combined liver-intestinal transplantation was performed, a preliminary portocaval shunt of the native vessels was constructed to alleviate portal hypertension and minimize bleeding during dissection and removal of the host organs. The shunt was left permanently in 69% of the operations; in the remaining 31%, it was disconnected after graft revascularization and the host portal vein was anastomosed to the side of the allograft portal vein.
Basic immunosuppression was with TAC and prednisone. The perioperative management, with emphasis on monitoring of graft rejection, and the long-term graft function of these patients have been updated elsewhere (46). The humanized anti-IL-2 receptor monoclonal antibody “Zenapax” has recently been utilized as induction immunoprophylaxis. In recent cases, donor leukocyte augmentation to facilitate acceptance of the allograft has been utilized whenever informed consent was obtained from both donor and recipient families. Intestinal recipients for whom permission for bone marrow harvest could not be obtained were considered prospective contemporaneous control patients.
The preparation of the “unpurged” BM cell infusate from the donor thoracolumbar vertebrae and the infusion method have also been described elsewhere (27). The methods used for diagnosis and treatment of rejection and graft-versus-host disease were previously described (46,55). Protocols for prophylactic and active treatment of viral, bacterial, and fungal infections were adopted from those developed for liver transplantation (54). The newly developed semiquantitative PCR assay of Epstein-Barr virus (EBV) in the peripheral blood has recently been used for early detection and monitoring of EBV viremia, which forewarns the development of PTLD.
With a mean follow-up of 32 ± 26 months (range, 1–86), the overall patient survival was 72% at one year and 48% at 5 years, with graft survival rates of 64% and 40%, respectively (Fig. 17). Most of the deaths and/or graft losses occurred within the first 30 postoperative months. The most common causes of failure were infection, rejection, and PTLD. Long-term engraftment was better achieved with the composite grafts compared with the intestine only (Fig. 18).
Patients with BM augmentation had 89% patient survival at 6 months and 72% at 2 years compared with 77% and 57% in the control group (Fig. 19A). Importantly, no mortality occurred beyond the first postoperative year in the BM group, and only one of the 4 deaths was from a classic complication of immunosuppression. In the control group, 4 of the 7 deaths were from opportunistic infections, and a fifth was from intractable rejection.
Survival of the BM-augmented grafts was 75% at 6 months and 61% at 2 years compared with 77% and 52% in the control group (Fig. 19B). It should be noted, however, that 2 graft losses in the experimental group were caused by management or technical errors. Acute rejection was the cause of graft loss in one (5%) of the BM-augmented and 2 (9%) of the control grafts.
The intestine contained in a composite graft had a significantly lower incidence of rejection (66%) compared with the isolated intestine (92%) during the first 30 postoperative days (p=0.004). The median postoperative time to the first episode of intestinal rejection was 9 days for the isolated intestine and 19 days for the composite grafts (p<0.001). OKT3 was required to treat steroid-resistant rejection in 20 (54%) of the isolated intestine and 15 (23%) of the composite grafts (p=0.002).
Acute vascular rejection was histologically documented in 3 of the isolated intestine and none of the composite grafts. Although the cumulative rejection rate of intestine contained in a composite graft approached that of isolated intestine during the rest of the first postoperative year (Fig. 20A), the rate of composite graft loss from rejection was less than half that of the isolated intestine (Fig. 20B). Among recipients of composite grafts, the incidence of liver rejection was less than half that in the intestine (Fig. 21). The median time to diagnosis of the first episode was 58 days, and the mean number of episodes in all cases was 0.78±1.5.
Chronic rejection was histologically diagnosed in 3 enterectomy specimens for an incidence of 8% with isolated intestinal transplantation. The only example of chronic rejection of a composite graft was in a patient whose donor was strongly crossmatch positive; both the liver and intestine were affected and the recipient died of hepatointestinal failure. Because the diagnosis of chronic rejection requires study of a transmural specimen (not deliberately obtained in biopsies), the overall incidence of chronic rejection undoubtedly was grossly underestimated (46).
Bone marrow augmentation increased the risk of acute rejection (85%) during the first postoperative month compared with the control patients (64%; p=0.20) but the overall incidence, frequency and median time of onset of the first episode were similar in the 2 groups. OKT3 was used to control rejection in 4 (20%) of the BM recipients and 9 (41%) of the control patients (p=0.2).
The risk of CMV disease in the isolated and composite graft recipients was similar, with a combined incidence of 36%. There was no difference in the incidence and onset of CMV disease in patients given the adjunct BM treatment (35%) versus control patients (44%; p=0.8).
A total of 20 patients (20%) developed EBV-associated PTLD. With an incidence of 33%, the multivisceral recipients were more prone to PTLD than the recipients of liver-intestine (21%), or the intestine only (11%; p=0.2). In the BM augmentation trial period, the incidence was 10% in the experimental group and 18% in the control patients (p=0.60). By univariate and multivariate analyses, the 3 significant risk factors for the development of PTLD were young age (children), type of intestinal graft (multivisceral), and recipient splenectomy (46).
Skin changes suspicious for GVHD were observed in 11 patients, but histologically verified in only 5 (5%). Two of the 5 patients had received intestine only and the other 3 were composite allograft recipients. The GVHD was lethal in only one patient, a previously reported child (56) with preexisting IgA deficiency who received a liver-intestine graft. An adult recipient of a multivisceral graft developed chronic GVHD (diagnosed by biopsy of the buccal mucosa) and eventually died of disseminated PTLD with the chronic GVHD still active. The disease was self-limited in the other 3 patients, one of whom had received adjunct BM (46).
The nonavailability of either an appropriate monoclonal antibody or a donor-specific primer precluded the analysis of chimerism in 4 BM-augmented and 7 control patients. These studies were feasible in 31 patients. At their most recent follow-up, 16 (100%) of 16 BM-augmented and 12 (80%) of 15 non-augmented recipients had donor cell blood chimerism. The BM-augmented recipients had much higher levels of donor cell chimerism than control patients (46).
With univariate and multivariate analyses, 6 risk factors had been previously reported (57): high TAC blood level, steroid bolus therapy, use of OKT3, length of operation, CMV disease, and inclusion of a segment of colon with the graft. In addition, the recent analysis has added, as adverse factors, the number of intestinal rejection episodes, development of PTLD, cold ischemia time, number of previous abdominal operations, and male donor-to-male recipient.
Outcome analysis of the last 115 intestinal and multivisceral allografts showed significant improvement in survival during the last 4 years with a cumulative survival rate of 65% at 4 years (Fig. 22). Such an achievement may reflect the recent refinement in patient selection, operative techniques, immunomodulation, and postoperative management (48).
The total loaded cost of the 3 different types of intestinal transplantation has been significantly reduced during the past 4 years. Between 1990–1994, the average cost was $203,111 for the isolated intestine, $252,453 for the combined liver-intestine, and $284,452 for the multivisceral procedure. These have been reduced to an average of $132,285, $214,716, and $219,098, respectively.
Because of the lack of an alternative life-saving treatment for patients with hepatointestinal failure, it is difficult to put a comparative price on hepatointestinal or multivisceral transplantation. However, intestinal transplantation alone can be compared with maintenance TPN. Based on Medicare data, the average yearly cost of TPN in 1992 was more than $150,000 per patient, not including the cost of frequent hospitalization, medical equipment, and nursing care (58). The total dollars spent on TPN are escalating yearly because of the growing population of home- and hospital-bound patients receiving this treatment. Based upon these data, isolated intestinal transplantation becomes cost-effective by the second posttransplant year.
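The break-even claim can be checked with simple cumulative-cost arithmetic. The sketch below uses only the figures quoted above, and deliberately ignores the TPN-associated hospitalization, equipment, and nursing costs, which would only move break-even earlier:

```python
def breakeven_year(transplant_cost, tpn_cost_per_year, horizon=10):
    """First posttransplant year by which cumulative TPN spending
    would have met or exceeded the one-time transplant cost."""
    for year in range(1, horizon + 1):
        if tpn_cost_per_year * year >= transplant_cost:
            return year
    return None  # not reached within the horizon

# 1990-1994 average isolated-intestine transplant cost vs. the 1992
# Medicare TPN estimate of roughly $150,000 per patient per year
print(breakeven_year(203_111, 150_000))  # 2
# At the reduced recent cost, break-even arrives even sooner
print(breakeven_year(132_285, 150_000))  # 1
```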
With a mean follow-up of 3.3 years (range: 1–94 months), full nutritional autonomy with complete discontinuation of TPN was successfully achieved in more than 90% of the recipients. Thirty-one patients are alive with good nutrition beyond the third postoperative year and 18 are beyond the 5-year milestone (48). These individuals have reported improved quality of life measures in comparison to TPN dependency (59,60).
The immune system did not evolve to frustrate transplant surgeons; it arose through a survival-driven co-evolution with environmental microorganisms (61–65). Nevertheless, because of the central role of the major histocompatibility complex (MHC) in T-cell recognition, the immunology of transplantation and that of infectious disease have much in common (66). Thus, the barriers to xenotransplantation can be defined in the context of infectious disease. From the infectious perspective, the immune system makes strategic decisions comparable to those following transplantation, based on its early distinction of cytopathic from the less dangerous non-cytopathic microorganisms.
Non-cytopathic microorganisms, which are characterized by their entry into host cells, can be accommodated by the host in ways that allow both pathogens and host cells to survive. Here, the highest priority is avoidance of tissue damage; otherwise, complete immune destruction of widely disseminated pathogens could kill the host. The sophisticated recognition and effector mechanisms against overwhelming non-cytopathic infections (e.g., hepatitis B or C virus) are much the same as those which have been successfully subverted for organ allotransplantation with the aid of non-specific immunosuppressive drugs (see section on “organ engraftment and acquired tolerance”).
In contrast, the full force of the host innate and adaptive immune capability is mobilized quickly and completely to eliminate cytopathic pathogens, without much regard for damage to infected host cells. The antigenic signal issued by a cytopathic infectious invader to B cells may come from its densely arranged and ordered repetitive epitopes, sometimes aided by lipopolysaccharides or by other unknown means (summarized in 61,63,64). The first line of defense is dominated by interferons, macrophages, gamma-delta T cells, “natural” killer (NK) cells, and B cells, which recognize the suspect antigen patterns and may be activated directly without T-cell help. In addition, non-specific or less specific effector mechanisms such as complement, interleukins, and phagocytes are promptly involved.
Similar mechanisms, which are predominantly those of innate immunity, are thought to be responsible for the hyperacute rejection of xenografts, and of allografts transplanted to recipients who have circulating antigraft antibodies because of ABO incompatibility or prior allosensitization (67). The best characterized antigen on the cells of discordant xenografts is the surface carbohydrate epitope Gal-α(1,3)Gal (68,69), which is chemically similar to ABO antigens and is found on numerous bacteria, protozoa, and viruses (70).
The transfection of human complement regulatory proteins into discordant donor species (71–73) temporarily delays xenograft destruction, but the other mechanisms of innate immunity promptly cause inexorable rejection (74,75). Successful xenotransplantation from such donors will require different and combined approaches whereby antigens are eliminated, or human equivalents are introduced.
Gene knockout procedures have not been done in the pig. However, using technologies that may be applicable in pigs, Osman et al (76), working in COS cells transfected with the Gal-α(1,3)Gal gene, have been able to reduce the cell surface expression of the gene product to negligible levels. This was done by a further double transfection with human α-galactosidase and α-1,2-fucosyltransferase cDNAs. α-Galactosidase cleaves the α-linked galactosyl residues of the epitope, thereby exposing subterminal saccharides to which there also are natural antibodies. However, the α-1,2-fucosyltransferase results in the substitution of Gal-α(1,3)Gal with the non-immunogenic H substance (i.e., the universally tolerated O blood group antigen) (77) and eliminates complement-mediated lysis of the transgenic COS cells by human serum (76).
Because what must be done to succeed with xenotransplantation has become clear, the prospect of using animal organs for human transplantation is brighter than at any previous time (78). We have already learned empirically with allotransplantation how to work with the defense mechanisms developed by nature to control noncytopathic infections. Because this approach will not work for the xenogeneic antigens that resemble cytopathic microorganisms, these antigens in discordant animal donors will have to be deleted or changed. It remains to be seen whether species restriction of complement (67,79–81) will necessitate transfection of complement regulatory proteins to prevent continuous complement activation. If so, strategies for xenotransplantation of the liver will be more complex because this organ is the source of most complement. Finally bridge trials with xenografts to provide desperately needed temporary organ function for candidates waiting for allografts may be the best way to obtain information about the efficacy of donor species alteration.
The unwarranted assumption that stem cell-driven hematolymphopoietic chimerism was irrelevant to successful whole organ transplantation as currently practiced led to alternative inadequate explanations of organ allograft acceptance, clouded the meaning of successful bone marrow transplantation, and precluded for more than 3 decades the development of a cardinal principle applicable to all aspects of transplantation. Recognition of this error and the incorporation of the chimerism factor into transplantation biology have allowed previous enigmas of organ as well as bone marrow engraftment to be explained and should allow past and future discoveries by basic immunology to be more meaningfully exploited to advance clinical transplantation, including strategies of xenotransplantation.