Between 1955 and the end of 1967, the framework of clinical organ transplantation that exists today was established in a small number of centers in continental Europe, Great Britain, and North America. Here, I will describe the events during this period that led to human liver replacement. These efforts were influenced by, and in turn contributed to, the development of other kinds of organ transplantation, and especially that of the kidney.
As described by the immunologist Leslie Brent1 and the Glasgow surgeon-historian David Hamilton,2 transplantation of all the major organs except the liver can be traced back to the early 1900s. In contrast, liver transplantation was not mentioned in the literature until 1955. The first report was in a journal called Transplantation Bulletin, the forerunner of the current Transplantation.
In this one-page article, C Stuart Welch of Albany Medical College (Fig. 1) described the insertion of a hepatic allograft in the right paravertebral gutter of dogs, without disturbing the native liver.3 A more complete report was published in Surgery the following year.4 The auxiliary livers were revascularized by anastomosing the graft hepatic artery to the recipient aortoiliac system, and by end-to-end anastomosis of the portal vein to the host inferior vena cava (Fig. 2). The transplanted organs underwent dramatic shrinkage, a finding that was incorrectly considered for most of the next decade to be a special feature of liver rejection.
In 1957, Welch gave a lecture on his experimental operation during a visiting professorship at the University of Miami Medical School, where I was a general surgery resident. Because he had provided the auxiliary grafts with high-flow input of systemic venous blood from the recipient inferior vena cava, Welch was convinced that his transplanted livers were optimally revascularized.
Contrary to this assumption, I had been exploring the possibility that the first-pass delivery of endogenous insulin from the pancreas to the liver by portal blood was important in metabolic cross-regulation of the two organs. Evidence consistent with this hypothesis had come from studies of Eck’s fistula (portacaval shunt) and reverse Eck’s fistula.5,6 If the hypothesis was correct, the Welch procedure was physiologically flawed.
To pursue the metabolic studies, I had developed a new method of total hepatectomy.7 The unique feature of the procedure was preservation of the retrohepatic inferior vena cava, as is done in today’s piggy-back modification of liver transplantation8-10 (Fig. 3). Reimplantation (autotransplantation) of the excised specimen was soon envisioned as an ideal way to study the portal physiology of an unequivocally denervated liver that was devoid of cryptic collateral arteries. Welch had obviated the need to anastomose multiple hepatic veins by including as part of his auxiliary allografts the short length of vena cava into which all of these hepatic veins empty and by connecting the upper end of the caval stump to the recipient vena cava (Fig. 2).
For liver replacement, it was easiest to excise the host retrohepatic vena cava along with the native liver, and to replace it with the comparable caval segment of an allograft (Fig. 4). Restoration of caval continuity required end-to-end anastomoses: one at the diaphragm and the other below the liver. The performance of everting anastomoses in a confined space without the need for long vascular cuffs was made feasible by perfection of the intraluminal continuous suture technique used today (Fig. 5). Hepatic arterial and biliary tract anastomoses were done with conventional methods.11
At first, none of the animals survived the operation. This finally was accomplished in June, 1958, a few days after I moved from Miami to Northwestern University in Chicago. During the rest of the summer, the different kinds of liver revascularization studied in Miami in non-transplant models were systematically tested in allografts (Fig. 6). Any alteration of the portal supply resulted in reduced survival. Although the findings were congruent with the original hypothesis that splanchnic venous blood contains liver-modulating factors, this issue was not fully resolved for another 15 years.
Two administrative steps were taken at the end of that summer that ensured crucial research support for at least 5 years. The first was submission of a four-page NIH grant request for continued investigation of the liver’s role in insulin and carbohydrate metabolism that included liver transplantation. In addition, my Northwestern chairman, Loyal Davis, nominated me for a Markle Scholarship; the purpose of these awards was to keep young faculty members in academic medicine in pursuit of some stipulated career objective. My proposal was the development of clinical liver transplantation. Just before Christmas, 1958, I was notified that the grant would be fully funded, and that I was to be a Markle scholar.
Unknown to us in 1958, our attempts at liver replacement had been preceded by those of a UCLA surgeon named Jack Cannon (Fig. 7). Cannon, now 83 years old, later practiced in Phoenix, where he has since retired. In collaboration with William P Longmire Jr, Cannon already had made important basic observations about spontaneous tolerance in a neonatal chick model of skin transplantation, and the facilitation of such tolerance with adrenal corticosteroids.12 The significance of this neglected work is discussed elsewhere.13 His liver transplant experiments were mentioned in a one-page review14 entitled “Brief Communication,” published in 1956 in the same Transplantation Bulletin as Welch’s report of the year before.
Cannon’s description did not specify the species studied (presumably dog) and contained no technical details. Cannon acknowledged Welch’s report as his inspiration, and alluded to “…several successful operations… (liver replacements) without survival of the ‘patient’….” In a prophetic comment, he suggested that “…the liver undoubtedly has a great deal to do with the production of the homograft reaction and probably with the inception and maintenance of tissue specificity. Replacement transplantation of intact liver, therefore, might well lead to interesting results.”
In early 1959, I learned that a team headed by the late Francis D Moore (Fig. 8) had begun the development of canine liver transplantation at the Peter Bent Brigham Hospital in June or July 1958, at the same time as my own first successful experiments. By the end of the summer of 1958, the Boston team had done six liver replacements. These were reported in a 1959 issue of Transplantation Bulletin.15 I first met Moore at the 1960 meeting of the American Surgical Association, where I discussed his presentation.16 By then, the cumulative total of canine liver replacements in the 2 laboratories had increased to 111—31 in Boston and 80 in Chicago. The results were published separately in 1960 in different journals.11,16
The two prerequisites for perioperative survival of canine recipients were identified in both the Boston and Chicago laboratories. The first requirement was prevention of ischemic injury to the allograft. This was accomplished in Boston by immersing the liver in iced saline, a method independently used for preservation of intestinal and cardiac allografts by Lillehei and colleagues17 and by Lower and Shumway,18 respectively.
Our exploitation of hypothermia in Chicago reflected the influence of F John Lewis, professor of surgery at Northwestern, who, with his Fellow, Norman Shumway, had pioneered total body hypothermia for open heart surgery at the University of Minnesota (Fig. 9). The livers were cooled by intravascular infusion of chilled solutions (Fig. 10), using thermal probes to monitor core temperatures (Fig. 11). Interestingly, this now universal practice had never been done before, apparently because of fear of damaging the microcirculation. Better liver preservation was later obtained with infusates of differing osmotic, oncotic, and electrolyte composition: eg, the Collins,19 Schalm,20 and University of Wisconsin (UW) solutions21-23 that originally were developed for kidney transplantation.24,25
The second prerequisite for successful canine liver transplantation was avoidance of damage to the recipient splanchnic and systemic venous beds, the drainage of which was obstructed during host hepatectomy and graft implantation. This was accomplished in both the Boston and Chicago laboratories with decompressing external venovenous bypasses (Fig. 12).
Until 1960, the kidney had been the only organ allograft whose unmodified rejection had been systematically studied. Most transplanted canine livers were destroyed in 5 to 10 days. The survival of our dogs that lived for at least 4 days is shown in Fig. 13. The histopathologic studies were done in Chicago by Donald Brock26 (now retired in Northville, MI) and in Boston by the late Gustav Dammin27 (Fig. 14). Typically, a heavy concentration of mononuclear cells was seen in the portal triads, and within and around the central veins. Hepatocyte necrosis was extensive.
A curious observation was made, however, in our 63rd liver replacement experiment.26 The recipient’s serum bilirubin reached a peak at 11 days and then progressively declined (Fig. 15, dashed line). The predominant histopathologic findings in the allograft by the 21st day were more those of repair and regeneration than of rejection. This was, to my knowledge, the first recorded exception to the existing dogma (based on skin graft research) that rejection, once begun, was inexorable. Five years later, the London pathologist, KA Porter (Fig. 16), described similar findings in allografts of the first long-surviving canine liver recipients whose rejections had been reversed under immunosuppression in Denver.28 In 1969, he extended his observations to human liver allografts.29
Because Porter’s previous principal research had been kidney transplantation,30 he was now able to sort out features of rejection that were common to both organs (and various other allografts) in unmodified and immunosuppressed recipients, and to distinguish these changes from those that were specific to the different kinds of organ allografts. Under the leadership of AJ Demetris at the University of Pittsburgh (Fig. 16), the field of clinical transplantation pathology rose from the base laid by the earlier workers.
The studies completed in Boston and Chicago defined, almost to the last detail, the liver replacement operation soon to be performed in humans (Fig. 17). By the end of 1959, we also had developed the operation of multivisceral transplantation. Here, the allograft consisted of the liver and all of the other intraperitoneal organs (Fig. 18).31,32 Essentially all of this work, and the development of liver transplantation, was done with the help of Harry A Kaupp Jr, a skillful general surgery resident, who practiced vascular surgery in Allentown, PA, until his retirement. Two medical students (Robert Lazarus and Robert Johnson) rounded out the team.
Two further observations about rejection were made in the multivisceral experiments that were validated much later in rodent studies33,34 and in humans.35 First, rejection of the different organs transplanted together with the liver was less severe than rejection of the same organs transplanted alone. Second, there was histopathologic evidence of a widespread graft versus host reaction in recipient tissues, but without overt graft versus host disease.
Multivisceral transplantation and its modifications (Fig. 19) were applied in humans 30 years later36-39 and are now part of the conventional armamentarium of advanced organ transplant centers. When the operation was first presented at the Surgical Forum of the American College of Surgeons in October 1960,31 it was lampooned. In fact, all surgical research in transplantation of the 1958 to 1960 era was considered naïve or wasteful by many critics and especially by basic immunologists, most of whom viewed the immune barrier as impenetrable.
Just as this kind of surgical research in unmodified animals was losing momentum, it was dramatically revitalized by six successful human kidney transplantations performed between January 1959 and February 1962, first by Joseph Murray in Boston,40 and then by the teams of Jean Hamburger41 and Rene Kuss42 in Paris (Fig. 20). Although “success” was defined as survival for at least 1 year, the first two recipients (both of fraternal twin kidneys) had continuous graft function for more than two decades without post-transplant immunosuppression. All six patients had been conditioned before transplantation with sublethal doses of 450R total body irradiation (Table 1, above dashed line).
In an extension of the host preconditioning concept, the urologist Willard Goodwin (Fig. 21) performed six human kidney transplantations at UCLA in 1960-1961 in which host cytoablation was done with myelotoxic doses of Cytoxan and methotrexate instead of total body irradiation.43 Although five of the six recipients came to an early death, Goodwin successfully reversed several rejections with prednisone during the 143-day survival of his third patient, whose kidney was donated by her mother in September 1960. This crucial observation was not reported until 1963 and was not known to us until then.
In any event, it quickly became apparent that the Boston and French successes with cytoablation for kidney transplantation, remarkable though they were, would not be a bridge to liver transplantation. In our hands, total body irradiation precluded even perioperative, much less extended, survival of canine liver recipients.44
A sea change occurred with the arrival of the drug 6-mercaptopurine (6-MP). The key observation was that immune depression under 6-MP did not depend on overt bone marrow depression.45 The potential value of the drug in transplantation was first demonstrated in a rabbit skin graft model by Schwartz and Dameshek46 at Tufts Medical School in Boston, and by the research team of Robert Good at the University of Minnesota.47 Prolongation of survival of canine kidney allografts under 6-MP was reported soon after by Roy Calne (in London)48 and Charles Zukoski (in Richmond)49 (Fig. 22).
By the end of 1960, Calne (by now in Boston with Murray)50,51 and Zukoski (with David Hume in Richmond)52 obtained survival of canine kidney recipients for 100 days or more under treatment with 6-MP. Even better results soon were reported by Calne50 with azathioprine, an imidazole derivative of 6-MP. When clinical kidney transplant trials with the new drugs were begun in Boston in 1960 and 1961, the possibility of transplanting the human liver seemed close at hand.
William R Waddell (Fig. 23) left the Massachusetts General Hospital on July 1, 1961, to become chair of surgery at the University of Colorado in Denver, where I joined him from Northwestern a few months later. Even before his departure, we had settled on clinical liver transplantation as our highest priority. The plan was shelved in early 1962 when we learned of disappointing results in the clinical trials of kidney transplantation in Boston53,54 and England.54 A ray of hope could be found, however, in a report by the future Nobel laureate, Murray, in the September, 1962, issue of the Annals of Surgery.53
The article contained a description of a kidney that had functioned under azathioprine therapy for 120 days, from the time of its transplantation from an unrelated donor on April 5, 1962. That kidney still functioned at 11 months,54 and was destined to support dialysis-free life of the recipient for 17 months. Although this was the only patient who lived for as long as 6 months, he became the 7th human to survive more than 1 year after kidney transplantation, and the first to do so without total body irradiation (Table 1, below the dashed line).
In the meantime, we had obtained our own supply of azathioprine in the spring of 1962, and systematically evaluated it with the simpler canine kidney model rather than with liver transplantation. Many of the experiments were done with Tom Marchioro (Fig. 24), subsequently a revered professor of surgery at the University of Washington (1967–1995). As in other laboratories, our yield of 100-day canine kidney transplant survivors was small.
But two crucial findings were clinically relevant. First, canine kidney rejection developing under azathioprine invariably could be reversed with the addition of large doses of prednisone.56 The second key observation was that a mean survival of 36 days in dogs treated with azathioprine was almost doubled when the animals also were pretreated with the drug for 7 to 30 days57 (Table 2). We committed to clinical trials of kidney and liver transplantation, in that order. Daily doses of azathioprine were to be given for 1 to 2 weeks before and after transplantation, with the addition of prednisone only to treat rejection (Fig. 25). The renal program was opened in the autumn of 1962.
The two features of the adaptive immune response to allografts that eventually would make transplantation of all kinds of organs feasible were promptly recognized. These were described in the title of the report of the first 10 Colorado kidney cases: the reversibility of rejection, and more importantly, the subsequent development of donor-specific tolerance.58 “Tolerance,” which referred to the time-related decline of need for maintenance immunosuppression, proved to be the correct word.
Nine of the 46 recipients of kidney allografts from live-related donors (20%) remained dialysis-free for 4 decades, all but one with normal renal function throughout. Seven of the nine became immunosuppression-free for periods ranging from 2½ to 38 years (Fig. 26). One of the nine was recently murdered in a love triangle and had a normal transplanted kidney at coroner’s autopsy. Those remaining bear eight of the nine longest surviving kidney allografts in the world today, including the four longest.59
Although the maximum followup of our first human renal recipient was only 6 months in the spring of 1963, our kidney experience triggered the decision to go forward with the infinitely more difficult trial of liver transplantation. The first attempt, on March 1, 1963, was in an unconscious and ventilator-bound child with biliary atresia who bled to death during the operation. The next two recipients, both adults, died 22 and 7.5 days after their transplantations on May 5 and June 3, 1963, for the indication of primary liver malignancies (Table 3). Both were found at autopsy to have extrahepatic micrometastases. The three failed cases were described in the December 1963 issue of Surgery, Gynecology, and Obstetrics,60 2 months after the optimistic kidney report in the same journal.58
Immunosuppression was the same as that used for our kidney recipients. Pretreatment was begun with azathioprine with or without small doses of prednisone. The same therapy was continued after transplantation. With evidence of rejection, a high-dose course of prednisone was added (Fig. 27). Rejections, which were monitored by serial serum bilirubin concentrations, were easily reversed (Fig. 28). The transplanted livers retrieved at autopsy were remarkably free of rejection, as noted in the caption describing the histopathology of the 22-day graft (Fig. 29, right).
Efficient allograft preservation was accomplished by transfemoral infusion of a chilled perfusate into the aorta of the non-heart-beating donors after cross-clamping the aorta at the diaphragm61 (Fig. 30). The procedure was the same in principle as the first stage of the “flexible” multiple organ procurement operation currently used worldwide.62 There was very little ischemic damage to the allografts during their postmortem intervals of 2½ to 8 hours, as indicated by modest increases in the liver injury tests (Fig. 31).
The various anastomoses were performed in the same way as in the canine experiments (Fig. 32). The lethal mistake in the human cases was the use of passive venovenous bypasses (Fig. 32). Emboli formed in the bypass tubing, migrated to the lungs, and caused or contributed to the deaths of all 1963 Denver recipients who survived the operation (Table 3). Overzealous correction of clotting abnormalities may have contributed to the complication. In much the same way as today, coagulation had been monitored with serial thromboelastograms and corrected with blood components and with epsilon-aminocaproic acid (an analogue of the currently used aprotinin).
The supreme irony was that the venous decompression that had been critical in the dog experiments is not mandatory in most human liver recipients. The motor-driven venovenous bypass system, which was introduced in Pittsburgh in the 1980s,63 made the procedure easier. In some centers, however, it now is used only selectively.
During the last half of 1963, two more attempts at liver transplantation were made in Denver,64 and one each in Boston65 and Paris66 (Table 3). Clinical activity then ceased for 3½ years. The worldwide moratorium was voluntary. The decision to stop was reinforced, though, by widespread criticism of attempting to replace an unpaired vital organ with an operation that had come to be perceived as too difficult to ever be tried again.
In contrast, kidney transplantation thrived at the University of Colorado. In 1964, a textbook was produced based on our first 70 cases, emphasizing that renal transplantation had reached the level of a bona fide clinical service.67 At the beginning of 1963, the only three clinically active kidney transplant programs in the United States were at the long-standing Brigham center and the two opened in 1962: ours and David Hume’s in Richmond, VA. One year later, nearly 50 kidney teams had started or were gearing up. A similar proliferation was going on throughout Europe.
Advances were made between January 1964 and the summer of 1967, most of which were applicable to all organs.
In a clinical collaboration with Paul Terasaki of UCLA (Fig. 33), it was shown that the quality of HLA matching short of perfect compatibility had little association with kidney transplant outcome.68-70 By inference, desperately ill liver, heart, and other transplant candidates who could not wait for a well-matched organ would not suffer a significant penalty by receiving a mismatched one.
A second objective was to improve immunosuppression. Antilymphocyte globulin (ALG) was prepared from antilymphocyte serum (ALS) obtained from horses immunized against dog and human lymphoid cells.71 After its development and testing in dogs between 1963 and 1966, human-specific ALG was introduced clinically in 1966 in combination with azathioprine and prednisone (the “triple drug cocktail”).
In the preclinical canine studies, the efficacy of dog-specific ALG had been demonstrated when it was given before, at the time of, or after kidney and liver transplantation. It was noted that “…pretreatment [with ALG] did appear to be of value in the canine experiments, and was accordingly made part of the protocol used for patients.”71 The conclusion about the efficacy of pretreatment was consistent with previous rodent studies of ALS by Woodruff and Anderson72 and by Monaco, Wood, Gray, and Russell.73
The goal in Denver of resuming clinical liver transplantation was reflected by a growing kennel population of long-surviving canine recipients (Fig. 34), none of whom were treated with more than a 4-month course of azathioprine8 or a few doses of ALG.71 In presenting the results of 143 canine liver replacements to the Society of University Surgeons in February 1965, I emphasized that
…Although the early recovery after liver homotransplantations has many hazards… the frequency and rapidity with which dogs could be withdrawn from immunosuppression without an ensuing fatal rejection is remarkable…. The consistency of this state of host-graft nonreactivity and the rapidity with which it seemed to develop exceeds that reported after canine renal homotransplantations.28
A year later, the French surgeon, Henri Garnier, reported (with Cordier) that a significant percentage of untreated outbred pig liver recipients did not reject their allografts.74 This observation promptly was confirmed and extended in England by Calne at Cambridge,75 Peacock and Terblanche in Bristol,76 and us.77 Calne and coworkers78 subsequently demonstrated that the tolerance self-induced by the liver extended to other tissues and organs from the liver donor, but not from third-party pigs.
Although our primary focus during the moratorium was on liver replacement, we also evaluated the ostensibly less radical auxiliary liver transplantation (Welch’s operation). After showing that rejection could be completely prevented in some dogs with high doses of azathioprine, it was proved that the acute atrophy of Welch’s auxiliary livers was caused by depriving the allografts of liver supporting constituents of splanchnic venous blood.64,79 At the time of transplantation 45 days previously, the diminutive rejection-free allograft shown in Fig. 35 had been the same size as the recipient’s normally vascularized liver.
These findings, which finalized the decision to proceed clinically with liver replacement, were not fully explained until the mid-1970s. Eventually, it was established that endogenous insulin was the most important “hepatotrophic” factor in normal portal blood.80,81 This was a decisive step in understanding the pathophysiology of Eck’s fistula (portacaval shunt).81,82
The potential pitfall of organ preservation remained. It would still be necessary to obtain livers from non-heart-beating donors. To help surmount this difficulty, we developed an ex vivo perfusion system in 1966 and 1967 that permitted reliable preservation of canine livers for as long as a day. The effort was spearheaded by a young naval surgeon, Larry Brettschneider.83 Now, it was time to try again.
When the liver program reopened in July, 1967, it had another weapon. This was an enthusiastic 2-year NIH-supported Fellow from Stockholm named Carl Groth (Fig. 36). Groth, who was determined to succeed and had no doubt that this would be possible, was a key member of both the donor and recipient surgical teams. He also took charge of the post-transplant management team in a continuous vigil that lasted for many months. By the end of the year, multiple examples of prolonged human liver recipient survival had been produced under triple drug immunosuppression: azathioprine, prednisone, and ALG84 (Fig. 37). The liver transplant beachhead was reinforced by the opening of Roy Calne’s clinical program in Cambridge, England, in February 1968.8 In 1969, a companion to the 1964 kidney transplant book was published, entitled Experience in Hepatic Transplantation,85 based on our first 25 liver replacements and 8 performed elsewhere (4 by Calne).
Transplantation of other extrarenal organs followed close behind the liver, using similar immunosuppression (Table 4). Hearts were successfully transplanted in 1968 in Cape Town by Barnard86 and in Palo Alto by Shumway.87 In 1969, the first prolonged survival after human lung88 and pancreas transplantation89 was accomplished in Ghent and Minneapolis, respectively. But transplantation of the extrarenal organs, and especially of the liver, remained controversial for another decade because of the high mortality. Swimming against the stream, the German and French teams of Rudolf Pichlmayr and Henri Bismuth entered the field in the early 1970s, as did the Dutch group of Rudi Krom later in the decade.
The unusual tolerogenicity of the hepatic allograft previously demonstrated in dogs and pigs was evident in human liver recipients of the 1970s. In 1995, 12 of our 42 patients (28%) surviving from this era already had been off all immunosuppression for 1 to 17 years.90 Since then, many of the remaining 30, who are now out to 33 years post-transplantation, also have stopped drugs and remain well.91,92 Such drug-free tolerance was almost unheard of with the other kinds of cadaveric organs.
Despite such encouraging notations, the widespread use of the liver and other extrarenal organs was precluded for another decade by the high mortality. The outlook for all organs improved after cyclosporine was introduced clinically in England in 1978 by Calne,93 and combined with prednisone in Denver 1 year later.94 Results further improved when tacrolimus was substituted for cyclosporine in the 1990s.95
The increases in liver recipient survival with the two new drugs were particularly striking,96,97 but less dramatic gains were recorded with the other organs. By the end of the 20th century, transplantation of the liver and all of the other vital organs had become an integral part of sophisticated medical practice in every developed country in the world. There was, however, one nagging disappointment.
With better immunosuppressants, drug-free liver recipients were expected to become common. Yet, the ultimate prize of tolerance was achieved less frequently than before, and only when liver transplantation was carried out during three time periods. The first two intervals were 1979-1980 and 1989-1990, just after cyclosporine96 and tacrolimus,95,97 respectively, were introduced clinically as monotherapy, adding prednisone only to treat rejection.
When prophylactic high doses of prednisone and other agents were added at the time of operation in an effort to further reduce the threat of acute rejection, drug-free tolerance was no longer seen. The third cluster of tolerant patients was produced at the turn of the century in Kyoto, Japan, where many pediatric recipients of partial livers from parental donors were successfully weaned from a steroid-sparing tacrolimus-based regimen of immunosuppression98 similar to that originally used in 1989-1990.97
Despite these “proof of feasibility” observations, the goal of deliberate induction of drug-free tolerance would remain out of reach until the mechanisms leading to this state were understood. Insight into the mechanisms began to emerge in 1992 with the study of 5 kidney and 25 liver recipients followed for up to 30 years post-transplantation.
The kidney recipients were survivors from the 1962-1967 era that has been the main focus of this review; most of the liver recipients had undergone transplantation in the 1970s. Samples were obtained of the allografts and of host blood, skin, lymph nodes, and, in some cases, other organs such as bowel and heart. In all 30 patients, including several who had long since stopped all immunosuppression, small numbers of donor hematolymphopoietic cells were demonstrated in blood or tissues,99,100 a condition known as donor leukocyte chimerism.
These findings prompted numerous studies in animal models,101-109 and ultimately led to a collaboration with Rolf Zinkernagel of Zurich (Fig. 38), whose 1996 Nobel prize was for studies of the adaptive immunity responsible for defense against noncytopathic microorganisms and for rejection.110 In the end, the chimerism-associated mechanisms of organ and bone marrow cell engraftment, and of acquired tolerance, were delineated.111,112 Only then was it possible to fully explain the meaning of the historical events that I have been describing. More importantly, a map for the future of organ transplantation could be drawn.
In essence, organ engraftment is a dynamic process initiated and governed by the migration and localization of the passenger leukocytes of bone marrow origin (ie, hematolymphopoietic cells) that are normal constituents of all tissues and organs. The donor leukocytes migrate preferentially to the host lymphoid organs104,113-115 (Fig. 39, left), where they induce an antigraft T-cell response. Without immunosuppression, this response proceeds to rejection in humans and in most animal models.
In a few experimental models of spontaneous tolerance, however, or under the appropriate conditions of immunosuppression, the immune response is too weak to eliminate the migratory donor leukocytes and regresses. Collapse of the response occurs when the activated host T-cell clone reaches a proliferation limit and is exhausted and deleted by incompletely understood mechanisms that include apoptosis109 (Fig. 40). Clonal exhaustion-deletion is the seminal mechanism of organ engraftment and of acquired tolerance.99,100,111,112,116 Because the exhaustion is never complete, maintenance of the variable deletional tolerance achieved at the outset depends on the persistence of residual stimulatory donor leukocytes.111,112 These were the cells discovered up to 30 years after transplantation in our pioneer organ recipients.
The important point is that allograft rejection and clonal exhaustion-deletion are products of the same donor-specific immune response. The close relation of rejection and deletional tolerance is exemplified by the results with pig liver transplantation in the mid-1960s (see earlier).74-78 The immune response reported in the majority of untreated pig liver recipients proceeds to rejection (Fig. 41A, ascending thick arrow). In a significant minority of experiments, however, the response is exhausted and deleted (Fig. 41A, decline of the thin-line curve). Unlike the liver, pig kidneys and other porcine organs fail to induce tolerance because they are endowed with a smaller quantity of passenger leukocytes.
In contrast to its unpredictable occurrence in untreated pigs, spontaneous tolerance invariably is induced by hepatic allografts in numerous inbred rodent models.103,117,118 This is not, however, an exclusive quality of the liver. Spontaneous tolerance also can be induced in experimental rodent models by the less generously leukocyte-endowed heart119,120 and kidney allografts,121,123 albeit less commonly.
In experimental models in which organ allograft rejection by the unmodified recipient is the normal outcome, the passenger leukocyte-induced response can be reduced enough to be deleted by treating the recipient prior to transplantation (Fig. 41B). That almost certainly is what Murray and colleagues40 and Hamburger and associates41 accomplished with pretransplant total body irradiation of their historic fraternal twin kidney recipients of 1959. Less drastic pretreatment is possible with thoracic duct drainage,124,125 total lymphoid irradiation,126 conventional antirejection drugs,57,125 and especially the antilymphoid antibody preparations.71-73,127,128 Prototypes of effective antilymphoid preparations that have been used for pretreatment include the polyclonal ALGs of the kind developed and introduced clinically during the liver transplant moratorium of 1964-1967,70 and the humanized monoclonal ALG alemtuzumab, which has been extensively evaluated clinically by Calne and colleagues.129,130
The antigraft response also can be brought into the deletable range by immunosuppression given after transplantation (Fig. 41C). This, though, is a double-edged sword. If antigen-specific immune activation is unduly depressed (Fig. 41D), the derivative event of exhaustion-deletion also is eroded. When immunosuppression is reduced later in an initially over-treated recipient, the undeleted donor-specific clone recovers along with the return of global immune reactivity.106,112 Graft survival is then dependent on permanent immunosuppression.
Two principles of tolerogenic immunosuppression derive from this paradigm.112 The first is the value of recipient pretreatment. The second is the importance of limiting post-transplant immunosuppression to the minimum needed to prevent irreversible immune damage to the transplanted organ. In this view, the primary role of immunosuppression is not to eliminate the antigraft response as has been commonly assumed, but rather to lower it into the deletable range.
Both therapeutic principles had been observed empirically in the early Denver renal transplant experience that resulted in a unique cohort of drug-free kidney recipients and prompted the first liver trial. These patients were pretreated with azathioprine, post-transplant immunosuppression with the same drug was kept minimal, and prednisone was added temporarily, only to treat rejection (Fig. 25). No comparable series of tolerant kidney recipients was compiled anywhere in the world during the next 40 years. What had changed?
First, pretreatment of kidney recipients was deemphasized or abandoned in December 1963 after infectious complications occurred in the preoperative period (Fig. 42). Second, high doses of prednisone were instituted at the time of transplantation because of the excessive loss of kidney allografts to uncontrollable rejection (Fig. 42). Although heavy prophylactic immunosuppression has dominated transplantation practice ever since (Fig. 41D), the superior tolerogenic qualities of hepatic allografts continued to result in a handful of drug-free tolerant liver recipients, but only in isolated periods when restrictive post-transplant immunosuppression was used.
With the insight into the mechanisms of engraftment that I have described, we have applied both therapeutic principles in more than 200 cases since July, 2001. Kidney, liver, pancreas, and intestinal recipients were pretreated with a single large dose of a potent ALG, and then administered conservative doses of tacrolimus after transplantation. No other immunosuppression was given unless specifically indicated for rejection (Fig. 43).
Half of these patients have been weaned after 4 months to only one, two, or three doses per week of tacrolimus. Many are expected to become drug free in their second post-transplant year. These results suggest that a high, if not absolute, degree of sustained donor-specific nonreactivity (ie, tolerance) can be expected after the transplantation of all organs. So, the next, and perhaps the most gratifying, phase of transplantation may already have begun. But, of course, that will be a story for future Congresses of the American College of Surgeons.
That concludes my remarks except for a postscript. In the early 1960s, Sir Peter Medawar, Nobel laureate and founder of the discipline of transplantation immunology, wrote the following:
Good scientists study the most important problems they think they can solve. It is, after all, their professional business to solve problems, not merely to grapple with them. The spectacle of a scientist locked in combat with the forces of ignorance is not an inspiring one if, in the outcome, the scientist is routed.131
Medawar was referring to the search for the “holy grail” of transplantation, the secret that would allow organ recipients to be rendered drug-free tolerant. Thousands of scientists joined the crusade, driven by the false premise that successful engraftment after organ transplantation occurred by fundamentally different mechanisms than the donor leukocyte chimerism-associated ones of bone marrow transplantation and of acquired tolerance. This assumption was not challenged for the next third of a century.
All the while, the historical observations that I have recounted pointed in a different direction, and in the end these observations were at least as important in arriving at the truth as the results from reductionist investigations. The lesson is clear. History is neither dull nor dead. It is a uniquely human survival tool, aiding those in the present by the ability to draw on the past to meet current needs, and to predict needs yet to come.
Presented at the American College of Surgeons, 88th Annual Clinical Congress, San Francisco, CA, October 2002.