Our two recent attempts at baboon-to-human xenotransplantation failed.1–3 However, there were encouraging observations. First, although both donors were B virus carriers, the xenografts had no evidence of B virus infection during their posttransplant survival of 70 and 26 days. Second, there was little histopathologic evidence of humoral or cellular rejection of these livers. Yet their function was suboptimal. If rejection was not the explanation, why did the baboon liver cases fail?
The surgical techniques were adapted from hepatic allotransplantation.4 Although the baboon donors were large, their body weights were only 40% of the recipients', necessitating the so-called piggyback operation that leaves the recipient vena cava intact (Fig 1). This procedure, which was first described by Calne and Williams5 in 1968 and later popularized by Tzakis et al,6 requires more skill to perform than the standard operation. In our opinion, it is increasingly dangerous in proportion to the donor/recipient size disparity. Of course, this means that the baboon is not an ideal donor—even from a purely technical point of view—for adult humans.
Nevertheless, the procedures were initially satisfactory. Once in place, the livers regenerated to the optimal volume for recipient size in both cases. The silhouette in Fig 2 shows the original baboon liver size superimposed on the actual computed axial tomographic (CAT) scan at 12 days in patient 1. On that day, the liver appeared grossly normal at surgical re-exploration and biopsy. Histopathologically, the bromodeoxyuridine (BrDU) monoclonal antibody with which the biopsy sections were stained revealed multitudes of brown-stained (proliferating) hepatocytes and duct cells, with almost no infiltrating immunocytes.
The conventional lymphocytotoxic crossmatch of the recipient sera with their donor lymphocytes was positive initially but became negative after dithiothreitol (DTT) treatment (Table 1), indicating that the antibodies were largely IgM. The conventional crossmatches became negative postoperatively. Both patients had ABO-compatible donors: A to A in case 1 and B to B in case 2. Both were equivalently immunocompetent by in vitro testing. Estimated survival time without transplantation was less than 1 month in both cases.
These were the similarities. One difference that may have been important was that patient 1 had asymptomatic human immunodeficiency virus (HIV) infection (Table 2). In addition, he had undergone splenectomy 3 years previously after a motorcycle accident, whereas patient 2 did not undergo splenectomy until the fourth postoperative day. Another difference was that patient 1 was half the age of patient 2 and far less frail; the day before the operation, patient 2 developed deep hepatic coma and was placed on a ventilator.
Patient 1 awoke promptly from anesthesia and resumed diet and ambulation. Although he remained jaundice free for most of his 70 days of survival, the canalicular enzymes were high from the second week onward; alkaline phosphatase levels rose to over 10,000 IU. Serum transaminases were only modestly elevated perioperatively and at all subsequent times.2
When icterus finally developed 2 months after transplantation, it was ascribed to partial obstruction of the reconstructed bile duct, even though the biliary-enteric anastomosis appeared satisfactory by cholangiography (Fig 3). At autopsy at 70 days, the entire biliary tree was filled with inspissated bile, and most of the biliary ducts, which by this time had become bile lakes, were denuded of epithelium (Fig 4). This can be the end result of unrelieved obstruction, but the alternative explanation is that the epithelial damage was the primary event, not a derivative one.
In contrast to the first patient, patient 2 remained icteric and comatose postoperatively.3 On biopsy at 4 days, the xenograft showed the same cholestasis as the first, despite an unquestionably adequate biliary anastomosis. This patient's lowest bilirubin was 8 mg/dL on the fourth postoperative day, down from 17.3 mg/dL, but it increased thereafter to a terminal concentration of 28.3 mg/dL. As in the first patient, the jaundice was not particularly responsive to steroid boluses or increased maintenance doses of prednisone.
The occurrence of the same cholestasis in both patients raised the possibility that the baboon liver produces a lithogenic bile in the human environment. This has not been absolutely ruled out, but we doubt it. Equally perplexing in both patients was the inability of the baboon livers to maintain the postoperative serum albumin above 2 g/dL despite otherwise adequate synthetic function, including prothrombin time.2,3 These seemingly healthy baboons had been accepted as donors because of their large size, even though both were more than 15 years old and had hypoalbuminemia; this was not a "normal" finding, as was thought at the time. In a recent study of 25 ostensibly healthy baboons housed at the Southwest Foundation, from which our donors came, the mean serum albumin was 3.5 ± 0.7 (SD) g/dL (range, 1.8 to 5.1), not suggesting a genetically determined limit on albumin production.7 Similar results from a different baboon colony at Ohio State were recently reported.8
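The colony statistics above give a simple way to gauge how atypical a given albumin value is relative to ostensibly healthy baboons. A minimal sketch (the 2.0 g/dL input is illustrative only, taken from the postoperative ceiling mentioned in the text, not a measured donor value):

```python
# Colony statistics from the Southwest Foundation study cited above (ref 7).
MEAN_ALBUMIN_G_DL = 3.5  # mean serum albumin of 25 ostensibly healthy baboons
SD_ALBUMIN_G_DL = 0.7    # standard deviation

def albumin_z_score(value_g_dl: float) -> float:
    """Standard score of a serum albumin value against the colony mean."""
    return (value_g_dl - MEAN_ALBUMIN_G_DL) / SD_ALBUMIN_G_DL

# An illustrative value of 2.0 g/dL lies roughly 2.1 SD below the colony
# mean, consistent with the text's point that the donors' hypoalbuminemia
# was not a "normal" finding.
print(round(albumin_z_score(2.0), 2))  # -2.14
```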
A ruptured mycotic (aspergillus) intracerebral aneurysm was the immediate cause of death in case 1.2 In case 2, peritonitis secondary to an anastomotic leak at the jejunojejunostomy of the Roux-en-Y biliary reconstruction was listed as the official cause of death.3 However, both patients developed renal failure; in fact, patient 2 never made a drop of urine postoperatively. The most disquieting fact was that neither hepatic xenograft provided adequate function. The mystery was the disparity between the paucity of histopathologic abnormalities (which was very encouraging) and the discouraging (and unexplained) functional deficiencies of these transplants, which suggested incomplete control of xenograft rejection.
Nearly 30 years ago, six baboon kidney xenografts were transplanted to patients treated with azathioprine-prednisone immunosuppression; the organs functioned for 6 to 60 days.9 At the end, the baboon kidney xenografts showed fierce cellular rejection. However, the key finding was a presumably antibody-mediated (humoral) occlusive endothelialitis of the graft vessels that closed off much of the arterial supply. The consequent distal ischemia appeared responsible for patchy gangrene of the xenografts, interspersed between islands of still-functional parenchyma. Similar gross and histopathologic findings were reported more than 20 years later by Bailey et al10 after cardiac xenotransplantation under a cyclosporine (CyA)-based immunosuppressive regimen (the Baby Fae case).
A four-drug immunosuppressive cocktail was used, consisting of FK 506 given IV or orally, prednisone, prostaglandin E1, and cyclophosphamide. The cyclophosphamide, like the other three drugs, was started IV and continued orally in doses that were not myelotoxic as judged by a normal white blood cell count.2,3 The striking value of cyclophosphamide and other antimetabolites when used in combination with FK 506 had been demonstrated in hamster-to-rat heart and liver transplantation by Murase et al11 and in the same heart model with CyA by Hasan et al.12 In her rat liver xenograft recipients, Murase demonstrated the cell migration and systemic microchimerism1 that we believe is associated with, and necessary for, both allograft and xenograft acceptance, and is the first stage of tolerance induction.13–15 Valdivia et al16 showed that the hamster liver graft itself becomes a genetic composite, in the same way as allografts do. In our patients, striking chimerism also was evident.2,3 Baboon DNA was found by polymerase chain reaction (PCR) in essentially all tissues retrieved at autopsy at 70 days from patient 1. In addition to the liver, patient 2 was given a large dose of unpurged bone marrow cells (3 × 108/kg body weight) perioperatively. He also had mixed xenogeneic chimerism at all times until death.
Only one of the seven biopsies obtained from patient 1 (on day 12) showed a mild focal cellular rejection by the conventional criteria used for hepatic allografts. On day 64 there was a mild but diffuse increase in T (CD3+) and NK (Leu-7+) cells in the sinusoids and septal bile ducts of patient 1, but the findings were insufficient for an unequivocal diagnosis of rejection. There was some centrilobular hepatocyte dropout. This was the worst-looking biopsy in either case by criteria of cellular rejection.
No definite evidence of cellular rejection was seen in any of the seven biopsy samples taken from patient 2 over a 26-day period.
The xenografts of both patients, out to 70 and 26 postoperative days, were entirely free of the arteritis that has been associated with vascular rejection in all previous baboon-to-human kidney or heart grafts. Nevertheless, sludging and polymorphonuclear leukocytes were seen in the sinusoids of the xenografts immediately after reperfusion, compatible with the diagnosis of an aborted hyperacute rejection.
Total complement was depleted for most of the critical first 2 weeks, and complement components C3, C4, and C5 became undetectable. During this time, circulating immune complexes appeared (Table 3). After 10 days, the complement system settled down, but irreversible damage may already have been done.3 This complement evolution was very similar to that reported in Paris last year by Manez et al17 in recipients of allografts transplanted across a positive lymphocytotoxic crossmatch.
Although these baboon liver xenografts had little evidence of cellular or vascular rejection, they exhibited a very fine microsteatosis on their first biopsies that became obvious within a few days, particularly in case 2 (Fig 5). This finding has been reported in cases of allotransplantation with inexplicable primary hepatic nonfunction. Although these findings receded, the microsteatosis may have been a sublethal injury that precluded long-term success in either case.
During this early phase, both IgM and IgG were bound in the grafts. In both patients, the immunoglobulins subsequently largely disappeared from the graft tissues, except for IgG, which remained positive throughout.
We believe that these livers were acutely damaged by an incomplete version of a form of rejection that we described in 1964 in ABO-incompatible kidneys18 and, with Terasaki et al19 a year later, in kidney allografts transplanted across a positive lymphocytotoxic crossmatch. These were the first-ever descriptions of hyperacute kidney rejection associated with preformed antigraft antibodies. They were followed in 1966 by the report of Kissmeyer-Nielsen et al.20 The mind-set created by these three papers, that hyperacute rejection is always precipitated by antigraft antibodies, quickly became one of the sacred cows of transplantation immunology.
Several years later, joined by Frank J. Dixon of the Scripps Institute (La Jolla) in a revolt against this sacred cow, we defined hyperacute kidney rejection as a complement activation syndrome with mechanisms analogous to the Shwartzman and local Arthus reactions.21,22 We pointed out that although hyperacute allograft rejection usually was associated with antigraft antibodies, this was not an absolute requirement. It was a heretical statement at the time.
However, the distinction we had made between hyperacute rejection with and without preformed antibodies was merely the difference between the classical pathway of complement activation, in which the first steps are antibody dependent, and the alternative pathway, which requires neither an antibody trigger nor the participation of complement components C1, C2, and C4. It always has seemed to us that these hyperacute allograft rejection syndromes, with or without preformed antigraft antibodies, are not fundamentally different from those seen after xenotransplantation of organs between genetically diverse species.1,23
The complement pathogenicity derives from the cleavage products of C3 and C5. The harmful consequences have been effectively mitigated (but not eliminated) with cobra venom factor (which depletes C3 and C5) and with soluble recombinant complement receptor type 1, which binds the cleavage fragments of C3 and C5, preventing amplification through C3b. In addition, binding of the anaphylatoxins C3a and C5a prevents another mechanism of complement injury: activation of mast cells, polymorphonuclear leukocytes, and other sources of soluble inflammatory mediators such as platelet-activating factor.24,25 Thus, cobra venom factor and the soluble complement receptor impede both the classical and alternative cascades by different mechanisms. By shutting off both pathways, such agents could interdict or ameliorate the Arthus reaction, the Shwartzman reaction, and neutrophil-mediated tissue injury.
Perhaps the most interesting anticomplement drug is a sesquiterpene compound called K76, which is produced by a species of fungi imperfecti found in the soil of one of the Okinawan islands. This drug blocks the C5 step of complement activation and also accelerates the decay of C5b, just proximal to the formation of the membrane attack complex (MAC) formed by complement components C5 to C9 in both the classical and alternative pathways (Fig 6). Our attention was drawn to K76 by the article by Miyagawa et al26 in the April 1993 issue of Transplantation, which actually was a negative report. Intraperitoneal K76 was described as having no effect on the hyperacute rejection of guinea pig hearts transplanted to rats (Fig 7). Another drug called FUT (a synthetic inhibitor of serine protease) had a small therapeutic effect by itself (third and fourth bars, Fig 7). K76 and FUT together allowed heart graft survival for 100 minutes (fifth bar, Fig 7).
Realizing that the results of the second and fifth groups could be internally inconsistent, we obtained a small supply of K76. When Murase et al11 gave the same dose (200 mg/kg) of K76 to rats, but IV instead of intraperitoneally, a single dose increased the median survival of guinea pig hearts in rats from 8 minutes to more than 8 hours, and in one experiment to more than a day. The astonishing potency of IV K76 by itself, or in combination with FUT, has also been shown in the very difficult pig-to-dog kidney xenograft model, with functioning graft survival out to 8 hours, during which 200 mL/h of urine was produced. This is a truly extraordinary achievement with this most difficult of all large-animal models.
Of course, control of the complement pathogenicity for a few hours with a drug like K76 does not mean anything per se. However, this could be the missing piece in a treatment recipe for clinical use. In this context, the anticomplement drug is envisioned only as an adjunct to the drug cocktail already shown to be highly effective in our clinical cases but deficient in that it did not deal with the very earliest mechanism of complement injury. Experiments to verify this hypothesis with various cocktail regimens are under way in our laboratory (by Murase, Todo, Tzakis, and others) in several small and large animal models.
The need to prevent complement activation may be short term, particularly if the transplantation is of the liver, which is the principal source of the body's complement and the sole source of most complement components such as C3. In this connection, a paper by Valdivia et al27 at this meeting has tremendous potential importance because it shows the species restriction of complement.
In their investigation, the combination of the hamster liver and its complement plus the rat recipient equaled a rat swimming in hamster complement (Fig 8). They showed how, in this new complement environment, hyperacute rejection of stable liver and heart xenografts could be precipitated with just 1 mL of IV antihamster rat serum (Fig 9). This was a specific effect of the rat complement rather than of the rat anti-hamster antibody infused into the rat recipient. The hyperacute rejection could be completely avoided by simply heating the rat serum at 56°C for 30 minutes (Fig 9), which retained the rat anti-hamster antibodies but inactivated the rat complement. The demonstration of species restriction of complement is a fundamental discovery that undoubtedly will be exploited to facilitate xenograft acceptance.
This mechanism might explain the terminal course of our second recipient. This patient had a normal liver biopsy on days 24 and 25, when he was explored for an enteric fistula, yet less than 1 day later the graft was found to be necrotic. What had happened? One possible explanation is that the human blood and blood products infused during these emergency operations contained human complement and anti-baboon antibodies. It requires no imagination to suggest that we might have accidentally performed Valdivia et al's27 experiment in the operating room. The need to control the complement that is given to liver xenograft recipients will never again be very far from our minds. I have been told by our blood bankers that preparing complement-free blood constituents will be quite feasible.
This presentation is too complex for easy summarization. However, the impression I want to leave is that the xenograft barrier may be more vulnerable than most people realize at present. More than ever, cracking the complement shell that guards what hides inside, whether that shell is raised via the classical or the alternative pathway, does not seem unrealistic. Once entry is achieved, it is clear that conventional cellular and vascular rejection can be controlled with modern drug combinations. The complement reactions are the same as those that can abruptly destroy allografts. Now, the prospect of dealing with these problems and with subsequent classical rejection by very straightforward strategies seems closer than ever. An additional possibility is the production of transgenic animals whose organs carry complement regulatory proteins (such as decay accelerating factor [DAF] and CD59).
Supported by grant DK 29961 from the National Institutes of Health, Bethesda, Maryland.