Residual disease or rapidity of response to induction therapy is among the most powerful predictors of outcome in pediatric acute lymphoblastic leukemia (ALL).
Utilizing a multiparameter flow cytometric chemosensitivity assay (FCCA), we studied the relationship between in vitro drug sensitivity of diagnostic leukemic blasts from 30 children with ALL and rapidity of response to induction therapy. We also analyzed the in vitro drug sensitivity of de novo leukemic blasts among various clinical subsets.
Compared to rapid early responders (RER), slow early responders (SER) showed significantly greater in vitro drug resistance to dexamethasone (DEX) (P = 0.04) and prednisone (P = 0.05). Studies with all other drugs showed a non-significant trend toward higher in vitro drug resistance in the SER compared to the RER group. Risk group stratified analyses indicated that in vitro resistance to asparaginase (ASP), DEX, and vincristine (VCR) was each significantly associated with very high risk ALL. Additionally, significantly higher in vitro drug resistance to ASP and VCR was associated with unfavorable lymphoblast genetics and ultimate relapse.
Our data indicate that this FCCA is a potentially simple and rapid method to detect inherent resistance to initial ALL therapy very early in induction, thus allowing for treatment modification shortly thereafter.
The prognosis of acute lymphoblastic leukemia (ALL) has improved considerably with advancements in multidisciplinary therapy. However, 20% of patients relapse, and over 50% of these patients die of their disease. A number of clinical trials have demonstrated the prognostic importance of rapidity of cytoreduction during induction therapy for children with ALL [2-14]. Indeed, the Children’s Oncology Group (COG) recently reported minimal residual disease (MRD) at end of induction as the most powerful predictor of outcome by multivariate analysis. Treatment failure is complex and at least partially due to intrinsic or acquired cellular drug resistance. Curative treatment of ALL requires eradication of all leukemic clones, including those with inherent drug resistance. The current COG treatment approach relies upon end of induction MRD to fully define slow early responders (SERs) before intensifying therapy. However, it is possible that modifications of therapy early in induction will improve the outcome of patients with relatively resistant disease.
The degree to which each drug in induction contributes to lymphoblast cell death probably varies among individuals within different clinical and cytogenetic ALL subsets. Laboratory tests to measure lymphocyte cell death after exposure to cytotoxic agents were initially developed over 90 years ago. The first report was made by Pappenheimer in 1917, when he used trypan blue to detect non-viable lymphocytes after exposure of fresh thymic lymphocytes to toxic agents. Significant improvements in cell culture assays of cytotoxicity have been made since then, leading to the evolution of current assays [17-19]. Over the past 20 years, a number of studies from Europe and Japan measured in vitro drug resistance of diagnostic lymphoblasts and consistently demonstrated significant correlations of in vitro chemosensitivity results with outcome in pediatric ALL [20-24]. Based on these data, the German COALL (Cooperative Study Group for Childhood ALL) designed a prospective trial that incorporated in vitro drug resistance profiles into the treatment stratification of newly diagnosed childhood ALL [20, 23]. Whether in vitro drug sensitivity correlates with measures of in vivo MRD remains to be determined. Indeed, three European cooperative groups are currently studying this question in prospective clinical trials [25, 26].
Most of the studies mentioned above rely on the MTT (methyl-thiazol-tetrazolium) assay to determine chemosensitivity profiles of lymphoblasts in childhood ALL. With this technique, MTT is reduced to colored formazan crystals by living cells, and leukemia cell survival is calculated from optical density. However, this assay is relatively labor-intensive, requires 4 days of incubation, and cannot distinguish the chemosensitivity response of normal marrow lymphocytes from that of malignant lymphoblasts. Several recent reports suggest that techniques studying in vitro drug sensitivity using flow cytometry may be readily applicable to the study of childhood ALL [28-31].
We applied a high throughput, multiparameter flow cytometric chemosensitivity assay (FCCA) with a rapid 48-hour turnaround time to evaluate apoptosis induction in leukemic blasts exposed to standard induction drugs. The purpose of this study was two-fold: to assess the feasibility of this assay applied to blasts of ALL and to determine the relationship between a patient’s lymphoblast sensitivity to induction drugs in vitro and his/her rapidity of response to induction therapy. Additionally, we analyzed the in vitro drug response among clinical subsets. We report our initial results using this FCCA on leukemic blasts from 30 children with newly diagnosed ALL.
This study was approved by the Institutional Review Board of Oregon Health & Science University. We studied 38 clinical specimens derived from bone marrow aspirates of children with ALL diagnosed at a single institution and enrolled on Children’s Oncology Group (COG) clinical trials from 1997 to 2007. The diagnosis of ALL was based on morphology and immunophenotype by flow cytometry. Mononuclear cells had been isolated from diagnostic bone marrow and cryopreserved in liquid nitrogen with RPMI 1640 culture medium containing 20% heat-inactivated fetal calf serum (FCS) and 10% DMSO. Parental consent for sample storage and future research was obtained in all cases.
Clinical parameters available on all patients whose blasts were studied included gender, age, white blood cell count (WBC) at diagnosis, bone marrow karyotype, and day 7/8 and/or day 14/15 bone marrow status. The majority of patients also had testing with fluorescence in situ hybridization (FISH) for TEL/AML1, t(4;11), and t(9;22). MRD status at induction day 29 was determined by flow cytometry in the COG reference laboratory on approximately 1/3 of subjects.
Risk group stratification was based on National Cancer Institute criteria of age and WBC at diagnosis. Definition of very high risk ALL and categorization of bone marrow karyotype as favorable, unfavorable, or neither was according to current COG criteria. The COG definitions of rapid and slow early response have evolved over the past 12 years, differing among older and newer protocols. In the present study, patients were considered rapid early responders (RERs) if they achieved M1 status (< 5% blasts) by day 14/15 and, when available, if MRD was < 0.1% at day 29 of induction. All patients received three- or four-drug induction therapy with vincristine (VCR), L-asparaginase (ASP) or PEG-asparaginase, and prednisone (PRED) or dexamethasone (DEX) on CCG-1952, CCG-1991, and COG AALL0331, with the addition of daunomycin (DAUNO) on the CCG-1961 and COG AALL0232 protocols. All stored samples from patients with SER or VHR ALL were included in this study, while the RER patient samples were randomly chosen from among many stored specimens.
This FCCA was developed in our laboratory and described in detail previously [31]. Cryopreserved cells were thawed from liquid nitrogen, washed, and resuspended in RPMI 1640 supplemented with 10% fetal calf serum (FCS), 2 mM glutamine, 100 μg/ml streptomycin, and 100 μg/ml penicillin. Viability was determined by trypan blue dye exclusion. Cell concentrations were adjusted to 1.5–3.0 × 10⁶ viable cells/ml. 100 μl of cell suspension was added to each well of a 96-well microtiter plate containing 50 μl of vincristine, E. coli L-asparaginase, dexamethasone, daunomycin, or prednisone. Each drug concentration was tested in triplicate. Eight to eleven wells without drug exposure served as controls. To ensure a similar number of cells in control and drug-treated wells, the same cell suspension was used for each sample and the same volume pipetted. The culture plates were incubated at 37°C in a humidified atmosphere containing 95% air and 5% CO2 for 48 hours. Each of the five drugs was tested at three concentrations: the empirically derived cut-off concentration (EDCC), one-fifth the EDCC, and five times the EDCC. The EDCCs were adopted from previous studies of leukemic cells, based on the concentration producing the widest scatter of cell survival among the different samples in vitro: L-asparaginase (ASP) 10 U/ml, daunomycin (DAUNO) 0.5 μg/ml, dexamethasone (DEX) 1.4 μg/ml, prednisolone (PRED) 50 μg/ml, and vincristine (VCR) 0.5 μg/ml. At the end of the incubation period, the cells in each well were washed, resuspended in 25 μl of RPMI 1640 containing 10% FCS, and incubated on ice for 20 minutes with the monoclonal antibodies CD45-PerCP, CD19-APC, and CD10-PE (all BD Biosciences, San Diego, CA) at manufacturer-recommended antibody concentrations. Anti-CD19 was replaced with anti-CD3 for T-cell leukemia.
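The three-point dilution series described above lends itself to a simple tabulation. The sketch below (the `concentration_panel` helper is hypothetical, not part of the assay protocol) reproduces the 0.2×, 1×, and 5× EDCC panels from the values listed in the text:

```python
# Empirically derived cut-off concentrations (EDCC) from the text above.
EDCC = {
    "ASP":   10.0,   # U/ml
    "DAUNO": 0.5,    # ug/ml
    "DEX":   1.4,    # ug/ml
    "PRED":  50.0,   # ug/ml
    "VCR":   0.5,    # ug/ml
}

def concentration_panel(edcc):
    """Return the one-fifth, 1x, and 5x test concentrations for one drug."""
    return [edcc / 5.0, edcc, edcc * 5.0]

panels = {drug: concentration_panel(c) for drug, c in EDCC.items()}
# e.g. panels["ASP"] -> [2.0, 10.0, 50.0] U/ml
```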
Following the antibody incubation, cells were stained with Annexin V-FITC (BD Biosciences, San Diego, CA) per manufacturer recommendations and analyzed with a Becton Dickinson FACSCalibur flow cytometer using CellQuest Pro software. Leukemic blasts were identified based on immunophenotypic features. Viable cells showed no binding to Annexin V. The results are presented as a leukemic cell survival index (LCSI), defined as: LCSI = (mean number of viable cells in drug-treated cultures / mean number of viable cells in control cultures) × 100.
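The LCSI definition above amounts to a ratio of mean viable-cell counts. A minimal sketch (the `lcsi` function name and the well counts in the example are illustrative, not study data):

```python
from statistics import mean

def lcsi(treated_viable, control_viable):
    """Leukemic cell survival index: mean viable-cell count in
    drug-treated wells as a percentage of the mean viable-cell
    count in drug-free control wells."""
    return mean(treated_viable) / mean(control_viable) * 100.0

# Example: triplicate drug-treated wells vs. drug-free control wells
# (hypothetical event counts per well).
survival = lcsi([420, 450, 480], [900, 880, 920])  # -> 50.0
```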
LCSI was correlated with each patient’s previously determined rapidity of response in vivo. LCSI was also correlated with relapse status and various other prognostic factors. Statistical analyses were performed using SAS (Statistical Analysis System) software version 9.1. The overall effects of drug and concentration on LCSI, along with the interaction effect between drug and concentration, were assessed using a mixed effect model. The mixed effect model accounts for intra-subject correlation due to measurements taken from the same subject at different concentrations. The interaction effect indicates whether the drug effect depends on the specific concentration. The difference in LCSI between the various ALL subsets was assessed by fitting a mixed model for each drug. The models accounted for the effect of each subgroup and the interaction between subgroup effect and drug concentration. If the interaction effect was significant, indicating that the subgroup effect depends on concentration, the difference in LCSI between subgroups was assessed at each of the three concentrations. If the interaction effect was not significant, the difference in LCSI between subgroups was assessed over the three concentrations of the drug. A p-value less than 0.05 was considered significant.
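The analyses were run in SAS; purely as an illustration (not the authors' code), a comparable random-intercept mixed model, which accounts for the repeated concentration measurements within each patient, can be sketched in Python with statsmodels on synthetic data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic long-format data: LCSI for 10 hypothetical patients,
# each measured at the three concentrations (0.2x, 1x, 5x EDCC).
patients = np.repeat(np.arange(10), 3)
log_conc = np.tile(np.log([0.2, 1.0, 5.0]), 10)
lcsi_vals = (80.0 - 8.0 * log_conc                      # concentration effect
             + np.repeat(rng.normal(0, 10, 10), 3)      # patient random effect
             + rng.normal(0, 5, 30))                    # residual noise

df = pd.DataFrame({"patient": patients, "conc": log_conc, "LCSI": lcsi_vals})

# Random intercept per patient captures the intra-subject correlation
# described in the text.
model = smf.mixedlm("LCSI ~ conc", df, groups=df["patient"])
result = model.fit()
```

With real data, a drug-by-concentration interaction term would be added, as in the per-drug models described above.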
Cellular drug response was determined successfully by FCCA in 30 of 38 samples analyzed. Reasons for failure were poor cell viability, with less than 50% cell viability determined by trypan blue staining prior to testing (n = 2), or poor cell viability in drug-free control cultures after 48 hours of incubation (n = 6). Table I shows the characteristics of the patients with successful in vitro testing. As shown in Figure 1, marked variability was seen in the drug-induced cytotoxicity among the 30 cases of ALL tested. In addition, all drugs showed concentration-dependent cytotoxic effects.
We first investigated the relationship between in vitro drug sensitivity and in vivo rapidity of response to induction therapy, with results summarized in Table II. Leukemic cells from the diagnostic specimens of rapid early responders showed significantly higher in vitro sensitivity to glucocorticoids, either dexamethasone or prednisone. The mean LCSI for DEX was 1.5-fold higher in the SER compared to the RER group (P = 0.04). The mean LCSI for PRED was 1.4-fold higher in the SER compared to the RER group (P = 0.05). The mean LCSI for ASP, DAUNO, and VCR was not significantly different between the two groups (P > 0.05).
The in vitro drug sensitivities to ASP and DEX of leukemic cells from patients in different risk groups are shown in Figure 2A-E. Leukemic cells from patients with very high risk (VHR) factors showed significantly decreased in vitro sensitivity to ASP. The mean LCSI for ASP was significantly higher (P < 0.005) at all three concentrations tested in samples from the VHR group compared to the standard risk (SR) group, and in the high risk (HR) group compared to the SR group. The mean LCSI for VCR was significantly higher in blasts from VHR patients compared to those from the SR and HR subgroups (P < 0.05). The mean LCSI for DEX was significantly higher (P < 0.05) in the VHR group compared to the SR group only at the 1X and 5X concentrations.
The mean LCSI for DAUNO and PRED were not statistically different among the SR, HR, and VHR groups (P > 0.05). No significant differences were observed between the HR and VHR groups for any of the 5 drugs tested.
The in vitro drug sensitivity differences between those with favorable and unfavorable cytogenetics are outlined in Table III. Decreased in vitro sensitivities to ASP and VCR were seen in the leukemic cells from patients with unfavorable genetics compared to those with favorable genetics. The mean LCSI for ASP and VCR was 89% and 62%, respectively, in the group with unfavorable genetics. In contrast, the mean LCSI for ASP and VCR in the group with favorable genetics was 39% and 35% (P < 0.05). Such a difference was not detected with the other drugs tested (P > 0.05).
To examine whether there was any association between in vitro drug sensitivity and relapse, the mean LCSI for each drug was compared between patients who ultimately relapsed and those who remained in remission. The mean LCSI for both ASP and VCR was significantly higher (P = 0.05 and 0.01, respectively) in the leukemic cells from patients who ultimately relapsed (77% and 61%, respectively) than in blasts from those who remained in remission (53% and 35%, respectively). However, the mean LCSI for DAUNO, DEX, and PRED was not significantly different (P > 0.05) in the relapsed group (34%, 60%, and 41%, respectively) compared to patients remaining in remission (23%, 55%, and 38%, respectively). Among samples from patients with neutral cytogenetics, no difference in LCSI was observed for any drug tested between those who relapsed and those who did not, with P values between 0.2 and 0.9 (data not shown). Among samples from patients remaining in remission, comparison between those with favorable (n = 5) versus unfavorable (n = 4) genetics found the mean LCSI with asparaginase to be significantly lower in the favorable group (P = 0.02). No difference in LCSI was observed for the other 4 drugs tested, with P values between 0.1 and 0.9.
In the present study we used the FCCA to examine relationships between rapidity of response to induction therapy in de novo childhood ALL and in vitro leukemic cell sensitivity to standard induction drugs. We found that slow early response to induction therapy was associated with significantly increased lymphoblast survival after exposure to glucocorticoids in vitro. The importance of glucocorticoid resistance in vivo and in vitro as an independent prognostic factor in childhood ALL is well established [21,22,24,36,37]. The BFM (Berlin-Frankfurt-Münster) group has shown that the in vivo response to one week of prednisone is a strong prognostic variable. Multiple cooperative groups, utilizing either the colorimetric MTT or the fluorometric microculture cytotoxicity assay (FMCA), reported a correlation between in vitro cellular resistance to glucocorticoids and poor outcome in childhood ALL [21,22,24,36]. Indeed, our findings with the FCCA agree with the preliminary observations of others that in vitro resistance to glucocorticoids correlates with MRD during and after induction therapy [25,37,39]. In our study we found non-significant trends toward higher LCSI in SERs than in RERs for each of the other induction drugs tested. These trends, given our small sample size, are consistent with results obtained in larger studies using the MTT assay [22,24]. Perhaps LCSI comparisons with more patient samples in each SER and RER subgroup would reach statistical significance for non-glucocorticoid drugs.
Specific genetic abnormalities in de novo ALL correlate with outcome [40-44]. The t(9;22) and hypodiploidy that define VHR ALL make this subgroup biologically distinct from HR and SR ALL. Likewise, trisomies of chromosomes 4, 10, and 17 or expression of the TEL/AML1 gene fusion, found in > 40% of patients with SR ALL, impart a different cellular biology to the latter risk group. The risk stratification (VHR, HR, and SR) includes cytogenetic features of leukemic cells in addition to clinical risk factors (age, white count, and poor response to induction chemotherapy). Despite our small sample size, comparisons of mean LCSI among risk groups (VHR, HR, and SR) or cytogenetic groups showed the similar and striking result that leukemic cells from the VHR or unfavorable cytogenetic group were significantly more resistant to ASP. Such findings are consistent with previous studies showing that in vitro and in vivo resistance to ASP correlates with a relatively poor prognosis in ALL [24,45]. TEL/AML1-positive samples were > 10-fold more sensitive to ASP in vitro than TEL/AML1-negative samples, and hyperdiploid samples were even more sensitive. This is at least partially due to high expression of asparagine synthetase, which has been linked with L-asparaginase resistance in TEL/AML1-negative pediatric acute lymphoblastic leukemia. The mean LCSI among the different risk groups trended in the anticipated direction (VHR > HR > SR) for each drug studied, although not with statistical significance for each drug. The underlying mechanism for drug-specific differences among risk groups is not clear. There is intrinsic biologic overlap between HR and SR patients, given that the age cut-off of 10 years and WBC cut-off of 50,000/μl were derived from statistical calculations.
Multi-institutional groups have documented that in vitro drug sensitivities to PRED, VCR, ASP, ± DAUNO can identify ALL patients at risk for early relapse or refractory disease [20-22]. Based on the predictive value of PRED, VCR, and ASP cytotoxicity in vitro, the COALL-97 treatment protocol incorporated the drug profile score into the treatment stratification of patients with newly diagnosed ALL. Interestingly, even with our small number of events, our findings agree with the observation that early relapse is associated with resistance to VCR and ASP. Six of our 7 relapses occurred within 18 months of diagnosis. However, our study showed that PRED and DEX sensitivities correlated with rapidity of response during induction but not with relapse. The validity of our findings is limited by the small sample size. In addition, the biological features of leukemic cells are quite complex and heterogeneous among the 30 patients we studied. With a larger sample size, we may be able to show a correlation between steroid sensitivity and relapse among patients with defined biological features. To investigate further whether leukemic relapse is due to the expansion of drug-resistant leukemic cells, additional studies are needed to compare in vitro drug sensitivity profiles among leukemic cells at diagnosis, post induction, and at relapse.
The apoptosis-based FCCA used in this study is easy, relatively inexpensive, and efficient. It is a high throughput assay with a 48-hour turnaround time, allowing less time for spontaneous cell death than the 4-day incubation of the MTT assay. Our technique is suitable for all diagnostic ALL marrow samples regardless of the percentage of residual normal lymphocytes or hematopoietic cells, as the latter are gated out of the lymphoblast populations identified by flow cytometry. Moreover, the same flow cytometer and computer programs used in the clinical hematopathology laboratory can perform the LCSI calculations, as was the case in our study. This FCCA was previously applied in our laboratory to cryopreserved samples of B-cell chronic lymphocytic leukemia and demonstrated increased in vitro drug resistance among patients with a known poor prognostic cytogenetic abnormality. The FCCA is suitable for both fresh and cryopreserved leukemic samples, allowing extensive testing of a large panel of available drugs, especially for patients with very high clinical risk factors.
We recognize the preliminary nature of our results, as the validity of our findings is limited by the small sample size. To achieve 80% power would require an N of 45 in both the SER and RER subsets. Our results are also potentially compromised by an inconsistent definition of SER, which was based in 2/3 of cases on day 7/14/28 marrow morphology alone and in 1/3 of cases on MRD measurements as well. As with previous studies using the MTT or FMCA assays, we investigated each drug individually in the FCCA. Because of the small sample size, we did not develop a composite drug-resistance score. Nor did we more closely mimic in vivo treatment by exposing blasts simultaneously to all 3 or 4 induction drugs. Neither our study nor those reported with the MTT and FMCA assays accounts for potential drug synergism.
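The stated sample-size requirement can be sanity-checked with a normal-approximation power calculation for a two-group comparison; the effect size d = 0.6 used below is a hypothetical illustrative value, not an estimate from our data:

```python
from math import sqrt
from statistics import NormalDist

_N = NormalDist()

def power_two_sample(d, n_per_group, alpha=0.05):
    """Approximate power of a two-sided, two-sample z-test for a
    standardized mean difference d with n subjects per group
    (normal approximation, ignoring the smaller rejection tail)."""
    z_alpha = _N.inv_cdf(1.0 - alpha / 2.0)
    return _N.cdf(d * sqrt(n_per_group / 2.0) - z_alpha)

# With 45 patients per subset, a moderate standardized difference
# (d about 0.6, a hypothetical value) is detectable with roughly
# 80% power at alpha = 0.05.
p = power_two_sample(0.6, 45)
```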
In conclusion, the rapid FCCA used in this study holds promise as a simplified technique to measure in vitro drug resistance of blasts from children with ALL. The FCCA may be used to identify poor responders very early in induction and allow treatment modification shortly thereafter. Such changes in therapy might include the addition of agents like epratuzumab or rapamycin to induction regimens [48, 49]. COG recently reported the prognostic significance of blasts in peripheral blood at day 8 of induction. Studies to determine the correlation between LCSI and day 8 MRD may have important implications for risk stratification and treatment of childhood ALL. The combination of multiple factors, such as end induction MRD, NCI risk group, cytogenetic subsets, and day 8 MRD, offers the best prognostic information in childhood ALL to date. Results of the FCCA could theoretically be combined with these factors to provide even better prognostic information for each patient. Fresh clinical specimens at diagnosis typically do not present problems of cell viability; if specimens are handled correctly, almost 100% of fresh samples can be tested for in vitro drug sensitivity. The cell viability and testability of cryopreserved samples may be decreased if samples are not processed promptly or the cryopreservation process is not well controlled. As with the more labor-intensive MTT or FMCA techniques, the FCCA has potential for use in programs of cytotoxic drug development, for risk stratification of patients on clinical trials, and for guiding personalized treatment. In addition, the FCCA may prove useful in the study of genes associated with chemo-resistance and in the identification of agents to overcome this resistance.
We thank Fouad Attia and Kathrynn Mosley, senior research associates at Oregon Health & Science University, who assisted in data collection used in this study. Full salary support from the St. Baldrick’s Foundation provided by a two year St. Baldrick’s Pediatric Oncology Research Fellowship Scholarship Award to Dr. Faith Galderisi is gratefully acknowledged. Additionally, this publication was made possible with support from the Oregon Clinical and Translational Research Institute (OCTRI), a grant (UL1 RR024140) from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), and NIH Roadmap for Medical Research.