Similar Literature
20 similar documents were found.
1.
BACKGROUND: Increased expression of major histocompatibility complex class II (MHC-II) antigen occurs during cardiac allograft rejection. We tested the hypotheses that (1) radiolabeled antibody to MHC-II antigen allows detection of cardiac allograft rejection using nuclear imaging techniques and (2) uptake of radiolabeled antibody to MHC-II antigen correlates with severity of rejection. METHODS AND RESULTS: Thirteen beagles with cervical cardiac allografts were studied for 64+/-23 days by use of myocardial biopsy and in vivo imaging. Uptake of radiolabeled (131I [n=2], 123I [n=1], or 111In [n=10]) antibody to MHC-II increased over baseline in 7 animals that developed histological evidence of progressively worsening allograft rejection (group A), from 72.2+/-46.1 to 176.8+/-102.0 counts/pixel/mCi (P<.009). In 4 beagles without progressively worsening allograft rejection (group B), uptake was unchanged during follow-up (74.4+/-43.8 and 60.2+/-37.4 counts/pixel/mCi; P=NS). In animals studied with 111In-labeled antibody, uptake increased from 102.9+/-23.1 at baseline to 233.2+/-82.7 counts/pixel/mCi at follow-up in group A animals (P=.036), with no significant change in group B (91.1+/-34.9 and 75.9+/-24.9 counts/pixel/mCi; P=NS). Uptake of 111In-labeled antibody was 107.5+/-35.7, 135.9+/-70.8, and 307.8+/-90.1 counts/pixel/mCi in biopsy samples showing evidence of mild, moderate, and severe rejection, respectively (P=.001). Biopsy samples showing mild, moderate, and intense MHC-II expression had antibody uptakes of 92.6+/-36.3, 158.5+/-54.7, and 307.8+/-90.1 counts/pixel/mCi, respectively (P=.00004). CONCLUSIONS: Radiolabeled monoclonal antibodies to MHC-II antigen can detect cardiac allograft rejection in this large-mammal model of cardiac transplantation, and this technique may have a potential role in the detection of rejection in patients after cardiac transplantation.

2.
BACKGROUND: Mycophenolate mofetil reduces episodes of biopsy-proven acute cellular rejection or treatment failure in the first year after kidney transplantation; however, limited data exist regarding its efficacy after lung transplantation. METHODS: In a 2-center, nonrandomized concurrent cohort study (level III evidence), we analyzed the incidence of biopsy-proven acute cellular rejection (International Society for Heart and Lung Transplantation grade > or =A2) and the decrement in pulmonary function during the first 12 months after successful lung transplantation. All patients received induction immunosuppression with antithymocyte globulin (< or =5 days' duration), cyclosporine and prednisone, in addition to either mycophenolate mofetil (2.0 g/d) [n=11] or azathioprine (1 to 2 mg/kg per day) [n=11]. RESULTS: During the first 12 months after lung transplantation, the mycophenolate mofetil group experienced significantly fewer episodes of acute cellular rejection than the azathioprine group (0.26+/-0.34 vs 0.72+/-0.43 episodes/100 patient-days [mean+/-SD], p < 0.01; 95% CI for the difference=0.126 to 0.813). The change in forced expiratory volume in 1 second (deltaFEV1, liters) between the 3rd and 12th months after lung transplantation was analyzed for the two treatment groups. For this interval, deltaFEV1 for the mycophenolate mofetil group was +0.158+/-0.497 L vs -0.281+/-0.406 L for the azathioprine group (p < 0.05; 95% CI for the difference=+0.0356 to 0.843). During the first year, there was 1 death in each group attributed to bronchiolitis obliterans syndrome with concurrent pneumonia. There were no differences in the incidence of cytomegalovirus or bacterial infections between the treatment groups; however, a higher prevalence of Aspergillus sp. airway colonization in bronchoalveolar lavage fluid was observed for the mycophenolate mofetil group (p < 0.05). The prevalence of bronchiolitis obliterans syndrome at 12 months was 36% for the azathioprine group vs 18% for the mycophenolate mofetil group (p=NS). CONCLUSIONS: Our preliminary experience with mycophenolate mofetil after lung transplantation suggests a decreased incidence of biopsy-proven acute cellular rejection. Furthermore, the smaller decline in FEV1 after 12 months may suggest a reduced incidence or delayed onset of bronchiolitis obliterans syndrome. Prospective randomized trials with low beta error (level I evidence) should be performed to assess the efficacy of mycophenolate mofetil vis-à-vis acute allograft rejection and bronchiolitis obliterans syndrome.

3.
BACKGROUND: Since the introduction of cyclosporine (CsA), 1-year renal allograft survival has improved, but concern persists about the long-term adverse effects of CsA, especially with respect to renal function and blood pressure. This randomized controlled trial was set up to establish whether withdrawal of CsA would alter long-term outcome. METHODS: Adult patients who, at 1 year after renal transplantation, had a stable serum creatinine of less than 300 micromol/L and who had not had acute rejection within the last 6 months were eligible for entry. Patients were randomized either to continue on CsA (n=114) or to stop CsA and start azathioprine (Aza, n=102). All patients remained on prednisolone. Median follow-up was 93 months after transplantation (range: 52-133 months). RESULTS: There was no significant difference in actuarial 10-year patient or graft survival (Kaplan-Meier), despite an increased incidence of acute rejection within the first few months after conversion. Median serum creatinine was lower in the Aza group (Aza: 119 micromol/L; CsA: 153 micromol/L at 5 years after randomization, P=0.0002). The requirement for antihypertensive treatment was also reduced after conversion to Aza; 75% of patients required antihypertensive treatment at the start of the study, decreasing to 55% from 1 year after randomization in the Aza group and increasing to >80% in the CsA group (55% [Aza] and 84% [CsA] at 5 years after randomization, P<0.005). CONCLUSIONS: Conversion from CsA to Aza at 1 year after renal transplantation results in improvement in both blood pressure control and renal allograft function, and is not associated with significant adverse effects on long-term patient or graft survival.
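The survival comparison above rests on actuarial (Kaplan-Meier) estimates for the two arms. As a minimal sketch of how such a comparison is typically computed, the snippet below uses the third-party lifelines package on made-up follow-up data; the arrays are hypothetical placeholders, not the trial's data.

```python
# Minimal sketch of a Kaplan-Meier comparison of two treatment arms.
# Follow-up times and event flags below are hypothetical placeholders.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Follow-up time (months) and event indicator (1 = graft failure, 0 = censored)
time_csa = np.array([12, 40, 60, 93, 120])
event_csa = np.array([0, 1, 0, 1, 0])
time_aza = np.array([20, 55, 70, 100, 133])
event_aza = np.array([0, 0, 1, 0, 0])

kmf = KaplanMeierFitter()
kmf.fit(time_csa, event_csa, label="continue CsA")
print(kmf.survival_function_)  # actuarial survival estimates over time

kmf.fit(time_aza, event_aza, label="convert to Aza")
print(kmf.survival_function_)

# Two-sample log-rank test for a difference between the survival curves
result = logrank_test(time_csa, time_aza,
                      event_observed_A=event_csa, event_observed_B=event_aza)
print(result.p_value)
```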

4.
BACKGROUND: Heart transplantation (HT) as a therapeutic option for end-stage chronic Chagas' heart disease (CCHD) is controversial. Reactivation of Trypanosoma cruzi infection and recurrence of the disease in the allograft are likely to occur. Furthermore, active myocarditis has been reported to predispose patients to an increased incidence and severity of rejection. METHODS AND RESULTS: We prospectively investigated the long-term follow-up of 10 patients with CCHD who underwent HT. Immunosuppression was based on cyclosporine A and azathioprine. T cruzi reactivation was prevented with benznidazole. Besides allograft rejection surveillance, T cruzi infection was monitored through blood tests, myocardial biopsies, and serological tests. Over a mean follow-up period of 34 +/- 38 months (range, 73 to 124 months), 7 patients are alive and in NYHA functional class I. Survival was 78% at 2 years and 65% at 10 years. Rejection was less frequent in chagasic than in age- and sex-matched control patients (mean +/- SD, 1.60 +/- 1.26 versus 5.70 +/- 1.89 episodes per patient, respectively; P = .0001); decreased severity of rejection was also observed (P = .006). T cruzi parasitemias detected on three occasions were successfully treated with benznidazole. There were no signs of recurrence of the disease in the allograft. CONCLUSIONS: These results suggest an important role for HT in the treatment of CCHD. There was a low frequency of T cruzi infection reactivation and no signs of recurrence of the disease in the allograft. The surprisingly decreased rejection incidence and severity require further studies for elucidation.

5.
BACKGROUND: The risk for rejection is highest early, but graft rejection requiring intensified immunosuppression may be present even late after transplantation. Nonetheless, a considerable number of patients remain free of rejection requiring intensified immunosuppression (Rej) late after transplantation. Therefore, we tried to identify patients who do not need endomyocardial biopsies > or = 2 years after transplantation and those who may benefit from long-term follow-up with routine endomyocardial biopsies. METHODS: A total of 112 patients (age 45+/-12 years) had follow-up with regular endomyocardial biopsies for > or = 3 years. A total of 4194 endomyocardial biopsies were performed (1364 > or = 2 years after transplantation). Biopsies were divided into three categories: rejection score=0, Texas 0-2 or International Society for Heart and Lung Transplantation (ISHLT) 0 or 1A; rejection score=1, Texas 3-4 or ISHLT 1B or 2; rejection score=2, Texas > or = 5 or ISHLT > or = 3A. RESULTS: During the third and subsequent posttransplantation years, 31 of 112 (28%) patients had > or = 1 further Rej (51 episodes in total). Independent predictors identifying patients with Rej in multivariate analysis were age [odds ratio (OR)=0.96 per year, P<0.05], the sum of rejection scores (OR=1.07 per score point, P<0.005) and the mean cyclosporine level in the first 2 years (OR=1.07 per % of upper therapeutic range, P<0.01). Fifty-eight (52%) patients with age >25 years, sum of rejection scores < or = 17, and mean cyclosporine level <90th percentile during the first 2 years would not have needed biopsies in the third and subsequent years, whereas the other 48% had a 54% risk of developing further Rej. In addition to predictors identifying patients with rejection, time after transplantation (OR=0.73 per year, P<0.005), cyclosporine level below the therapeutic range (OR=2.15, P<0.05), and reduction of prednisone (OR=2.64, P<0.05) were independent predictors at each endomyocardial biopsy. CONCLUSIONS: The risk of Rej remained considerably high in approximately one third of our patients late after transplantation. In these patients, further surveillance biopsies appear justified, whereas half of the patients had no risk of Rej as long as immunosuppressive therapy was sufficient.
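The conclusion above amounts to a simple stratification rule: recipients older than 25 years with a rejection-score sum of 17 or less and a mean cyclosporine level below the 90th percentile during the first 2 years could forgo surveillance biopsies from the third year on. A minimal sketch encoding that rule follows; the thresholds are taken from the abstract, but the function and its inputs are illustrative only, not part of the original study.

```python
# Minimal sketch of the late-biopsy stratification rule described above.
# Thresholds come from the abstract; the function itself is illustrative.
def needs_late_surveillance_biopsy(age_years: float,
                                   rejection_score_sum: float,
                                   mean_csa_percentile_first_2y: float) -> bool:
    """Return True if the patient falls outside the low-risk group, i.e.
    surveillance biopsies from the third year on would still be advised."""
    low_risk = (age_years > 25
                and rejection_score_sum <= 17
                and mean_csa_percentile_first_2y < 90)
    return not low_risk

# Example: a 52-year-old with a rejection-score sum of 12 and cyclosporine
# levels at the 60th percentile falls into the low-risk group.
print(needs_late_surveillance_biopsy(52, 12, 60))  # False
```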

6.
BACKGROUND: There is evidence that inducible nitric oxide (NO) may be directly related to the process of allograft rejection. Because of its strong pulmonary vasodilatory activity, inhaled NO (INO) has recently been used as a therapeutic option for allograft dysfunction after lung transplantation. The actions of inducible NO and inhaled NO thus appear contradictory with respect to preserving posttransplantation pulmonary allograft function, and INO given to lung transplant recipients might actually enhance acute allograft rejection. We studied the effect of INO on acute allograft rejection in a rat pulmonary allograft model. METHODS: A total of 24 left lung allotransplantations were performed from Lewis donors into F344 recipients. Animals were divided into two groups and inhaled either room air alone or 20 ppm NO with room air in a closed chamber immediately after transplantation until the rats were killed on day 7 or 14. During observation, NO uptake was monitored by measuring serum NO2-/NO3- levels. Acute rejection was evaluated by use of a semiquantitative radiographic scoring method (aeration score: 0 to 6, opaque to normal appearance) and a rejection score (0 to 4, no sign of rejection to diffuse mononuclear infiltration). RESULTS: Markedly elevated serum NO2-/NO3- levels were observed in the NO inhalation group compared with levels in the normal air inhalation control group (110.8 +/- 25.3 vs 16.3 +/- 4.0 micromol/L on day 7, p < 0.01; 107.0 +/- 30.9 vs 16.8 +/- 4.8 micromol/L on day 14, p < 0.01). However, no effect of INO on acute rejection was found histologically or radiographically. CONCLUSION: The effect of INO on acute rejection is likely so minimal as not to be clinically relevant.

7.
When loss of graft function occurs more than six months after transplantation, allograft nephrectomy is not routinely performed at the time of graft failure. It is usually performed only in those patients who subsequently develop specific complications. However, little is known about the characteristics that make patients more likely to require allograft nephrectomy. The purpose of our study was to identify risk factors for the subsequent need for allograft nephrectomy in patients with graft failure occurring more than 6 months after transplantation. Forty-one patients were studied. Inclusion criteria were: loss of graft function > or = 6 months after transplantation, resumption of dialysis and initiation of weaning from immunosuppression. Thirty patients were treated with cyclosporine + prednisone +/- azathioprine and 11 with azathioprine + prednisone. Mean follow-up time was 17.8 months, ranging from 6 months to 6.1 years. Recipient age, sex and race, original renal disease, donor, donor source (cadaveric vs living related), HLA compatibility, levels of panel reactive antibodies, occurrence of initial delayed graft function, causes of graft failure and tapering of immunosuppression were similar in patients with and without allograft nephrectomy. Using univariate analysis, allograft nephrectomy was found to be significantly more frequent in patients with a history of 2 or more episodes of acute rejection than in patients with no rejection episode: 83% vs 30% (p = 0.03). In addition, allograft nephrectomy was found to be significantly more frequent if the immunosuppressive regimen included cyclosporine (62% vs 27.3%; p = 0.04). Using multivariate analysis, however, the number of previous episodes of rejection was found to be the only significant predictor of allograft nephrectomy. None of the other variables considered in the multivariate analysis, including the type of immunosuppressive therapy, was identified as a significant predictor of the need to perform allograft nephrectomy. In summary, the need for late allograft nephrectomy was correlated with the number of previous episodes of acute rejection. Patients with a history of numerous rejection episodes should thus be considered more likely to require allograft nephrectomy once immunosuppression is withdrawn. Possible interventions to reduce or prevent the need for nephrectomy include more gradual tapering of immunosuppression at the time of graft failure or indefinite low-dose immunosuppressive therapy.

8.
BACKGROUND: 31P-magnetic resonance spectroscopy (31P-MRS) can be used as a non-invasive tool for measuring the relative intracellular concentrations of several phosphorus metabolites in different organs. Various pathological conditions are characterized by different metabolic patterns. We studied the value of 31P-MRS after renal transplantation with both an uneventful and a clinically complicated course. METHODS: We determined the relative concentrations of phosphate-containing metabolites in human renal allografts with 31P-MRS (1.5 Tesla) in the first few weeks after transplantation; 18 patients with an uneventful clinical course and 10 patients who required dialysis after transplantation were examined. Six patients with stable allograft function 2-3 months after transplantation served as controls. RESULTS: In patients with primary allograft function, we found a significant correlation between the phosphomonoester/phosphodiester (PME/PDE) ratio and the time after transplantation (r = 0.66, P < 0.01), but no correlation between the nucleoside triphosphate (beta-NTP) concentration and the time course (r = -0.11). In patients with primary or early allograft dysfunction caused by histologically proven rejection (n=5), we found a low beta-NTP compared to patients with an uncomplicated clinical course (0.09+/-0.01 vs 0.15+/-0.03), but no differences in the PME/PDE ratio (0.73+/-0.21 vs 0.80+/-0.21). In contrast, the PME/PDE ratio was lowered in three patients with delayed graft function caused by acute tubular necrosis (0.45+/-0.07 vs 0.80+/-0.21), but the beta-NTP concentration was not reduced (0.15+/-0.003 vs 0.15+/-0.03). The 31P-MR spectra of two patients with cyclosporin A toxicity were not altered compared to the controls. CONCLUSIONS: 31P-MRS can be used in patients in the early period after renal transplantation. A significant correlation between the PME/PDE ratio and the time course, but no change in the beta-NTP concentration, was found in patients with primary allograft function in the first 4 weeks after renal transplantation. Different patterns of 31P-MR spectra were observed depending on the different causes of primary and early transplant dysfunction.

9.
BACKGROUND: The role of monocytes and neutrophils is crucial during acute allograft rejection. They have the capacity to generate toxic reactive oxygen intermediates in response to specific agonists that may act as tissue-destructive molecules. We examined the possibility of reactive intermediate-mediated tissue injury in acute lung allograft rejection, as well as the effect of superoxide dismutase. METHODS: Allogeneic (Brown Norway to F344) or syngeneic (F344 to F344) rat left-lung transplantation was performed. Generation of reactive oxygen intermediates in peripheral blood was evaluated by the method of luminol-dependent chemiluminescence. Cell membrane phospholipid peroxidation in the graft was measured as malondialdehyde concentration. A third group of animals with allografts received bovine erythrocyte superoxide dismutase (5,000 U/kg intravenously every 12 hours after transplantation). RESULTS: The chemiluminescence response in allograft recipients relative to normal F344 rats was elevated on postoperative day 1 (257%), decreased slightly on day 3 (156%), and was elevated again on day 7 (560%) as the process of rejection progressed. Allograft tissue malondialdehyde levels (248.37 +/- 112.35 nM/whole lung, n = 6; p < 0.05 by Student's t test) were higher than isograft levels (139.29 +/- 35.93 nM/whole lung, n = 6) on day 7. Superoxide dismutase treatment significantly ameliorated the histologic degree of rejection on day 7. CONCLUSIONS: These results demonstrate the tissue-destructive activity of reactive oxygen intermediates during lung allograft rejection. Scavenging free radicals may therefore be a useful therapeutic modality in the management of acute lung allograft rejection.

10.
BACKGROUND: A study was performed by 17 different U.S. liver transplantation centers to determine the safety and efficacy of conversion from cyclosporine to tacrolimus for chronic allograft rejection. METHODS: Ninety-one patients were converted to tacrolimus a mean of 319 days after liver transplantation. The indication for conversion was ongoing chronic rejection confirmed by biochemical and histologic criteria. Patients were followed for a mean of 251 days until the end of the study. RESULTS: Sixty-four patients (70.3%) were alive with their initial hepatic allograft at the conclusion of the study period and were defined as the responder group. Twenty-seven patients (29.7%) failed to respond to treatment, and 20 of them required a second liver graft. The actuarial graft survival for the total patient group was 69.9% and 48.5% at 1 and 2 years, respectively. The actuarial patient survival at 1 and 2 years was 84.4% and 81.2%, respectively. Two significant positive prognostic factors were identified. Patients with a total bilirubin of < or = 10 mg/dl at the time of conversion had a significantly better graft and patient survival than patients converted with a total bilirubin > 10 mg/dl (P=0.00002 and P=0.00125, respectively). The time between liver transplantation and conversion also affected graft and patient survival. Patients converted to tacrolimus < or = 90 days after transplantation had a 1-year actuarial graft and patient survival of 51.9% and 65.9%, respectively, compared with 73.2% and 87.7% for those converted > 90 days after transplantation. The mean total bilirubin level for the responder group was 7.1 mg/dl at the time of conversion and decreased significantly to a mean of 3.4 mg/dl at the end of the study (P=0.0018). Thirteen patients (14.3%) died during the study. Sepsis was the major contributing cause of death in most of these patients. CONCLUSIONS: Our results suggest that conversion to tacrolimus for chronic rejection after orthotopic liver transplantation represents an effective therapeutic option. Conversion to tacrolimus before development of elevated total bilirubin levels showed a significant impact on long-term outcome.

11.
BACKGROUND: In renal transplantation the beneficial immunosuppressive effects of cyclosporin (CsA) may be curtailed by its nephrotoxicity, especially in patients receiving a cadaveric allograft from suboptimal donors or at risk of delayed graft function. Mycophenolate mofetil (MMF) and antithymocyte globulin (ATG) have each been demonstrated to be potent immunosuppressants in renal transplantation. In a prospective analysis we studied the results at 6 months of the combination of MMF, ATG and low-dose steroids in patients with low immunological risk receiving a first cadaveric renal allograft from a suboptimal donor or at risk of delayed graft function. METHODS: Patients with preformed reactive antibodies < 50% receiving a first graft from a suboptimal donor (age > or = 40 years, non-heart-beating, acute renal failure, arterial hypertension) or at risk of delayed graft function (cold ischaemia time > or = 24 h) were eligible for this open single-arm pilot trial. From September 1996 to March 1997 we recruited 17 patients. They were treated with MMF 2 g p.o. preoperatively and 3 g/day after transplantation; rabbit ATG i.v. at 2 mg/kg preoperatively and 1.5 mg/kg/day on the first day after transplantation, followed by four doses of 1 mg/kg on alternate days; prednisone was given at 0.25 mg/kg/day and reduced progressively to 0.1 mg/kg/day at 3 months. Primary outcomes were the incidence of biopsy-proven acute rejection, delayed graft function, opportunistic infections, graft and patient survival, and the need for introduction of CsA treatment. RESULTS: Delayed graft function occurred in two cases (12%). Four of 17 patients (24%) had a biopsy-proven acute rejection (2 grade I and 2 grade II) within the first 3 months after transplantation. CsA was added in two cases with grade II biopsy-proven acute rejection and in one with grade I biopsy-proven acute rejection. In one patient MMF was replaced by CsA because of gastrointestinal intolerance. Mean serum creatinine 6 months after transplantation was 159+/-59 micromol/L. Cytomegalovirus tissue-invasive disease occurred in one patient (6%). At 6 months of follow-up all patients are alive with functioning allografts. CONCLUSIONS: These preliminary results suggest that in low-immunological-risk patients who receive a suboptimal renal allograft or are at risk of delayed graft function, the combination of MMF, ATG, and steroids is an efficient immunosuppressive regimen that may avoid the use of CsA in 70% of the recipients.

12.
BACKGROUND: This paper reports the histopathologic results of 2-year protocol biopsies from patients who were enrolled in the U.S. FK506 kidney transplant study. METHODS: Recipients of cadaveric kidney transplants were randomized to tacrolimus or cyclosporine therapy. Patients active in the trial at 2 years after transplantation were approached for a protocol biopsy. Biopsies were scored by the Banff classification in a blinded fashion by one pathologist. RESULTS: A total of 144 patients (41.3% of those active at 2 years) had a 2-year protocol biopsy performed; 79 patients were treated with tacrolimus and 65 patients were treated with cyclosporine. Evidence of acute rejection was found in seven (8.9%) of the 2-year biopsies in tacrolimus-treated patients and six (9.2%) in cyclosporine-treated patients. Chronic allograft nephropathy was found in 49 (62.0%) tacrolimus biopsies and 47 (72.3%) cyclosporine biopsies (P=0.155). There were no apparent histopathologic differences between the tacrolimus and cyclosporine biopsies. The occurrence of chronic allograft nephropathy was significantly higher in patients who received a graft from an older donor (P<0.01), who experienced presumed cyclosporine or tacrolimus nephrotoxicity (P<0.001), who developed a cytomegalovirus infection (P=0.038), or who experienced acute rejection in the first year after transplantation (P=0.045). A multivariate analysis showed that nephrotoxicity and acute rejection were the most significant predictors of chronic allograft nephropathy. CONCLUSIONS: The occurrence of histologic acute rejection was rare at 2 years, suggesting that subclinical acute rejection is uncommon in these late biopsies. A majority of the biopsies showed features consistent with chronic allograft nephropathy, which was associated with acute rejection (particularly in cyclosporine-treated patients), nephrotoxicity, and cytomegalovirus infection in the first year. This suggests that nonimmunologic factors, such as drug-induced toxicity, may play an important role in chronic allograft nephropathy.

13.
Posttransplant lymphoproliferative disorder (PTLD) is associated with Epstein-Barr virus (EBV) and may clinically resemble acute allograft rejection. Three methods of demonstrating EBV in tissue were evaluated in 15 liver allograft biopsies from 12 patients, including four with PTLD: (1) semiquantitative polymerase chain reaction (PCR) for EBV DNA; (2) in situ hybridization for EBV RNA (EBER); and (3) immunoperoxidase staining for EBV latent membrane protein (LMP). Index cases had a PCR dot blot result of "positive" or "weak positive." Findings were correlated with histology, clinical data, therapy, and outcome. All four PTLD patients had a clinical diagnosis of acute rejection. All four showed EBV: PCR 4, EBER 4, LMP 3. Liver function tests were elevated in three, but EBV viral capsid antigen (VCA) IgM was not increased in three. Immunosuppression was withdrawn and all four patients underwent a second transplantation. One died 4 days posttransplant with disseminated PTLD, two died of sepsis at 1.5 and 14 months, and one is well at 3 years without PTLD. Eleven biopsies without PTLD showed: acute rejection 7, acute rejection and hepatitis 1, hepatitis B 1, and non-inflammatory changes 2. In this group, EBV results included: PCR weak positive in 10 and 1+ in one, EBER negative in ten and rare positive cells in one, LMP negative in 11. Liver function tests were elevated in 10, whereas VCA IgM was not increased in three and increased in one. Patients with acute rejection were treated with increased immunosuppression: none developed PTLD, with follow-up of at least 6 months in nine cases. Two patients died within 4 months of biopsy. One patient with PTLD in tonsils had a liver biopsy showing both acute rejection and EBV (PCR 1+, rare EBER+ small cells). Histological studies combined with specific EBV detection methods can be useful to evaluate atypical lymphoid infiltrates in liver allograft biopsies and to confirm a diagnosis of PTLD. All three methods are useful; EBER and PCR are the most sensitive, and EBER and LMP can be performed on paraffin sections.

14.
BACKGROUND: Graft coronary artery disease (CAD) is an increasingly important problem during long-term survival after heart transplantation, but the importance of cellular rejection, in particular late after transplantation, remains undetermined. METHODS AND RESULTS: We analyzed 492 coronary angiographies (967+/-705 days after transplantation; range, 49 days to 9.4 years) and 5201 endomyocardial biopsies (518+/-648 days after transplantation) from 156 patients (age, 47+/-11 years). Patients with angiographically detectable graft CAD had significantly more episodes of rejection requiring augmentation of immunosuppressive therapy (i.e., International Society for Heart and Lung Transplantation score > or = 3A) than those without graft CAD during the first (3.7+/-2.6 vs. 2.2+/-2.0, P<0.001) as well as subsequent years after transplantation (1.2+/-1.9 vs. 0.4+/-0.9, P<0.01). Multivariate logistic regression analysis including established risk factors for CAD, ischemic time, gender and age of donors and recipients, number of mismatches, cytomegalovirus infection, and drug therapy showed that the number of rejections during the first year [odds ratio (OR)=1.39, P<0.005] as well as subsequent years (OR=1.49, P<0.05), previous cytomegalovirus infection (OR=3.21, P<0.05), donor age >40 years (OR=2.97, P<0.05), and current or former smoker status (OR=2.76, P<0.05) were independent predictors of graft CAD. In patients without angiographically detectable graft CAD 1 year after transplantation, the number of rejections after the first year was even more strongly related to graft coronary artery disease than in the total patient population, underlining the importance of late cellular rejection (OR=1.74, P<0.005). CONCLUSION: Rejection requiring augmentation of immunosuppression early and late after transplantation is an independent risk factor for the development of angiographically detectable graft CAD. Hence, the search for and treatment of moderate or severe rejection seem prudent even late after transplantation.
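The odds ratios above come from a multivariate logistic regression with angiographic graft CAD as the outcome. As a minimal sketch of how such odds ratios are typically obtained, the snippet below fits a logistic model with statsmodels on randomly generated placeholder data; the column names echo the abstract's covariates, but none of the values are from the study.

```python
# Minimal sketch of a multivariate logistic-regression analysis yielding
# odds ratios, on randomly generated placeholder data (not the study data).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 156
df = pd.DataFrame({
    "rejections_year1": rng.poisson(2.5, n),   # treated rejections, first year
    "rejections_late": rng.poisson(0.7, n),    # treated rejections after year 1
    "cmv_infection": rng.integers(0, 2, n),    # prior CMV infection (0/1)
    "donor_age_gt40": rng.integers(0, 2, n),   # donor age > 40 years (0/1)
    "smoker": rng.integers(0, 2, n),           # current or former smoker (0/1)
    "graft_cad": rng.integers(0, 2, n),        # angiographic graft CAD (0/1)
})

X = sm.add_constant(df.drop(columns="graft_cad"))
model = sm.Logit(df["graft_cad"], X).fit(disp=False)

# Odds ratios with 95% confidence intervals (exponentiated coefficients)
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios.rename("OR"), conf_int], axis=1))
```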

15.
BACKGROUND: The main causes of allograft failure after cardiac transplantation are primary graft dysfunction, intractable acute rejection, and coronary graft disease. Despite the important progress made in the last several years in graft preservation, surgical techniques, immunosuppression, and treatment of coronary graft disease, retransplantation is in selected cases the only way to achieve long-term recipient survival. METHODS: In a case-control study, we compare 24 retransplantations with 47 first transplants in patients matched for date of transplantation. RESULTS: Between 1973 and 1996, 1,063 patients underwent cardiac transplantation in our institution. In this cohort, 22 patients had a total of 24 retransplantations (2 second-time retransplantations). The causes of retransplantation were primary graft failure (n=4), acute rejection (n=7), coronary graft disease (n=11), and miscellaneous (n=2). Survival at 1 and 5 years was 45.5% and 31.2% for patients undergoing retransplantation and 59.4% and 38.8% for control patients (p=0.07). An interval between first transplantation and retransplantation shorter (n=11) or longer (n=13) than 1 year was associated with a 1-year survival of 27.3% and 61.5% and a 4-year survival of 27.3% and 46%, respectively (not significant). Intervals shorter than 1 year between first transplantation and retransplantation were exclusively secondary to primary graft failure or intractable acute rejection. CONCLUSIONS: In the face of the shortage of donor grafts, these and other data indicate that retransplantation should be considered cautiously, especially when the interval between the first transplantation and retransplantation is short.

16.
BACKGROUND: It has been reported that autoantibodies against cyclophilin are present in sera from patients with systemic lupus erythematosus. We hypothesized that autoantibodies against FKBP12, another immunophilin, may be present in the plasma of liver allograft recipients and may affect the clinical outcome of liver allografts. METHODS: We investigated the relationship between the presence of anti-FKBP12 autoantibodies and rejection episodes in 47 patients treated with FK506 after living-related partial liver transplantation (LRLT). The patients comprised two groups: 22 with rejection [R(+) group] and 25 without rejection [R(-) group]. The autoantibodies were measured by an indirect ELISA, and their specificity was confirmed by absorption with antigen and by immunoblotting. RESULTS: The autoantibodies were detected in 13 of 22 patients in the R(+) group (IgG: 5; IgM: 6; both: 2) and in 6 of 25 in the R(-) group (IgG: 2; IgM: 3; both: 1) before LRLT (P=0.0193). After LRLT, they were also detected more frequently in the R(+) group (12 of 22; IgG: 1; IgM: 8; both: 3) than in the R(-) group (2 of 25; IgG: 1; IgM: 1) (P=0.001). In the R(+) group, the mortality of patients who were positive and negative for the autoantibodies was 6 of 12 and 2 of 10, respectively. The autoantibodies were detected in all four patients with chronic or refractory acute rejection. The autoantibodies were not detected in any of the 34 healthy subjects. CONCLUSIONS: These results suggest that the presence of the autoantibodies in patients before transplantation is related to rejection, and their presence after transplantation may be associated with patient outcome.

17.
BACKGROUND: The effect of mycophenolate mofetil (MMF) and sirolimus (rapamycin, RAPA) mono- and combination-therapy was examined in prevention of acute heart, pancreas, and kidney allograft rejection and in reversal of ongoing heart allograft rejection in the rat. METHODS: Both drugs were administered orally for up to 30 days. Eleven groups (n=6) were involved in the first part of the heart allografting model. The Brown Norway (RT1n) to Lewis (RT1l) combination was used in the heart and pancreas transplantation models, whereas Buffalo (RT1b) to Wistar Furth (RT1u) was used in the kidney transplantation model. RESULTS: The naive control group showed a mean survival time of 6.5+/-0.6 days. There were graded dose-responses to monotherapy with MMF 10 and 20 mg/kg/day (12.5+/-2.6 days; 19.3+/-9.0 days) and RAPA 0.2, 0.4, 0.8, and 1.8 mg/kg/day (19.2+/-2.0 days; 30.0+/-7.3 days; 50.8+/-12.5 days; 51.2+/-2.6 days), respectively (P=0.001). Results with the combined use of the drugs indicate that a synergistic or very strong synergistic interaction was produced when compared with monotherapy with MMF or RAPA: MMF 10 mg/kg/day + RAPA 0.2 mg/kg/day (52.7+/-5.7 days, combination index [CI]=0.189), MMF 20 mg/kg/day + RAPA 0.2 mg/kg/day (57.7+/-5.7 days, CI=0.084), MMF 10 mg/kg/day + RAPA 0.4 mg/kg/day (50.2+/-13.5 days, CI=0.453), and MMF 20 mg/kg/day + RAPA 0.4 mg/kg/day (51.5+/-6.8 days, CI=0.439), respectively. These results were repeatable in the prevention of acute pancreas and kidney allograft rejection in the rat. In the second part of the study (reversal of ongoing acute heart allograft rejection), the combined treatment of MMF 10 mg/kg/day + RAPA 0.2 mg/kg/day (35.5+/-16.0 days, CI=0.794) and MMF 20 mg/kg/day + RAPA 0.2 mg/kg/day (57.2+/-4.7 days, CI=0.310) represented a synergistic interaction compared with monotherapy with MMF or RAPA. CONCLUSIONS: Concomitant therapy with MMF and RAPA produces a synergistic effect in prevention of heart, pancreas, and kidney allograft rejection and in reversal of ongoing heart allograft rejection in the rat.
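The synergy claims above are quantified with a combination index (CI), where CI < 1 indicates synergy, CI = 1 an additive interaction, and CI > 1 antagonism. The abstract does not state which formulation was used; the standard Chou-Talalay definition is

\[
\mathrm{CI} = \frac{d_1}{D_{x,1}} + \frac{d_2}{D_{x,2}}
\]

where d1 and d2 are the doses of the two drugs (here MMF and RAPA) given in combination to produce a given effect x (for example, a given graft survival time), and Dx,1 and Dx,2 are the doses of each drug given alone that produce that same effect.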

18.
BACKGROUND: Mechanisms by which delayed allograft function reduces renal allograft survival are poorly understood. This study evaluated the relationship of delayed allograft function to acute rejection and long-term survival of cadaveric allografts. METHODS: A total of 338 recipients of cadaveric allografts were followed until death, resumption of dialysis, retransplantation, loss to follow-up, or the study's end, whichever came first. Delayed allograft function was defined by dialysis during the first week following transplantation. Multivariate Cox proportional hazards survival analysis was used to assess the relationship of delayed allograft function to rejection and allograft survival. RESULTS: Delayed allograft function, recipient age, preformed reactive antibody levels, prior kidney transplantation, recipient race, rejection during the first 30 days, and rejection subsequent to 30 days following transplantation were predictive of allograft survival in multivariate survival models. Delayed allograft function was associated with shorter allograft survival after adjustment for acute rejection and other covariates (relative rate of failure [RR]=1.72 [95% CI, 1.07, 2.76]). The adjusted RR of allograft failure associated with any rejection during the first 30 days was 1.99 (1.23, 3.21), and for rejection subsequent to the first 30 days was 3.53 (2.08, 6.00). The impact of delayed allograft function did not change substantially (RR=1.84 [1.15, 2.95]) in models not controlling for acute rejection. These results were stable among several subgroups of patients and using alternative definitions of allograft survival and delayed allograft function. CONCLUSIONS: This study demonstrates that delayed allograft function and acute allograft rejection have important independent and deleterious effects on cadaveric allograft survival. These results suggest that the effect of delayed allograft function is mediated, in part, through mechanisms not involving acute clinical rejection.
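The adjusted relative rates above are hazard ratios from a multivariate Cox proportional-hazards model. As a minimal sketch of that type of analysis, the snippet below uses the third-party lifelines package on randomly generated placeholder data; the column names echo the abstract's covariates, but the values are not from the study.

```python
# Minimal sketch of a multivariate Cox proportional-hazards analysis.
# The data frame is randomly generated for illustration only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 338
df = pd.DataFrame({
    "graft_survival_months": rng.exponential(60, n),
    "graft_failed": rng.integers(0, 2, n),            # 1 = failure, 0 = censored
    "delayed_graft_function": rng.integers(0, 2, n),  # dialysis in first week (0/1)
    "rejection_first_30d": rng.integers(0, 2, n),
    "rejection_after_30d": rng.integers(0, 2, n),
    "recipient_age": rng.normal(45, 12, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="graft_survival_months", event_col="graft_failed")

# Hazard ratios (relative rates of failure), each adjusted for the others
print(np.exp(cph.params_))
# cph.print_summary() would also report 95% confidence intervals
```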

19.
In this study, we analyzed the relative impact of donor and recipient variables on cadaveric renal allograft function and survival. The unique feature of the study population is that each pair of recipients received their allografts from a single donor. The study includes 378 adult patients. In 129 pairs both recipients were Caucasian, and in 60 pairs one recipient was Caucasian and the other was African-American. All transplants were done in one center, thus minimizing differences in preservation time and providing uniform posttransplant management. The initial analysis showed a relationship between the function of the allograft 6 months after transplantation (serum creatinine at 6 months, SCr6mo) and donor variables (P = 0.0004, analysis of variance). Furthermore, it was calculated that 64% of the variability in the SCr6mo among patients was due to donor factors and 36% was due to recipient factors. An elevated SCr6mo was significantly associated with older donors, male recipients, and patients with acute rejection episodes. Furthermore, other unidentified donor factors may have an impact on allograft function. Reflecting the importance of donor factors, there was a significant relationship between the SCr6mo values of paired recipients (P < 0.0008 by Spearman). Analysis of racially dissimilar pairs showed that the SCr6mo and graft survival 6 months after transplantation were not significantly different between Caucasians and African-Americans. However, beyond 6 months, graft survival was worse in African-Americans (P < 0.0001 by Cox). Compared with Caucasians, graft survival was significantly worse in African-Americans with poorly controlled blood pressure (mean arterial pressure > 105 mmHg) (P = 0.002, Cox), but not in those patients with mean arterial pressure < 105 mmHg. In conclusion, donor factors are major determinants of renal allograft function. However, those factors may not be easily identifiable or quantifiable. Donor factors do not contribute to racial differences in allograft survival. However, poorly controlled hypertension correlates with poor renal graft survival in African-Americans.
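The paired-recipient relationship reported above (P < 0.0008 by Spearman) is a rank correlation of the 6-month creatinine values of the two recipients who shared each donor; this within-pair correlation is what supports attributing much of the variability to donor factors. A minimal sketch of that computation with scipy follows, on hypothetical values.

```python
# Minimal sketch of the paired-recipient Spearman correlation described above.
# The creatinine values below are hypothetical placeholders.
import numpy as np
from scipy.stats import spearmanr

# 6-month serum creatinine (mg/dl) for recipient A and recipient B of each donor
scr_recipient_a = np.array([1.2, 1.8, 2.5, 1.4, 3.0, 1.1])
scr_recipient_b = np.array([1.3, 2.1, 2.2, 1.6, 2.7, 1.0])

rho, p_value = spearmanr(scr_recipient_a, scr_recipient_b)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.4f}")
```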

20.
BACKGROUND: Experiments were designed to determine the expression of type II (iNOS) and type III (ecNOS) nitric oxide synthase in lung parenchyma and systemic endothelial cells with rejection and/or infection of single lung allografts. METHODS: After single lung allotransplantation, dogs were maintained on standard triple immunosuppressive therapy for 5 days and then placed into one of three groups. Group I (n=4) was maintained on immunosuppressants; in group II (n=7), immunosuppression was withdrawn to allow acute rejection of the allograft; and in group III (n=6), infection was induced by bronchoscopic inoculation of Escherichia coli. RESULTS: At postoperative days 7-9, no histological evidence of rejection or infection was observed in the transplanted lungs of group I. In lungs of group II, rejection ranged from mild to severe; in lungs of group III, infection was severe. Some animals had both rejection and infection (n=8) and were studied separately. Plasma levels of nitric oxide increased comparably with rejection and/or infection compared to preoperative values. Expression of mRNA for ecNOS decreased significantly in lung parenchyma but not in aortic endothelial cells from dogs of groups II and III. However, expression of mRNA for iNOS increased with both rejection and/or infection in both lung parenchyma and aortic endothelial cells. CONCLUSIONS: iNOS is induced locally within the graft and systemically in aortic endothelial cells with rejection and/or infection of lung allografts. Plasma levels of nitric oxide are elevated with both rejection and infection and may not be useful in the differential diagnosis of these processes after lung transplantation.
