Similar Articles
20 similar articles found.
1.
BACKGROUND: In cadaveric renal transplantation, delayed graft function (DGF) correlates with poor long-term graft survival; however, whether its effects are independent of acute rejection is controversial. We wished to study the effect of DGF on graft survival, controlling for acute rejection, discharge creatinine, and human leukocyte antigen match. METHODS: We analyzed 27,096 first cadaveric donor renal transplants reported to the UNOS Scientific Renal Transplant Registry between January 1994 and November 1997. DGF was defined as dialysis need in the first week. Acute rejection was recorded for initial hospitalization and within 6 months. Kaplan-Meier survival curves were analyzed with the log-rank test. RESULTS: DGF increased the incidence of acute rejection before discharge (8% without DGF; 25% with DGF, P<0.01) and any acute rejections by 6 months (25% without DGF, 42% with DGF, P<0.01). Without early rejection, DGF reduced 1-year graft survival from 91% to 75% (P<0.0001) and graft half-life from 12.9 to 8.0 years. In kidneys with acute rejection within 6 months, DGF decreased 3-year graft survival from 77% to 60% and graft half-life from 9.4 to 6.2 years (P<0.001). With a discharge creatinine of less than 2.5 mg/dl, the difference in graft half-life between no DGF and no rejection (13.4 years) and DGF with rejection (9.8 years) was significant (P<0.001). Increased donor age and cold ischemia time additionally decreased graft survival, whereas a good human leukocyte antigen match could not overcome the deleterious effects of DGF or acute rejection. CONCLUSIONS: DGF is an important independent predictor of poor graft survival. Newer immunosuppressive strategies must minimize nonimmune and immune renal injury if long-term graft survival is to improve.
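The survival figures above come from Kaplan-Meier (product-limit) estimates compared with the log-rank test. As an illustration only, not the registry's actual analysis, the estimator can be sketched in plain Python; the follow-up times and event flags below are invented toy data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.

    times  -- follow-up time for each subject
    events -- 1 if graft loss was observed, 0 if censored
    Returns a list of (time, survival_probability) at each event time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]   # all subjects at time t
        deaths = sum(tied)
        if deaths:
            surv *= 1 - deaths / n_at_risk        # product-limit step
            curve.append((t, surv))
        n_at_risk -= len(tied)                    # everyone at t leaves the risk set
        i += len(tied)
    return curve

# Toy example: four grafts, one censored at time 3
curve = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
```

In practice one would use a vetted library rather than hand-rolling this, but the sketch shows where censored subjects enter: they shrink the risk set without triggering a survival step.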

2.
BACKGROUND: Because of the shortage of kidneys available for transplantation, we began in 1985 to harvest kidneys from non-heartbeating (NHB) donors. METHODS: We compared the results of a group of 66 kidney recipients from NHB donors (NHB group) with 122 kidney recipients from heartbeating donors (HB group). We analyzed, in the NHB group, the influence of ischemia times on graft survival and tested the best cut-offs by receiver operating characteristic curves. We also studied, using univariate and multivariate Cox hazard models, the capacity of different variables to predict graft loss. RESULTS: Patient and graft survival were similar in both groups during the follow-up. The percentage of delayed graft function was the only significant difference between the groups (NHB group 62% vs. HB group 32%; P=0.0001). Delayed graft function in the NHB group was influenced by the warm ischemia time, which was directly related to the number of days needed to achieve a serum creatinine <300 µmol/L (P=0.0001). The best cut-off times in this group were 45 min for warm ischemia time and 22 hr for cold ischemia time. Recipients have a greater likelihood of losing the graft beyond those limits (P=0.017, relative risk: 7.3). The incidence of acute rejection was similar in both groups, and it was the only predictor of graft loss in the complete series of patients (P=0.0001), in the NHB group (P=0.007), and in the HB group (P=0.02). CONCLUSIONS: Reducing the incidence of acute rejection and shortening ischemia time are conditions needed to ensure long-term survival of kidneys from NHB donors.
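The ROC-based cut-off search described above can be sketched as a scan for the threshold maximizing Youden's J (sensitivity + specificity − 1), a common way to pick an operating point on a ROC curve. This is an illustrative stand-in, not the authors' actual procedure, and the ischemia times and graft-loss flags are made up:

```python
def best_cutoff(values, events):
    """Scan candidate cut-offs on a predictor and return the one
    maximizing Youden's J = sensitivity + specificity - 1."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, e in zip(values, events) if v >= cut and e)
        fn = sum(1 for v, e in zip(values, events) if v < cut and e)
        tn = sum(1 for v, e in zip(values, events) if v < cut and not e)
        fp = sum(1 for v, e in zip(values, events) if v >= cut and not e)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Hypothetical warm ischemia times (min) and graft-loss outcomes (1 = lost)
cut, j = best_cutoff([20, 30, 40, 50, 60, 70], [0, 0, 0, 1, 1, 1])
```

On this perfectly separated toy data the scan returns the lowest threshold that splits the outcomes cleanly; real ischemia data would of course overlap and give J well below 1.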

3.
BACKGROUND: We reviewed 843 first cadaver kidney transplants carried out consecutively at our center to examine the effect on long-term graft survival of the duration of delayed graft function (DGF), defined as the time taken for the kidney to attain the threshold of a Cockcroft-calculated creatinine clearance (cCCr) > or = 10 ml/min. METHODS: Using a multivariate Cox survival analysis we evaluated the consequences of DGF on allograft survival, and then by regression analysis identified the factors contributing to the occurrence of DGF. Finally, using a Kaplan-Meier analysis we compared the profiles of graft failure according to the duration of DGF. RESULTS: Defining DGF in terms of cCCr rather than necessity for dialysis after transplantation allowed better prediction of long-term graft loss. Indeed, patients with a Cockcroft-based DGF > six days who did not require dialysis (12%) had a significantly poorer long-term graft outcome than those with a DGF < or = six days. Furthermore, we showed that a DGF of six days could be taken as a cut-off point that marked a significant difference in the long-term graft survival rate (P < 0.0001). Surprisingly, further extension of the duration of DGF > six days was not associated with further worsening of graft survival (except in DGF > 30 days). CONCLUSION: Our results suggest a threshold effect in the lesions that ultimately results in long-term functional deficiency. In addition, we show that the need for dialysis is not an adequate criterion for DGF in terms of long-term outcome prediction.
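The abstract above defines DGF via a Cockcroft-calculated creatinine clearance. The standard Cockcroft-Gault formula is straightforward to express; here is a minimal sketch (the patient values in the example are purely illustrative):

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female=False):
    """Cockcroft-Gault estimated creatinine clearance in ml/min:
    cCr = (140 - age) * weight / (72 * Scr), multiplied by 0.85 for women."""
    ccr = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return 0.85 * ccr if female else ccr

# Illustrative 40-year-old, 72 kg recipient with serum creatinine 1.0 mg/dl
ccr = cockcroft_gault(40, 72, 1.0)
```

Under the study's definition, a graft would be considered past DGF once this estimate reaches the 10 ml/min threshold.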

4.
BACKGROUND: Long-term administration of cyclosporin carries a risk of renal toxicity, and immunosuppressants are associated with an increased rate of malignant disorders. We undertook an open randomised study of the risks and benefits of two long-term maintenance regimens of cyclosporin in kidney-allograft recipients. The primary endpoint was graft function; secondary endpoints were survival and occurrence of cancer and rejection. METHODS: 231 recipients of a first allograft with at most one previous rejection episode were randomised 1 year after transplantation. Most were receiving cyclosporin and azathioprine. One group received cyclosporin doses adjusted to yield trough blood concentrations of 75-125 ng/mL (low-dose group); the second received doses that yielded trough concentrations of 150-250 ng/mL (normal-dose group). Analysis was by intention to treat. FINDINGS: At 66 months' follow-up, the low-dose and normal-dose groups were similar in mean serum creatinine (182 [SD 160] vs 184 [157] micromol/L; p=0.9) and mean creatinine clearance (47.5 [25.1] vs 45.3 [22.5] mL/min; p=0.6). Nine of 116 patients in the low-dose group and one of 115 in the normal-dose group had symptoms of rejection (p<0.02). There was no difference between the low-dose and normal-dose groups in survival (95 vs 92%; p=0.7) or graft survival (89 vs 82%; p=0.17) at 6 years. 60 patients developed cancers, 37 in the normal-dose group and 23 in the low-dose group (p<0.034); 66% were skin cancers (26 vs 17; p<0.05). INTERPRETATION: We found no evidence that halving of trough blood cyclosporin concentrations significantly changes graft function or graft survival. The low-dose regimen was associated with fewer malignant disorders but more frequent rejection. The design of long-term maintenance protocols for transplant recipients based on powerful immunosuppressant combinations should take these potential risks into account.

5.
To determine if cardiac allograft outcome is improved among patients with fewer HLA-DR mismatches with their donors, we studied 132 recipients of a primary cardiac allograft who were transplanted between December 1985 and December 1991. These recipients and their donors all had high-confidence-level serological HLA-DR typing, previously shown to correlate highly with DNA DR typing. Patients were divided into two groups based on the HLA-DR mismatch with their donors. Group I consisted of 78 patients with 1 or zero DR mismatch and group II of 54 patients with 2 DR mismatches. Allograft outcome measurements included incidence of moderate rejection, incidence of allograft vasculopathy at 12 months, cardiac function measured as left ventricular ejection fraction (LVEF) and cardiac index (CI), and actuarial graft survival up to 7 years. Groups I and II were not different with regard to recipient age, donor age, ischemia time, pulmonary vascular resistance, sex, or PRA greater than 0%. Group II had a higher incidence of moderate rejection on the first-week biopsy (47% vs. 25%, P = 0.019), and during the first month (84% vs. 58%, P = 0.006), but no difference was found in frequency of rejection from months 2 to 12. LVEF was not different in the groups at any point. CI was better in group I at 12 months (2.76 vs. 2.5, P = 0.03). No statistically significant difference was found in incidence of allograft vasculopathy (17% vs. 26%, P = 0.204). Actual graft survival at 1 year was better for group I (91% vs. 74%, P = 0.008), and actuarial graft survival at 6 years also favored group I (76% vs. 56%, P = 0.04). Using high-confidence-level serological HLA-DR typing assignments we demonstrated that HLA-DR mismatching correlates highly with cardiac allograft outcome. Implications are that heart transplant survival could be improved if prospective matching were feasible and prioritized or if immunosuppression were tailored to the HLA-DR match.

6.
The ethnic origin of renal graft recipients is recognized as an important determinant of graft survival. In liver transplantation, studies of black American recipients have suggested a trend toward inferior graft survival in this group. In this study, we have analyzed outcome of transplantation in a large multiethnic liver transplant program. Non-Caucasoid recipients had an inferior patient survival compared with Caucasoids and, in particular, European Caucasoids at 1, 3, and 5 years after transplantation (46.7% vs. 60.2% at 3 years, P = 0.05). Non-European recipients had an inferior graft survival compared with European recipients at 1, 2, and 3 years after transplantation (e.g., north Europeans 53.5%, south Europeans 48.5%, Middle Eastern 40%, and non-Caucasoids 27% at 3 years, P < 0.01). Different frequencies of chronic allograft rejection in the ethnic groups contributed to the rates of graft survival, with the non-European recipients developing chronic rejection at over twice the rate of European recipients (12.6% vs. 5.9%, respectively, P = 0.002). The findings in this study support the evidence from renal transplant programs that the ethnic origin of recipients is an important determinant of outcome after transplantation, with increasing frequency of chronic rejection in recipients nonindigenous to the donor population contributing to the variations in patient and graft survival rates.

7.
We studied multiple determinants of graft survival at a single center and the effects of nonimmunologic graft loss on transplant survival. This retrospective study examined the results of 589 cadaver donor transplants performed between 1986 and 1992. Graft survival rates were calculated using Kaplan-Meier estimates for both overall graft survival (all causes of graft loss) and immunologic graft survival (function lost due to acute or chronic rejection and noncompliance). Cadaver graft survival was significantly poorer with an increasing degree of DR mismatch (P=0.02). An analysis of pretransplant variables showed graft loss risk was highest with greater DR mismatches, a two-B-antigen mismatch, higher donor serum creatinine, and younger recipient age. After transplantation, acute rejection was the most significant factor associated with long-term graft survival. Our data demonstrate a significant advantage for zero DR and one DR mismatch cadaver donor transplants, with excellent immunologic graft survival. This study suggests that a combination of immediate graft function, prevention of acute rejection by appropriate early immunosuppressive therapy, and acceptable DR match enhances cadaveric graft survival.

8.
BACKGROUND: Delayed graft function (DGF) remains an important complication in renal transplantation. In this multicenter study, we investigated the influence of donor and recipient factors on the occurrence of DGF and DGF's effect on long-term graft survival. METHODS: A total of 547 transplanted kidney allografts, retrieved from multi-organ donors, were analyzed, and results were compared with literature on kidney-only donors. RESULTS: Median follow-up of patients without graft failure was 3.4 years. Twenty-four percent of the recipients developed DGF. In univariate analysis, the following factors significantly increased the incidence of DGF: (a) among the donor factors, mean creatinine level >120 micromol/L and prolonged cold ischemia time (CIT); and (b) among the recipient factors, previous transplant(s), no intraoperative use of mannitol, poor quality of reperfusion, absence of intraoperative diuresis, and pretransplant anuria or oliguria. After stepwise logistic regression, donor age, CIT, recipient's number of previous transplants, and intraoperative diuresis proved to be of independent prognostic value for the occurrence of DGF. Overall graft survival was 91%, 87%, and 72% at 3 months, 1 year, and 4 years after transplantation, respectively. In case of DGF, graft survival was approximately 10% lower when compared with cases with immediate graft function (P<0.001). No difference in incidence of DGF was found between grafts of multi-organ donors and kidney-only donors. CONCLUSIONS: DGF results in an approximately 10% higher rate of graft failure. DGF incidence can be reduced by administering mannitol during transplantation, minimizing CIT, and optimizing donor management. Grafts from multi-organ donors and kidney-only donors appear to be of equal quality.
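Univariate risk-factor screens like the one above often reduce to 2x2 tables of event counts. As a hedged illustration only, the counts below are invented and are not this study's data, an odds ratio for an outcome by exposure can be computed as:

```python
def odds_ratio(events_exposed, n_exposed, events_unexposed, n_unexposed):
    """Odds ratio from a 2x2 table: (a*d) / (b*c), where
    a/b are events/non-events among the exposed and
    c/d among the unexposed."""
    a, b = events_exposed, n_exposed - events_exposed
    c, d = events_unexposed, n_unexposed - events_unexposed
    return (a * d) / (b * c)

# Hypothetical: 42/100 recipients with prolonged CIT developed DGF
# vs. 25/100 without prolonged CIT
oratio = odds_ratio(42, 100, 25, 100)
```

Stepwise logistic regression, as used in the study, generalizes this idea by estimating adjusted odds ratios for several factors simultaneously.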

9.
BACKGROUND: We previously reported excellent outcome at 6 months after transplantation in recipients of expanded criteria donor kidneys that other local centers had declined, kidneys that nobody wanted (KNW), versus controls. We now report follow-up after 23 months. METHODS: We retrospectively reviewed 27 donor and 24 recipient characteristics in 126 adult recipients of transplants from January 1, 1995, to November 25, 1996. RESULTS: Donors of control kidneys versus KNW were younger and had significantly higher minimum 4-hr urine output. Recipients of control kidneys versus KNW had significantly more HLA matches and lower 3-month posttransplant serum creatinine levels. Patient and graft survival rates were similar between the control kidneys versus the KNW. We also compared the control kidneys and KNW with regard to prompt function or delayed graft function and satisfactory versus unsatisfactory function (unsatisfactory: serum creatinine > or =2.5 mg/dl or graft loss at 6 months) to identify donor and recipient characteristics associated with delayed graft function and unsatisfactory outcome. The incidence of rejection was significantly lower in control kidneys and KNW with satisfactory function versus control kidneys and KNW with unsatisfactory function. CONCLUSIONS: These data demonstrate: (1) similar graft survival at 12 months, (2) lower donor age, (3) higher minimum 4-hr urine output, and (4) more HLA matches in recipients of control kidneys versus KNW. Optimal outcome was achieved in recipients of control kidneys and KNW with prompt function and satisfactory function based upon serum creatinine in the first 6 months and in recipients with lower rates of rejection. Although outcome is dependent upon many donor and recipient variables, we believe that with careful donor and recipient selection, excellent outcome can be achieved using expanded criteria donor kidneys.

10.
BACKGROUND: To maximize the renal donor pool, cadaveric pediatric en bloc kidneys have been transplanted as a dual unit by some transplant centers. We compared the short- and long-term outcomes of adult recipients of cadaveric pediatric en bloc renal transplants versus those of matched recipients of cadaveric adult kidneys. METHODS: Thirty-three adults who received pediatric en bloc kidney transplants between April 1990 and September 1997 were retrospectively identified and were compared with 33 matched adults who received adult cadaveric kidney transplants. The groups were identical for transplantation era, immunosuppression, recipient sex, race, cause of renal failure, mean weight, and follow-up duration (37.8 vs. 37.5 months). The mean recipient age (study vs. control) was lower (36.3 vs. 48.9 years, P=0.0003). RESULTS: There was no difference between the en bloc and adult donor groups in the 3-year patient survival rates (95% vs. 87%, P=0.16) or the 3-year graft survival rates (87.3% vs. 84.2%, P=0.35). Further, there was no difference in en bloc patient or en bloc graft survival time stratified by recipient age (14-44 vs. >45 years, P=0.11), en bloc donor age (<24 vs. >24 months, P=0.39), or recipient weight (<60, 61-75, >75 kg; P=0.60). Differences in serum creatinine (mg/dl) for the en bloc versus the control group at the time of discharge (3.0 vs. 7.8 mg/dl, P=0.06), at 1 year (1.4 vs. 2.0 mg/dl, P=0.06), and at 2 years (1.1 vs. 1.6 mg/dl, P=0.14) had dissipated by the time of the 5-year follow-up examination (1.1 vs. 1.6 mg/dl, P=0.14). Vascular complications were more prevalent in the en bloc group: renal vein thrombosis (one case), thrombosis of donor aorta (two cases), arterial thrombosis of one renal moiety (two cases), and renal artery stenosis (two cases). There were no differences between groups in delayed graft function, acute or chronic rejection, posttransplant hypertension, posttransplant proteinuria, or long-term graft function.
CONCLUSIONS: Collectively, these data indicate that transplanting pediatric en bloc kidneys into adult recipients results in equivalent patient and graft survival compared with adult cadaveric kidneys. Further, the data also suggest that pediatric en bloc kidneys need not be strictly allocated based on recipient weight or age criteria.

11.
BACKGROUND: Recipient hepatitis C virus (HCV) seropositivity has been associated with inferior outcomes in renal transplantation (RTx). We sought to determine whether donor HCV+ status influenced the incidence of rejection, liver dysfunction, and graft survival in HCV+ recipients. METHODS: We reviewed 44 HCV+ recipients (R+) receiving RTx from HCV+ (D+) and HCV- (D-) donors between February 1991 and September 1996. All patients were followed to the end of the study period (mean=36 months, range=12-60 months). We compared the R+ group with a demographically matched cohort of 44 HCV- recipients (R-). RESULTS: Of the 44 R+, 25 (57%) had a total of 48 rejection episodes. Among the 44 R-, 32 (73%) had 58 rejection episodes (P>0.1). Within the R+ group, 28 were D+/R+; of these 14 (50%) had 27 rejection episodes, whereas among the 16 D-/R+, 11 (68%) had 21 rejection episodes (P>0.3). Graft and patient survival was similar in both groups (86.4% and 91%, respectively). Liver dysfunction was slightly increased in the R+ group (4/44 vs. 0/44, P>0.1), with one death due to liver failure in this group. CONCLUSION: Donor HCV+ status had no influence on outcomes in HCV+ recipients after kidney transplantation in the short term. The incidence of rejection, graft loss, and mortality was comparable between the D+/R+ and D-/R+ groups. Furthermore, rejection, graft loss, and death were identical in the R+ and R- groups throughout the 5-year study period. We therefore conclude that HCV+ recipients can safely receive kidney transplants without concern about donor HCV status or fear of adverse events from their own HCV+ status.

12.
In January 1988, we initiated a prospective, randomized comparison of prophylactic antilymphoblast globulin (ALG; quadruple therapy) versus no prophylactic ALG (triple therapy) in the setting of immediate graft function (defined by a brisk diuresis and a 20% decline in serum creatinine within 24 hr). Recipients were stratified according to presence of diabetes and age greater or less than 50 years. Recipients on quadruple therapy (n = 61) received 7 days of prophylactic Minnesota ALG (5 mg/kg on day 1, 10 mg/kg on day 2, 20 mg/kg on days 3-7). CsA, 10 mg/kg/day, began on day 6. AZA began at 2.5 mg/kg/day and was adjusted according to white blood cell count. Recipients on triple therapy (n = 60) began immediate CsA, 10 mg/kg/day orally and AZA, 5 mg/kg/day, tapering to 2.5 mg/kg/day by day 8. Both groups received identical prednisone tapers beginning at 1 mg/kg/day, decreasing to 0.5 mg/kg/day by 2 weeks and to 0.15 mg/kg/day by 6 months. Demographic characteristics between groups were not different with respect to diabetes, age, sex, race, per cent panel-reactive antibodies (PRA), or HLA matching. Follow-up ranged from 2 to 4.5 years. Patient survival was 93% for the quadruple therapy group and 90% for triple therapy. Actuarial graft survival was 79% in the quadruple group and 72% in the triple group (P = 0.18). Graft loss due to rejection occurred in 6/61 receiving ALG versus 7/60 in the immediate CsA group. Three of 4 high PRA recipients in the immediate CsA group lost their grafts within 30 days compared with none in the ALG group. The mean time to graft loss was significantly longer for the quadruple therapy group (17 +/- 8 months) compared with the triple therapy group (4 +/- 5 months), P = 0.006. The total number of rejection episodes was similar for both groups (29/61 vs. 31/60), as was the number who were rejection free (51% vs. 47%). The use of OKT3 was also similar between groups (28% vs. 30%). 
The quadruple therapy group had a higher incidence of CMV infection: 20% vs. 7% (P < 0.05), but no grafts or patients were lost as a result. Serum Cr was not different at 1 and 12 months (1.5 and 1.6 vs. 1.6 and 1.7, respectively), nor were Cr clearances (63 and 68 vs. 60 and 63). Conclusion. Early initiation of oral CsA in the setting of immediate graft function is not associated with significant nephrotoxicity. (ABSTRACT TRUNCATED AT 400 WORDS)

13.
BACKGROUND: Mechanisms by which delayed allograft function reduces renal allograft survival are poorly understood. This study evaluated the relationship of delayed allograft function to acute rejection and long-term survival of cadaveric allografts. METHODS: 338 recipients of cadaveric allografts were followed until death, resumption of dialysis, retransplantation, loss to follow-up, or the study's end, whichever came first. Delayed allograft function was defined by dialysis during the first week following transplantation. Multivariate Cox proportional hazards survival analysis was used to assess the relationship of delayed allograft function to rejection and allograft survival. RESULTS: Delayed allograft function, recipient age, preformed reactive antibody levels, prior kidney transplantation, recipient race, rejection during the first 30 days and rejection subsequent to 30 days following transplantation were predictive of allograft survival in multivariate survival models. Delayed allograft function was associated with shorter allograft survival after adjustment for acute rejection and other covariates (relative rate of failure [RR] = 1.72 [95% CI, 1.07, 2.76]). The adjusted RR of allograft failure associated with any rejection during the first 30 days was 1.99 (1.23, 3.21), and for rejection subsequent to the first 30 days was 3.53 (2.08, 6.00). The impact of delayed allograft function did not change substantially (RR=1.84 [1.15, 2.95]) in models not controlling for acute rejection. These results were stable among several subgroups of patients and using alternative definitions of allograft survival and delayed allograft function. CONCLUSIONS: This study demonstrates that delayed allograft function and acute allograft rejection have important independent and deleterious effects on cadaveric allograft survival. These results suggest that the effect of delayed allograft function is mediated, in part, through mechanisms not involving acute clinical rejection.

14.
BACKGROUND: A total of 110 patients underwent renal transplantation at our clinic between February 1985 and October 1996, 95 with kidneys from living related donors and 15 from cadaver donors. This study was conducted to evaluate the influence of various preoperative factors on graft survival and the clinical course of patients in living related renal transplantation. METHODS: Of the 95 recipients, 17 adult patients had long-term graft survival over 5 years, including 6 with recurrent or de novo nephritis but without chronic allograft nephropathy; 8 lost their grafts to chronic allograft nephropathy diagnosed within 5 years. A retrospective analysis was performed to elucidate the differences between these recipients. RESULTS: Donors of long-term graft survival recipients were younger (49.1 +/- 12.1 vs. 58.9 +/- 10.2 years) and had better renal function as evaluated by preoperative creatinine clearance in living related donors (115.5 +/- 37.0 vs. 79.7 +/- 22.0 L/day). Long-term graft survival recipients experienced acute rejection within 6 months less frequently (0.53 +/- 0.62 episodes; 8 patients, 9 episodes) than chronic allograft nephropathy recipients (1.00 +/- 0.53; 7 patients, 8 episodes) and had better responses to antirejection therapy. Additionally, acute rejection beyond 6 months occurred only in chronic allograft nephropathy recipients. Serum creatinine at 1 year after transplantation was higher in recipients with chronic allograft nephropathy (1.27 +/- 0.27 vs. 1.88 +/- 0.42 mg/dl). CONCLUSIONS: We concluded that donor age and renal function are background factors related to long-term graft survival. Long-term graft survival recipients had less frequent acute rejection and good response to antirejection therapy. In recipients with chronic allograft nephropathy, serum creatinine had already begun to increase gradually within 1 year.

15.
We report the results of 41 consecutive renal transplantations performed on 39 children (median age 2.7 years). Twenty-six recipients were less than 5 years old. Twenty-one recipients (13 under the age of 5 years) received cadaver (CAD) grafts. All grafts except 2 were from adult donors and were placed extraperitoneally. Patients were on triple immunosuppression (cyclosporine plus azathioprine plus methylprednisolone). Mean follow-up time was 2.3 years. No vascular complications and only one ureteral complication were seen. Acute tubular necrosis occurred in 3 patients (7.3%). No grafts were lost due to acute rejection. Three-year patient survival and 1-year graft survival were 100%. The overall 3-year actuarial graft survival was 86%. Three-year survival of grafts from living-related donors (LRD) was 92% and that of CAD grafts 75%. In recipients younger than 5 years, 3-year LRD graft survival was 89% and CAD graft survival 73%. No significant differences in graft survival between recipients of different age groups or between LRD and CAD grafts were found. We conclude that results of renal transplantation in children under 5 years of age are comparable to those of older children, even using CAD grafts, when adult donors and triple immunosuppression are used.

16.
BACKGROUND: To increase the utilization of cadaveric donor kidneys, we have recently expanded our acceptable criteria to include aged donors (frequently with a history of hypertension), by selectively using both donor kidneys (dual transplant) into a single recipient. METHODS: To define when these expanded criteria donor (ECD) kidneys should be used as a single versus a dual kidney transplant, we retrospectively reviewed 52 recipients of ECD kidneys that had been turned down by all other local centers between 1/1/95 and 11/15/96. Fifteen patients received dual transplants, whereas the remaining 37 received single kidneys. Of the dual kidney recipients, 14 of 15 ECD were > or = 59 years of age, 10 of 15 were hypertensive, and 9 of 15 were both. Of the single recipients, 11 of 37 ECD were > or = 59 years of age, 11 of 37 were hypertensive, and 7 of 37 were both. All patients received cyclosporine-based triple-drug therapy. We compared seven donor (D) and sixteen recipient outcome variables in single versus dual kidney transplants as subgrouped by: (1) donor admission creatinine clearance (D-AdC(Cr)) < 90 ml/min; (2) D-age > or = 59 years; and (3) cold storage (Cld Stg) < or > 24 hr. RESULTS: In the group with D-AdC(Cr) < 90, there was a significantly higher incidence of delayed graft function (DGF) in single versus dual recipients (9 of 20 [45%] vs. 1 of 11 [9%]; P=0.04) and worse early graft function based upon mean serum creatinine at 1 and 4 weeks (5.3+/-3.3 and 2.8+/-2.0 vs. 1.7+/-0.6 and 1.4+/-0.5 mg/dl; P<0.05). In the group with D-age > or = 59, recipients of single kidneys had significantly higher mean serum creatinine at 1, 4, and 12 weeks versus recipients of dual kidneys (5.1+/-3.3, 3.4+/-2.1, 2.8+/-1.5 versus 2.8+/-2.5, 1.5+/-0.6, 1.6+/-0.5 mg/dl; P<0.05). Cld Stg time also had an impact on DGF and early outcome.
Recipients of dual kidneys stored less than 24 hr had a significantly lower incidence of DGF versus single kidneys stored more than 24 hr (10% vs. 46%; P<0.05) and better early graft function based on mean serum creatinine at 1, 4, and 12 weeks (1.9+/-0.8, 1.3+/-0.4, 1.5+/-0.2 vs. 6.6+/-3.4, 3.0+/-1.6, 2.9+/-1.9 mg/dl; P<0.05). The overall 1-year patient and graft survivals were 96% and 81% vs. 93% and 87% (P=NS) in recipients of single ECD versus dual ECD kidneys. CONCLUSIONS: In conclusion, we believe that kidneys from ECD with D-AdC(Cr) < 90 ml/min and D-age > or = 59 should be used as dual kidney transplants, keeping the Cld Stg time at < 24 hr to minimize the effect of Cld Stg on early graft function.

17.
BACKGROUND: Female heart transplant recipients are able to carry pregnancies successfully. This study evaluates the effect of subsequent pregnancies on newborn and maternal outcomes and graft survival. METHODS: Subjects were identified through a previously reported multicenter study, case reports from literature review, and recipients entered in the National Transplantation Pregnancy Registry. A retrospective analysis was completed of 35 heart transplant recipients with first pregnancies (FP) and 12 who had one or two additional pregnancies (P>1). Newborns were assessed for gestational age, neonatal birth weight, and complications. Maternal data included pregnancy outcome, peripartum complications, including infection and rejection, current graft function, and recipient survival. RESULTS: Forty-seven pregnancies (35 FP and 12 P>1) from 35 heart transplant recipients were studied. FP outcomes included 26 live births (one set of twins), four miscarriages, and six therapeutic abortions, whereas P>1 outcomes included 11 live births (one set of twins), and two miscarriages. There was no significant difference between mean birth weights (2353+/-986 g vs 2588+/-521 g, P>1 vs FP; mean+/-SD; p=NS) or prematurity incidence (<37 weeks; 50% vs 40%; p=NS) for the live-born infants. Compared with the FP group, there was a trend toward increased neonatal complications in P>1 (40% vs 12%; p=NS). Complications were significantly more common in premature newborns compared with full-term newborns (33% vs 5%; p < 0.05). No structural malformations were identified in the live-born infants. Maternal complication rates were the same in both groups (40%). Of 28 recipients available for follow-up, the maternal survival rate was 75% for the FP group and 89% for the P>1 group. Mean rejection rate per year was slightly increased after pregnancy in the P>1 group. Surviving recipients had similar graft function by echocardiographic left ventricular ejection fraction.
CONCLUSIONS: Post-heart transplantation pregnancies often have successful outcomes, but there is a high incidence of prematurity and low birth weight. Subsequent pregnancies do not seem to significantly increase the incidence of complications in either the newborn or mother or increase graft rejection or failure. Larger studies of posttransplantation pregnancies may provide more definitive information.

18.
BACKGROUND: Mycophenolate mofetil (MMF; CellCept) is a potent and selective inhibitor of B and T lymphocyte proliferation that has proven effective in reducing the incidence of acute rejection in cadaveric kidney transplant recipients in several randomized, blinded clinical studies. Because the frequency and characteristics of rejection episodes may be different and more severe after combined pancreas-kidney transplantation, we hypothesized that MMF would have a significant impact on pancreas-kidney rejection and graft outcome. Therefore, we compared the efficacy of MMF versus azathioprine (AZA) in cyclosporine-treated simultaneous pancreas-kidney (SPK) transplantations. METHODS: A retrospective comparison of 358 consecutive primary SPK transplantations performed from 1990 to 1997 was conducted. Patients received either MMF (n=109, 3 g/day) or AZA (n=249, 2 mg/kg q.d.) in combination with cyclosporine-based immunosuppression. All patients received a quadruple-drug sequential induction protocol with either OKT3 or Atgam. Several outcome parameters, including patient and graft survival rates and frequency of rejection, were analyzed. RESULTS: MMF-treated patients demonstrated a markedly reduced rate of biopsy-proven kidney rejection (31 vs. 75% AZA; P=0.0001), clinically significant pancreas rejection (7 vs. 24% AZA; P=0.003), and steroid-refractory rejection (15 vs. 52% AZA; P=0.01). As a result, kidney and pancreas allograft survival was significantly better in MMF patients compared with AZA patients (2-year survival rates: kidney, 95 vs. 86%; and pancreas, 95 vs. 83%). Although surgical infections after transplantation were more frequent in MMF patients, MMF patients were more likely to have undergone enteric drainage. Importantly, we did not observe an increased incidence of any of the bacterial, fungal, or viral infections that typically plague immunosuppressed transplant recipients. 
CONCLUSIONS: This retrospective study demonstrates that MMF is a highly effective immunosuppressant in SPK transplantation. It is not associated with an increased risk of opportunistic infections when a balanced immunosuppressive management approach is used. MMF strikingly reduces the frequency of acute cellular and steroid-resistant rejection. Given this experience, it is not unexpected that graft survival rates were significantly improved in MMF-treated SPK patients compared with patients receiving a more traditional immunosuppressive regimen.

19.
The increasing success of clinical liver transplantation has brought rejection to the forefront as a cause of morbidity and graft loss. The relationship of immunosuppressive drug doses and levels to acute and chronic rejection remains a matter of debate. The effect of blood CsA levels and drug doses on the incidence of acute and chronic rejection, and the impact of acute rejection episodes on the occurrence of chronic rejection, were studied in 146 grafts in 132 patients. These patients were transplanted in the 4-year period from June 1989 using CsA-based immunosuppression (CsA, azathioprine, prednisolone). Liver grafts in patients maintained on median CsA levels (whole blood, trough level) of > or = 175 micrograms/L in the first 28 days posttransplant had a significantly lower incidence of chronic rejection (2 out of 49 vs. 22 out of 97; P = 0.002). There was no significant difference in the incidence of graft loss due to fatal sepsis (6% vs. 5%) or nephrotoxicity between the high and low CsA level groups. The overall graft loss rate was lower in the higher CsA level group (22% vs. 37%). The total doses of the individual drugs did not correlate with the incidence of acute or chronic rejection. Although the occurrence of acute rejection itself did not determine later chronic rejection, late occurrence (P < 0.00001) and multiple episodes (two or more; P = 0.0002) of acute rejection were significant risk factors for the occurrence of chronic rejection. We conclude that to minimize graft loss due to rejection, CsA levels should be maintained at greater than 175 micrograms/L in the early posttransplant period, and late and recurrent episodes of acute rejection should be prevented.

20.
The long-term graft function after withdrawal of steroids from maintenance immunosuppression was analyzed in 98 kidney recipients (59 on cyclosporin monotherapy, 39 on cyclosporin plus azathioprine) who had not developed an early rejection episode when prednisolone was discontinued. Seven years after steroid withdrawal the probability of an increase in serum creatinine (> 20% of baseline levels) was 51%. The increase in creatinine was associated with sclerosing arteriopathy as a marker of chronic rejection in 29 of 43 graft biopsies. The addition of azathioprine had no effect on the stability of long-term graft function and did not influence the 7-year graft survival rate in this highly selected group of patients.
