Similar Articles
20 similar articles found.
1.
PURPOSE: To determine if an intensive preparative regimen of busulfan (BU), cyclophosphamide (CY), and total-body irradiation (TBI) could improve outcome after marrow transplantation for advanced morphology myelodysplasia (refractory anemia with excess blasts [RAEB], RAEB in transformation [RAEB-T], and chronic myelomonocytic leukemia [CMML]) compared with that obtained with conventional CY/TBI and to analyze prognostic factors for transplantation for myelodysplasia. PATIENTS AND METHODS: A phase II study was conducted of 31 patients (median age, 41 years) treated with BU (7 mg/kg), CY (50 mg/kg), TBI (12 Gy), and human leukocyte antigen (HLA)-matched (n = 23) or -mismatched (n = 2) related or unrelated donor (n = 6) marrow transplantation. Results were compared with 44 historical control patients treated with CY (120 mg/kg) and TBI. RESULTS: The 3-year actuarial disease-free survival (DFS) rate was similar for the BU/CY/TBI group and the CY/TBI group (23% v 30%, P = .6), but there were trends toward lower relapse rates (28% v 54%, P = .27) and higher nonrelapse mortality rates (68% v 36%, P = .12) among the current patients compared with historical controls. Multivariate analysis showed that a normal karyotype pretransplant and the use of methotrexate as part of posttransplant immunosuppression were associated with improved survival and reduced nonrelapse mortality. Univariate analysis showed significant differences in relapse rates based on marrow source (57% for HLA genotypically matched marrow v 18% for all others, P = .04) and on disease morphology (66% for RAEB-T v 38% for RAEB and CMML, P = .05). CONCLUSION: Patients with advanced morphology myelodysplasia tolerated the intensified BU/CY/TBI preparative regimen and reduced posttransplant immunosuppression poorly. Novel transplant procedures are needed to reduce relapse rates without increasing nonrelapse mortality rates. 
In addition, transplantation before progression to RAEB-T, if possible, may reduce the risk of relapse.

2.
OBJECTIVE: The authors determined the long-term outcome of patients undergoing hepatic retransplantation at their institution. Donor, operative, and recipient factors impacting on outcome as well as parameters of patient resource utilization were examined. SUMMARY BACKGROUND DATA: Hepatic retransplantation provides the only available option for liver transplant recipients in whom an existing graft has failed. However, such patients are known to exhibit patient and graft survival after retransplantation that is inferior to that expected using the same organs in naive recipients. The critical shortage of donor organs and resultant prolonged patient waiting periods before transplantation prompted the authors to evaluate the results of a liberal policy of retransplantation and to examine the factors contributing to the inferior outcome observed in retransplanted patients. METHODS: A total of 2053 liver transplants were performed at the UCLA Medical Center during a 13-year period from February 1, 1984, to October 1, 1996. A total of 356 retransplants were performed in 299 patients (retransplant rate = 17%). Multivariate regression analysis was performed to identify variables associated with survival. Additionally, a case-control comparison was performed between the last 150 retransplanted patients and 150 primarily transplanted patients who were matched for age and United Network of Organ Sharing (UNOS) status. Differences between these groups in donor, operative, and recipient variables were studied for their correlation with patient survival. Days of hospital and intensive care unit stay, and hospital charges incurred during the transplant admissions were compared for retransplanted patients and control patients. RESULTS: Survival of retransplanted patients at 1, 5, and 10 years was 62%, 47%, and 45%, respectively.
This survival is significantly less than that seen in patients undergoing primary hepatic transplantation at the authors' center during the same period (83%, 74%, and 68%). A number of variables proved to have a significant impact on outcome including recipient age group, interval to retransplantation, total number of grafts, and recipient UNOS status. Recipient primary diagnosis, cause for retransplantation, and whether the patient was retransplanted before or after June 1, 1992, did not reach statistical significance as factors influencing survival. In the case-control comparison, the authors found that of the more than 25 variables studied, only preoperative ventilator status showed both a significant difference between control patients and retransplanted patients and also was a factor predictive of survival in retransplanted patients. Retransplant patients had significantly longer hospital and intensive care unit stays and accumulated total hospitalization charges more than 170% of those incurred by control patients. CONCLUSIONS: Hepatic retransplantation, although life-saving in almost 50% of patients with a failing liver allograft, is costly and uses scarce donor organs inefficiently. The data presented define patient characteristics and preoperative variables that impact patient outcome and should assist in the rational application of retransplantation.

3.
End-stage liver disease secondary to hepatitis C virus (HCV) infection is the leading indication for liver transplantation in the United States. Recurrence of HCV infection is nearly universal. We studied the patients enrolled in the National Institute of Diabetes and Digestive and Kidney Diseases Liver Transplantation Database to determine whether pretransplantation patient or donor variables could identify a subset of HCV-infected recipients with poor patient survival. Between April 15, 1990, and June 30, 1994, 166 HCV-infected and 509 HCV-negative patients underwent liver transplantation at the participating institutions. Median follow-up was 5.0 years for HCV-infected and 5.2 years for HCV-negative recipients. Pretransplantation donor and recipient characteristics, and patient and graft survival, were prospectively collected and compared. Cumulative patient survival for HCV-infected recipients was similar to that of recipients transplanted for chronic non-B-C hepatitis, or alcoholic and metabolic liver disease, better than that of patients transplanted for malignancy or hepatitis B (P = .02 and P = .003, respectively), and significantly worse than that of patients transplanted for cholestatic liver disease (P = .001). Recipients who had a pretransplantation HCV-RNA titer of > or = 1 x 10^6 vEq/mL had a cumulative 5-year survival of 57% versus 84% for those with HCV-RNA titers of < 1 x 10^6 vEq/mL (P = .0001). Patient and graft survival did not vary with recipient gender, HCV genotype, or induction immunosuppression regimen among the HCV-infected recipients. While long-term patient and graft survival following liver transplantation for end-stage liver disease secondary to HCV are generally comparable with that of most other indications, higher pretransplantation HCV-RNA titers are strongly associated with poor survival among HCV-infected recipients.
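The cumulative survival figures quoted above are Kaplan-Meier (product-limit) estimates. As an illustration only (the follow-up times and event indicators below are invented, not Database data, and tied death/censoring times are not handled), a minimal estimator can be sketched as:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up in years; events: 1 = death, 0 = censored.
    Simplified sketch: assumes no tied death/censoring times."""
    at_risk = len(times)
    surv = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e:  # a death at time t reduces the survival estimate
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1  # death or censoring removes the patient from risk
    return curve

# Hypothetical follow-up for two HCV-RNA titer groups (illustrative only)
high_titer = kaplan_meier([0.5, 1.2, 2.0, 3.1, 4.0, 5.0],
                          [1,   1,   0,   1,   0,   0])
low_titer  = kaplan_meier([1.0, 2.5, 3.0, 4.2, 5.0, 5.0],
                          [0,   1,   0,   0,   0,   0])
print(high_titer[-1][1])  # estimated survival after the last observed death
```

Censored patients contribute person-time to the denominator until they drop out, which is why the estimate differs from a crude death proportion.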

4.
STUDY OBJECTIVE: Few studies have examined predictors of quality of life and adjustment after lung transplantation. This study determined whether pretransplant psychological measures predicted physical health, quality of life, and overall adjustment posttransplant. Cross-sectional analyses also examined differences in adjustment and quality of life for lung transplant candidates and recipients. DESIGN AND PARTICIPANTS: Seventeen transplant candidates and 60 transplant recipients completed questionnaires measuring adjustment and quality of life. In addition, we examined archival data on 107 transplant candidates who had received pretransplant psychological assessments, and posttransplant physical health status data were collected on these patients. Of the 107 patients who provided a pretransplant psychological assessment, 32 completed the questionnaires measuring posttransplant adjustment and quality of life. SETTING: University medical center transplant service. RESULTS: Cross-sectional analyses indicated significantly better adjustment and quality of life posttransplant. Pretransplant psychological variables were not associated with measures of posttransplant physical health. Hierarchical multiple regression analyses found that pretransplant anxiety and psychopathology predicted posttransplant adjustment (betas ranging from 0.32 to 0.68), and greater pretransplant anxiety also predicted worse posttransplant quality of life (betas ranging from 0.29 to 0.62). Subjective sleep disturbances were associated with poorer adjustment and quality of life (betas ranging from 0.36 to 0.75), and were found to mediate the relationship between presurgical anxiety and posttransplant adjustment and quality of life. CONCLUSIONS: This study found that psychological status pretransplant predicted adjustment and quality of life posttransplant.
Moreover, increased anxiety levels pretransplant predicted subsequent subjective sleep disturbances, which were, in turn, associated with poorer adjustment and quality of life. The benefits of pretransplant stress management interventions are discussed.

5.
OBJECTIVE: The purpose of this study was to explore the value of patient self-report assessment in heart transplant candidacy evaluation, utilizing the Millon Behavioral Health Inventory (MBHI). Patients' MBHI measures were related to important pretransplant patient characteristics and posttransplant measures of health behavior, medical morbidity, and mortality. METHOD: Ninety heart patients with end-stage cardiac disease completed the MBHI during pretransplant candidacy evaluations, and also were interviewed concerning their coping effectiveness, support resources, and compliance history. Posttransplant follow-up of 61 living and 29 deceased patients included measures of survival time, postsurgical medical care, rejection and infection episodes, and nurse ratings of medication compliance and problematic interpersonal health behaviors. RESULTS: The MBHI coping scales were found to significantly discriminate good and poor pretransplant compliance, and interview judgments of good and poor coping and support resources, with modest accuracy. The MBHI also was superior to these interview judgments in predicting posttransplant survival time and medical care used. Certain scales were also positively associated with physical parameters of pretransplant and posttransplant status. CONCLUSIONS: Patient self-report with the MBHI can contribute to identification of patients at risk for a problematic outcome with transplant, by providing information pertinent to clinical decision making and outcome management analysis with this special population of cardiac patients.

6.
Despite improved preservation methods, graft dysfunction after liver transplantation continues to contribute considerably to postoperative morbidity and mortality. In clinical and experimental studies prostaglandin (PG)I2 analogs proved effective in the treatment of liver damage of different origin. Using in vivo fluorescence microscopy in a rat liver transplantation model, we studied the effect of donor bolus pretreatment with the PGI2 analog epoprostenol on hepatic graft revascularization. After epoprostenol bolus pretreatment (group 1: liver transplantation/PGI2), perfusion of liver sinusoids after reperfusion was significantly improved as compared with untreated donor livers (group 2: liver transplantation) (95.2+/-0.6% vs. 75.3+/-3.8%, mean +/- SEM; P=0.001), and sinusoidal perfusion was almost in the range of that in normal nontransplanted livers (99.4+/-0.2%). In addition, leukocyte adherence in liver lobules (21.0+/-3.5 vs. 115+/-11.5 n/lobule; P=0.001) and postsinusoidal venules (23.0+/-3.8 vs. 113+/-11.3 n/mm2 endothelial surface; P=0.002) was significantly reduced in the pretreated grafts. Bile production in the recipient was significantly increased by epoprostenol pretreatment of the donor (1.88+/-0.4 vs. 0.63+/-0.13 g/100 g liver/hr; P=0.015), indicating restored liver function. These results suggest that the prostacyclin analog epoprostenol is effective in preconditioning the graft prior to transplantation, i.e., improving preservation and increasing graft resistance to ischemia/reperfusion injury. Thus, favorable effects on early graft function after clinical liver transplantation may be achieved by introducing epoprostenol pretreatment into the harvesting procedure.

7.
BACKGROUND: The occurrence of peritonitis in peritoneal dialysis patients after renal transplantation during immunosuppression might increase morbidity and mortality. Hence the timing of catheter removal is still controversial. The associated risk factors of this complication have not been analyzed. METHODS: We analyzed, retrospectively, the incidence of peritonitis within 90 days after transplantation, its associated morbidity and mortality, as well as risk factors. From 1980 until March 1995, 238 consecutive kidney transplants in peritoneal dialysis patients were performed. Univariate and multivariate logistic regression analyses were used to identify risk factors for the development of peritonitis. RESULTS: 232 cases (141 men, 91 women) were available for analysis. In 191 patients, the catheter was removed with a mean interval after transplantation of 122 days (range 0-573). Thirty peritonitis episodes with predominantly Staphylococcus aureus (10/30) or gram-negative bacteria (12/30) were observed. Independent risk factors before transplantation were the total number of peritonitis episodes (P<10^-5), previous peritonitis with S. aureus bacteria (P<10^-5), and male sex (P<0.004). Risk factors after transplantation were technical surgical problems (P<10^-5), more than two rejection episodes (P<0.02), permanent graft nonfunction (P<0.026), and urinary leakage (P<0.035). CONCLUSIONS: Transplantation without simultaneous peritoneal catheter removal is feasible. However, this increases the risk of peritonitis after transplantation. Early catheter removal should be considered seriously in those patients at risk. When peritonitis develops, antibiotic treatment should be directed against gram-positive as well as gram-negative bacteria until culture results are available.

8.
BACKGROUND: Patients must wait increasingly longer periods on the kidney waiting list (WL) before receiving a transplant. Although patients can be maintained on dialysis, many deaths occur while waiting. To determine whether the risk of mortality on the WL is different from that related to the transplant procedure, data from the Organ Procurement and Transplantation Network and Scientific Registry were used to analyze all adult patients entered on the United Network for Organ Sharing (UNOS) kidney WL for a primary transplant between April 1, 1994, and December 31, 1994 (n=9925). METHODS: To account for the time spent on the WL before transplant, a time dependent, nonproportional hazards model was used to assess the risk of mortality after transplant for both well-matched (zero to two HLA mismatches) and poorly-matched (three to six HLA mismatches) transplants compared with the mortality risk of remaining on the WL. This model incorporated an exponential decay component to account for the transient increased risk after kidney transplantation. Patients were stratified by age, race, creatinine level, panel-reactive antibody at listing, and blood group. RESULTS: Although there was an increased risk of mortality in the initial posttransplant period, the risk of mortality at 1 year for transplanted patients was 59% (three to six mismatches) to 67% (zero to two mismatches) less than that of patients who remained on the waiting list for an additional year. CONCLUSIONS: Kidney transplantation is more beneficial than remaining on the waiting list. Even poorly-matched kidneys provided a significant reduction in the risk of mortality by 6 months as compared with the mortality risk of continuing to wait. Patients receive the maximum benefit when transplanted with well-matched kidneys.
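The exponential-decay component described here captures a transplant hazard that starts above the waiting-list hazard and falls below it within weeks. As a hedged numerical sketch (the hazard rates and decay constant below are invented, not the registry estimates), the resulting 1-year survival can be obtained by integrating each hazard:

```python
import math

def survival(hazard, t_end, steps=10_000):
    """S(t_end) = exp(-integral of hazard), via the trapezoidal rule."""
    dt = t_end / steps
    cum = 0.0
    for i in range(steps):
        t0, t1 = i * dt, (i + 1) * dt
        cum += 0.5 * (hazard(t0) + hazard(t1)) * dt
    return math.exp(-cum)

# Illustrative rates (deaths per patient-year), NOT registry estimates
WL_RATE = 0.16                      # constant waiting-list hazard

def tx_hazard(t):
    """Transplant hazard: early operative spike decaying within weeks."""
    return 0.025 + 0.50 * math.exp(-12.0 * t)

s_wl = survival(lambda t: WL_RATE, 1.0)  # 1-year survival on the list
s_tx = survival(tx_hazard, 1.0)          # 1-year survival after transplant
print(s_wl, s_tx)
```

With these made-up rates the transplant hazard exceeds the waiting-list hazard immediately after surgery, yet cumulative 1-year survival still favors transplantation, which is the qualitative pattern the study reports.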

9.
This study examined the prevalence and predictors of posttraumatic stress disorder (PTSD) symptoms in 70 men and women treated with bone marrow transplantation for cancer. Findings indicated that the number of symptoms present ranged from 0 to a possible high of 17 (M = 3.0, SD = 3.9). As predicted, lower social support and higher avoidance coping 1 month pretransplant predicted greater PTSD symptom severity an average of 7 months posttransplant. These variables remained significant predictors of symptom severity even after accounting for pretransplant levels of psychological distress. Additional analyses indicated the presence of a significant interaction between social support and avoidance coping, with patients high in avoidance coping and low in social support reporting the most severe symptoms. These findings identify patients at risk for psychological disturbance posttransplant and can serve to guide future intervention efforts.

10.
BACKGROUND: To reduce the mortality rate associated with liver transplantation, it is important to identify the risk factors for increased mortality among liver transplant recipients. It has been suggested that cytomegalovirus (CMV) infection is one such risk factor, but no studies have examined mortality rates associated with the CMV serologic status of the donor and recipient by using multivariate techniques. OBJECTIVE: To study the effect of CMV on 1-year mortality rates in orthotopic liver transplant recipients. DESIGN: Intention-to-treat analysis of a cohort. PATIENTS: 146 liver transplant recipients who were enrolled in a multicenter, randomized, placebo-controlled intervention trial. SETTING: Four university-affiliated transplantation centers. RESULTS: 1-year mortality rates for the four strata of donor and recipient CMV serologic status before transplantation were as follows: seronegative donor and recipient, 11%; seronegative donor and seropositive recipient, 22%; seropositive donor and recipient, 30%; and seropositive donor and seronegative recipient, 44% (P = 0.0091). Multivariate analysis using a time-dependent Cox proportional hazards model showed that retransplantation (relative risk, 4.6 [95% CI, 1.9 to 10.7]; P < 0.001); total number of units of blood products administered during transplantation (relative risk, 1.006 per unit [CI, 1.003 to 1.010]; P < 0.001); and presence of CMV disease (relative risk, 3.9 [CI, 1.8 to 8.5]; P < 0.001), invasive fungal disease (relative risk, 3.3 [CI, 1.5 to 7.1]; P = 0.0020), and bacteremia (relative risk, 2.5 [CI, 1.2 to 5.2]; P = 0.0136) were independently associated with higher mortality rates. If post-transplantation variables that were highly correlated with donor and recipient CMV serologic status were restricted from the model, donor and recipient CMV serologic status was the only pretransplantation variable independently associated with higher mortality rates (P = 0.002). 
CONCLUSION: Donor and recipient CMV serologic status is a significant pretransplantation determinant for death in liver transplant recipients.
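The relative risks above come from an adjusted Cox model; a crude relative risk from a 2x2 table will not reproduce them, but it illustrates how such estimates and their confidence intervals are built. A sketch using the Katz log method, with invented counts (not the trial's data):

```python
import math

def relative_risk(a, b, c, d, z=1.96):
    """RR and 95% CI for exposed (a deaths / b survivors) versus
    unexposed (c deaths / d survivors), Katz log method."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))  # SE of log(RR)
    lo, hi = rr * math.exp(-z * se), rr * math.exp(z * se)
    return rr, lo, hi

# Illustrative counts: deaths/survivors with vs. without CMV disease
rr, lo, hi = relative_risk(12, 18, 10, 106)
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```

The interval is symmetric on the log scale, which is why the upper bound sits further from the point estimate than the lower bound.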

11.
BACKGROUND: The purpose of this study was to determine the effects of early postoperative tube feeding on outcomes of liver transplant recipients. METHODS: Fifty transplant patients were randomized prospectively to receive enteral formula via nasointestinal feeding tubes (tube-feeding [TF] group) or maintenance i.v. fluid until oral diets were initiated (control group). Thirty-one patients completed the study. Resting energy expenditure, nitrogen balance, and grip strength were measured on days 2, 4, 7, and 12 after liver transplantation. Calorie and protein intakes were calculated for 12 days posttransplant. RESULTS: Tube feeding was tolerated in the TF group (n = 14). The TF patients had greater cumulative 12-day nutrient intakes (22,464 +/- 3554 kcal, 927 +/- 122 g protein) than did the control patients (15,474 +/- 5265 kcal, 637 +/- 248 g protein) (p < .002). Nitrogen balance was better in the TF group on posttransplant day 4 than in the control group (p < .03). There was a rise in the overall mean resting energy expenditure in the first two posttransplant weeks from 1487 +/- 338 to 1990 +/- 367 kcal (p = .0002). Viral infections occurred in 17.7% of control patients compared with 0% of TF patients (p = .05). Although other infections tended to occur more frequently in the control group vs the TF group (bacterial, 29.4% vs 14.3%; overall infections, 47.1% vs 21.4%), these differences were not statistically significant. Early posttransplant tube feeding did not influence hospitalization costs, hours on the ventilator, lengths of stay in the intensive care unit and hospital, rehospitalizations, or rejection during the first 21 posttransplant days. CONCLUSIONS: Early posttransplant tube feeding was tolerated and promoted improvements in some outcomes and should be considered for all liver transplant patients.
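Nitrogen balance in studies like this is typically estimated from protein intake and urinary urea nitrogen. The sketch below uses the textbook conventions (6.25 g protein per g nitrogen; roughly 4 g/day allowance for stool and skin losses); the daily protein figures are derived from the cumulative intakes above (927 g and 637 g over 12 days), while the urea nitrogen value is invented:

```python
def nitrogen_balance(protein_g_per_day, uun_g_per_day, insensible=4.0):
    """Nitrogen balance (g N/day): nitrogen in, minus urinary urea
    nitrogen plus an allowance for insensible (stool/skin) losses."""
    nitrogen_in = protein_g_per_day / 6.25  # ~16% of protein mass is N
    return nitrogen_in - (uun_g_per_day + insensible)

# ~77 g protein/day (TF group: 927 g / 12 d) vs. ~53 g/day (control:
# 637 g / 12 d), with a hypothetical UUN of 7 g/day for both
print(nitrogen_balance(77, 7))  # positive balance: anabolic
print(nitrogen_balance(53, 7))  # negative balance: net protein loss
```

At equal urea excretion, the intake gap alone is enough to flip the sign of the balance, which is consistent with the better day-4 balance reported for the TF group.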

12.
Antineutrophil cytoplasmic antibody-associated systemic vasculitis (AASV) frequently leads to end-stage renal disease (ESRD). Potentially fatal disease activity can continue after the onset of ESRD in both dialysis and transplant patients, despite the immunosuppressive effects of uremia and rejection prophylaxis, leading to concerns that such patients have greater morbidity and mortality. To assess the outcome of AASV patients receiving renal replacement therapy, a retrospective analysis of 59 patients from our unit who received chronic dialysis, renal transplantation, or both, was performed. The survival of AASV patients with ESRD was comparable to national registry controls, as were both graft and patient survival after renal transplantation. There is no evidence that standard immunosuppressive protocols should be altered for AASV patients receiving renal transplants. The rate of relapse of vasculitis for patients on chronic dialysis and after transplantation was 0.09 and 0.02 per patient per year, respectively. These rates are lower than those of other series and support the contention that continued immunosuppression after ESRD, as practiced in our unit, is warranted. Relapses usually responded to cyclophosphamide and high-dose prednisolone treatment. Significantly, vasculitic flare-ups in dialysis patients were sometimes initially misdiagnosed as dialysis complications, leading to fatal delays in effective treatment. Follow-up by physicians experienced in the diagnosis and treatment of vasculitis activity should continue in these patients.
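Relapse rates quoted "per patient per year" are person-time rates: events divided by the total years of follow-up contributed by all patients. A minimal sketch (the individual follow-up durations are invented; only the resulting ~0.09 rate mirrors the dialysis figure above):

```python
def rate_per_patient_year(events, followup_years):
    """Person-time rate: events divided by total years at risk."""
    return events / sum(followup_years)

# Hypothetical follow-up: 10 dialysis patients contributing 44.4
# patient-years in total, with 4 relapses observed among them
dialysis_fu = [2.0, 3.5, 5.0, 1.2, 6.3, 4.0, 7.5, 2.5, 8.0, 4.4]
print(rate_per_patient_year(4, dialysis_fu))  # ~0.09 per patient-year
```

Because each patient contributes unequal time at risk, this rate is the right comparison across series with different follow-up, unlike a simple proportion of patients who relapsed.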

13.
Neonatal intensive care unit survival rates have improved significantly over the past decade. This improvement primarily reflects declining mortality rates among preterm infants. Neurologic morbidity increases with prematurity and is the major predictor of long-term disability. Accordingly, concern has been expressed that the burden of neurologic dysfunction among contemporary neonatal intensive care unit survivors may be increasing. To define the trends of neurologic disorders in the contemporary neonatal intensive care unit, all 4164 admissions between 1986 and 1995 to a tertiary neonatal intensive care unit were examined. Neonatal intensive care unit admissions (413 +/- 49 per year), proportion of births at less than 37 weeks (70 +/- 3% per year), and referral patterns were stable between 1986 and 1995. Over the study period, 773 (18%) of 4164 neonatal intensive care unit infants had a total of 1062 neurologic disorders. The neonatal intensive care unit mortality rate declined from 12% in 1986 to 4.2% in 1995 (P < .01). Neurologic disorders declined, from 27% of infants born in 1986 to 12% in 1995 (P < .001): 356 had seizures (14% in 1986 to 4% in 1995; P < .001), 235 had hypoxic-ischemic encephalopathy (8% in 1986 to 4% in 1995, P < .01), and 167 had intraventricular hemorrhage (7% in 1986 to 1.4% in 1995, P < .005). Frequency of congenital or chromosomal aberration affecting the nervous system was relatively constant (4.5% per year). Despite a three-fold improvement in neonatal intensive care unit survival between 1986 and 1995, the frequency of perinatally acquired neurologic disorders declined by more than 50%.

14.
BACKGROUND: Transplantation of lung allografts from the same donor into 2 recipients ("twinning") provides an opportunity to study recipient and donor factors that influence early allograft function. METHODS: Twenty-seven pairs of recipients were identified and evaluated using multivariate logistic regression analysis (p < 0.05). Four measures of early graft function were analyzed: alveolar-arterial gradient in the operating room, first alveolar-arterial gradient in the intensive care unit, alveolar-arterial gradient at 24 hours, and days of mechanical ventilation. RESULTS: Analysis of the pooled data without regard to pairing showed that alveolar-arterial gradient in the operating room was influenced by donor age, length of donor hospitalization, recipient mean pulmonary artery (PA) pressure at unclamping, and transplantation of a left lung. The alveolar-arterial gradient in the intensive care unit was correlated with donor age, donor cause of death, and mean PA pressure on arrival in that unit. Only mean PA pressure remained significant at 24 hours. Days of mechanical ventilation was determined by mean PA pressure on arrival in the intensive care unit, drop in mean PA pressure during operation, and diagnosis of the recipient. In the paired analysis, receiving a left lung, recipient diagnosis (pulmonary hypertension worse than others), and need of cardiopulmonary bypass were significantly associated with immediate graft dysfunction, although these influences did not persist beyond the immediate postoperative period. Donor arterial oxygen tension and time of ischemia were not significant predictors in any analysis. CONCLUSIONS: Immediate allograft function was associated with donor-related characteristics initially, but these lost importance over the ensuing 24 hours. Recipient PA pressure was an immediate and persisting influence. 
In the analysis of differences in function between the members of each pair, transplantation of the left lung, recipient diagnosis, and cardiopulmonary bypass were identified, but their influence did not persist beyond the first 6 hours.

15.
S Bhagwanjee, DJ Muckart, PM Jeena, P Moodley. BMJ 1997;314(7087):1077-81; discussion 1081-4
OBJECTIVES: (a) To assess the impact of HIV status (HIV negative, HIV positive, AIDS) on the outcome of patients admitted to intensive care units for diseases unrelated to HIV; (b) to decide whether a positive test result for HIV should be a criterion for excluding patients from intensive care for diseases unrelated to HIV. DESIGN: A prospective double blind study of all admissions over six months. HIV status was determined in all patients by enzyme linked immunosorbent assay (ELISA), immunofluorescence assay, western blotting, and flow cytometry. The ethics committee considered the clinical implications of the study important enough to waive patients' right to informed consent. Staff and patients were blinded to HIV results. On discharge patients could be advised of their HIV status if they wished. SETTING: A 16 bed surgical intensive care unit. SUBJECTS: All 267 men and 135 women admitted to the unit during the study period. INTERVENTIONS: None. MAIN OUTCOME MEASURES: APACHE II score (acute physiological, age, and chronic health evaluation), organ failure, septic shock, durations of intensive care unit and hospital stay, and intensive care unit and hospital mortality. RESULTS: No patient had AIDS. 52 patients tested positive for HIV and 350 tested negative. The two groups were similar in sex distribution but differed significantly in age, incidence of organ failure (37 (71%) v 171 (49%) patients), and incidence of septic shock (20 (38%) v 54 (15%)). After adjustment for age there were no differences in intensive care unit or hospital mortality or in the durations of stay in the intensive care unit or hospital. CONCLUSIONS: Morbidity was higher in HIV positive patients but there was no difference in mortality. In this patient population a positive HIV test result should not be a criterion for excluding a patient from intensive care.

16.
We evaluated the impact of concomitant infection with Hepatitis B virus (HBV) and Hepatitis C virus (HCV) on the clinical course after renal transplantation (Tx). In 335 patients (pts) transplanted between 1991 and 1993 we found 30 (9%) recipients who were positive for Hepatitis B surface antigen (HBsAg) (ELISA, Organon) and anti-HCV antibodies (immunoblot assay Lia Tek) preTx. Chronic liver disease (CLD) (two-fold or greater increase in serum ALT and AST levels for at least six months) developed in 40.7% of coinfected pts as compared to 24.4% and 25.7% of pts infected only with HCV or HBV, respectively. Maintenance immunosuppression consisted of P + Aza + CsA; mean follow-up time was 28 +/- 15 months. The mean time of the onset of CLD was 3.0 months (range: 1-18 months) after Tx. Percutaneous liver biopsy performed in 5 CLD pts revealed chronic active hepatitis (CAH) in 4 and chronic persistent hepatitis (CPH) in 1 pt. Four pts who had CAH and were positive for HCV RNA (RT PCR) in serum and for HBcAg in liver tissue received interferon-alpha therapy for 6 months. Clinical improvement of liver function was observed in all of them, but none cleared HBsAg or HCV RNA. One pt lost his graft due to acute rejection. Concomitant infection with HBV and HCV is associated with a high risk of developing CLD early after Tx. We recommend that pretransplant evaluation of both anti-HCV and HBsAg positive pts should include liver biopsy to exclude potential recipients with CAH.

17.
BACKGROUND: Dialysis can be life-saving for patients with end-stage renal failure. However, not only is it associated with significant morbidity and a greater mortality than transplantation, but it is also expensive. Therefore renal transplantation is generally regarded as the treatment of choice for patients in whom this form of renal replacement therapy is appropriate. Transplantation usually takes place after a variable period of dialytic therapy, but pre-emptive kidney transplantation (PKT) has established itself as an attractive alternative. MATERIALS AND METHODS: 1463 consecutive first kidney transplants performed between January 1980 and December 1995 in a single centre were analysed. The 161 patients (11%) transplanted without prior dialysis were compared with the 1302 patients who had been dialysed prior to being transplanted. The pre-emptive group did not differ from the dialysis group in respect of donor age, donor and recipient gender, HLA mismatch, or cold ischaemic time, although there were more live donor transplants within the pre-emptive group. RESULTS: Delayed graft function occurred more frequently in the dialysis group (25% vs 16%), but more patients experienced an acute rejection episode in the pre-emptive group (67% vs 55%). The actuarial graft survival in the pre-emptive group at 1, 5, and 10 years (84%, 76%, and 67%) was significantly higher than the respective values in the dialysis group (83%, 69%, and 56%). Within the live donor recipient cohort the survival advantage for the pre-emptive group was even more striking. CONCLUSION: Pre-emptive kidney transplantation not only avoids the risks, cost, and inconvenience of dialysis, but is also associated with better graft survival than transplantation after a period of dialysis, particularly within the live donor cohort.

18.
Between March 1984 and August 1994, 13 orthotopic liver transplantations were performed in 13 patients < or = 25 years of age. The indications included Wilson's disease (n = 7), biliary atresia (n = 4), choledochal cyst (n = 1) and hepatitis C cirrhosis (n = 1). Technical variants included full-size (n = 11), reduced-size (n = 1) and living-related (n = 1) liver transplantation. These recent technical innovations have offered an expanded donor pool for earlier transplantation, shorter waiting times and excellent quality grafts. Surgical complications occurred in six patients; all required additional surgery. Biliary complications were encountered more commonly in our earlier patients. Our actuarial patient and graft survival rate is 92% at 2 years. The long-term follow-up of our liver-transplanted Wilson's disease patients provides confirmatory evidence that orthotopic liver transplantation cures the underlying metabolic defect with complete normalization of biochemical abnormalities of copper metabolism, reversal of neurological impairments and the disappearance of Kayser-Fleischer corneal rings. The high rate of patient survival and excellent rehabilitation indicate that with prudent clinical judgement, liver transplantation can be achieved with an acceptable rate of morbidity, mortality and cost in a setting where manpower and donor organs are very limited.

19.
OBJECTIVES: To identify ICU-specific predictors of mortality. DESIGN: An inception cohort study. SETTING: Barnes Hospital, an academic tertiary care center. PATIENTS: Consecutive patients requiring mechanical ventilation, admitted to the medical intensive care unit (ICU) (75 patients), surgical ICU (100 patients), and cardiothoracic ICU (102 patients). INTERVENTIONS: Prospective data collection and outcomes evaluation. MEASUREMENTS AND MAIN RESULTS: Stepwise logistic regression analysis identified the following variables as independent predictors of mortality for the individual ICUs: medical ICU, an Organ System Failure Index (OSFI) greater than or equal to 3; surgical ICU, OSFI greater than or equal to 3; cardiothoracic ICU, OSFI greater than or equal to 3, requiring acute dialysis, and the occurrence of an iatrogenic event. The same analysis was repeated after removing the OSFI as a potential confounding variable. Independent predictors of mortality identified in this subsequent analysis were as follows: medical ICU, occurrence of renal failure; surgical ICU, supine head positioning, acute physiology score greater than or equal to 10, and preadmission lifestyle score greater than or equal to 2; cardiothoracic ICU, requiring acute dialysis, occurrence of ventilator-associated pneumonia, and the occurrence of an iatrogenic event. CONCLUSIONS: We identified ICU-specific predictors of mortality among the three ICUs examined. These data suggest that ICU-specific interventions could be developed to improve the quality of patient care and potentially to reduce patient mortality.

20.
BACKGROUND: Recipient hepatitis C virus (HCV) seropositivity has been associated with inferior outcomes in renal transplantation (RTx). We sought to determine whether donor HCV+ status influenced the incidence of rejection, liver dysfunction, and graft survival in HCV+ recipients. METHODS: We reviewed 44 HCV+ recipients (R+) receiving RTx from HCV+ (D+) and HCV- (D-) donors between February 1991 and September 1996. All patients were followed to the end of the study period (mean = 36 months, range = 12-60 months). We compared the R+ group with a demographically matched cohort of 44 HCV- recipients (R-). RESULTS: Of the 44 R+, 25 (57%) had a total of 48 rejection episodes. Among the 44 R-, 32 (73%) had 58 rejection episodes (P>0.1). Within the R+ group, 28 were D+/R+; of these, 14 (50%) had 27 rejection episodes, whereas among the 16 D-/R+, 11 (68%) had 21 rejection episodes (P>0.3). Graft and patient survival was similar in both groups (86.4% and 91%, respectively). Liver dysfunction was slightly increased in the R+ group (4/44 vs. 0/44, P>0.1), with one death due to liver failure in this group. CONCLUSION: Donor HCV+ status had no influence on outcomes in HCV+ recipients after kidney transplantation in the short term. The incidence of rejection, graft loss, and mortality was comparable between the D+/R+ and D-/R+ groups. Furthermore, rejection, graft loss, and death were identical in the R+ and R- groups throughout the 5-year study period. We therefore conclude that HCV+ recipients can safely receive kidney transplants without concern about donor HCV status or fear of adverse events from their own HCV+ status.
