Similar Documents (20 results)
1.
OBJECTIVE: The authors determined the long-term outcome of patients undergoing hepatic retransplantation at their institution. Donor, operative, and recipient factors impacting outcome, as well as parameters of patient resource utilization, were examined. SUMMARY BACKGROUND DATA: Hepatic retransplantation provides the only available option for liver transplant recipients in whom an existing graft has failed. However, such patients are known to exhibit patient and graft survival after retransplantation that is inferior to that expected using the same organs in naive recipients. The critical shortage of donor organs and resultant prolonged patient waiting periods before transplantation prompted the authors to evaluate the results of a liberal policy of retransplantation and to examine the factors contributing to the inferior outcome observed in retransplanted patients. METHODS: A total of 2053 liver transplants were performed at the UCLA Medical Center during a 13-year period from February 1, 1984, to October 1, 1996. A total of 356 retransplants were performed in 299 patients (retransplant rate = 17%). Multivariate regression analysis was performed to identify variables associated with survival. Additionally, a case-control comparison was performed between the last 150 retransplanted patients and 150 primarily transplanted patients who were matched for age and United Network for Organ Sharing (UNOS) status. Differences between these groups in donor, operative, and recipient variables were studied for their correlation with patient survival. Days of hospital and intensive care unit stay, and hospital charges incurred during the transplant admissions, were compared for retransplanted patients and control patients. RESULTS: Survival of retransplanted patients at 1, 5, and 10 years was 62%, 47%, and 45%, respectively.
This survival is significantly less than that seen in patients undergoing primary hepatic transplantation at the authors' center during the same period (83%, 74%, and 68%). A number of variables proved to have a significant impact on outcome, including recipient age group, interval to retransplantation, total number of grafts, and recipient UNOS status. Recipient primary diagnosis, cause for retransplantation, and whether the patient was retransplanted before or after June 1, 1992, did not reach statistical significance as factors influencing survival. In the case-control comparison, the authors found that of the more than 25 variables studied, only preoperative ventilator status both differed significantly between control patients and retransplanted patients and was predictive of survival in retransplanted patients. Retransplanted patients had significantly longer hospital and intensive care unit stays and accumulated total hospitalization charges more than 170% of those incurred by control patients. CONCLUSIONS: Hepatic retransplantation, although life-saving in almost 50% of patients with a failing liver allograft, is costly and uses scarce donor organs inefficiently. The data presented define patient characteristics and preoperative variables that impact patient outcome and should assist in the rational application of retransplantation.

2.
The objective of the current study was twofold: (a) to determine whether subgroups of breast cancer patients could be identified on the basis of their distinct trajectory or pattern of fatigue following treatment for early stage cancer using growth mixture modeling and (b) to examine whether the subgroups could be distinguished on the basis of a cognitive-behavioral model. Growth mixture modeling and a prospective longitudinal design were used to examine the course of fatigue after treatment for early stage breast cancer. Women (n = 261; mean age = 55.2 years) provided fatigue ratings for 6 months following treatment. A low-fatigue group (n = 85) and a high-fatigue group (n = 176) were extracted. Women who were not married, had a lower income, had a higher body mass index, engaged in greater fatigue catastrophizing, and were lower in exercise participation were more likely to be in the high-fatigue group. Only body mass index and catastrophizing remained significant predictors in multivariate analysis. Findings suggest considerable heterogeneity in the experience of fatigue following treatment and support the utility of a cognitive-behavioral model in predicting the course of posttreatment fatigue. (PsycINFO Database Record (c) 2010 APA, all rights reserved)

3.
Merely asking this question, as posed by the Editor-in-Chief, implies that it cannot easily be answered with a simple yes or no. In fact, it is closely related to the question: Is there a future for clinical PET? When looking at the worldwide PET radiopharmaceutical production on the basis of a recent IAEA directory [1], it is disappointing to see that, besides FDG and to a certain extent FDOPA, fluorine-18 radiopharmaceuticals are not widely used. Other 18F products, such as fluoride, spiperone derivatives, FMISO, fluoroaltanserin, fluorofatty acids and fluoroamino acids, are used by only a few centres. The emphasis of this editorial is on clinical radiopharmaceuticals, and in this respect the impact of 18F thus far is extremely small if one excludes FDG. The reasons are discussed in this editorial.

4.
5.
6.
A mathematical model has been developed which, by means of finite difference computation techniques, permits the prediction of carbon concentration profiles in carburized high temperature alloys. It is assumed that a proportion of the carbon which diffuses into the alloy reacts with elements such as chromium to form carbide precipitates. The amount of carbon remaining in solution is determined from the solubility product of the carbide. Only this carbon in solution is able to diffuse through the alloy matrix, and thus the carbide precipitation process reduces the rate of carburization. Applying the model, the diffusion coefficient of carbon in Alloy 800 H at 900 °C has been determined as (3.3 ± 0.5) × 10−8 cm2/s. The model can also treat the carburization of an alloy containing two carbide-forming elements, but application to alloys containing both chromium and niobium (columbium) was successful only to a limited extent, probably as a result of the slow, complex kinetics of carbide precipitation. The model can be used to adapt carbon concentration profiles from one geometrical configuration to another. On the basis of profiles determined experimentally on small, cylindrical test specimens, carbon concentration profiles have been predicted for thick section tubes of Alloy 800 H exposed to a carburizing environment for up to 100,000 h. Formerly of the Institute of Reactor Materials, Nuclear Research Centre (KFA), Jülich
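The diffusion-with-precipitation scheme this abstract describes can be sketched numerically. The fragment below is a minimal illustration, not the paper's implementation: it solves Fick's second law by an explicit finite-difference scheme and, in place of a full solubility-product calculation, simply clips the dissolved carbon at an assumed solubility limit `c_sol`, moving the excess into an immobile "carbide" fraction. Only the diffusion coefficient (about 3.3 × 10⁻⁸ cm²/s for Alloy 800 H at 900 °C) comes from the abstract; `c_sol`, `c_surface`, and the geometry are made-up values.

```python
import numpy as np

# Illustrative values; only D is quoted in the abstract.
D = 3.3e-8          # cm^2/s, diffusivity of carbon in solution
c_sol = 0.05        # wt.% carbon soluble in the matrix (assumed)
c_surface = 1.0     # wt.% total carbon at the carburized surface (assumed)
L, nx = 0.2, 101    # slab depth 0.2 cm, number of grid points
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D            # explicit scheme stable for dt <= 0.5*dx^2/D

c_free = np.zeros(nx)           # carbon in solution (mobile)
c_trap = np.zeros(nx)           # carbon bound in carbides (immobile)
c_free[0] = min(c_surface, c_sol)
c_trap[0] = max(c_surface - c_sol, 0.0)

t, t_end = 0.0, 100 * 3600.0    # simulate 100 h of exposure
while t < t_end:
    # diffuse only the dissolved carbon through the interior nodes
    lap = (c_free[2:] - 2 * c_free[1:-1] + c_free[:-2]) / dx**2
    c_free[1:-1] += D * dt * lap
    # precipitate any carbon above the solubility limit
    excess = np.maximum(c_free - c_sol, 0.0)
    c_trap += excess
    c_free -= excess
    c_free[0] = min(c_surface, c_sol)   # fixed surface concentration
    t += dt

total = c_free + c_trap         # total carbon profile vs. depth
```

Because the precipitated fraction no longer diffuses, the predicted carburization front advances more slowly than for free diffusion, which is the effect the abstract attributes to carbide precipitation.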

7.
Previous models of quench sensitivity of age-hardening alloys have been extended to include loss of toughness as well as loss of yield strength upon postquench aging. Loss of toughness on slow quenching was modeled by the loss of solute to grain-boundary precipitates that promote intergranular fracture. The phenomena are modeled using differential equations, and the model includes temperature-dependent values of the minimum toughness and strength expected after extended isothermal hold times. Time-temperature-property (TTP) curves for the postaging yield strength and toughness were used to provide empirical kinetic and property data for fitting the proposed relationship. The model was tested against experimental data, both nominally isothermal and truly continuous cooling, for an Al-Cu-Li alloy plate. For nominally isothermal cooling, the model proved capable of accurately describing the loss of toughness and the loss of strength, out to a much larger loss in strength than previous models. The model also successfully predicted the loss of strength on continuous cooling but provided a conservative overestimate of the loss of toughness under the same continuous-cooling conditions. It is suggested that this bias arises from the lack of consideration of differences in the microstructure of the precipitates formed during isothermal treatments and those formed during continuous cooling.
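The TTP-curve approach that this class of models builds on can be illustrated with classical quench-factor analysis (Evancho-Staley), in which incremental "damage" dt/C(T) is accumulated along the cooling path and mapped to a retained property fraction. The sketch below is generic, not the extended toughness model of the paper, and all kinetic constants k1-k5 are illustrative placeholders rather than fitted values for the Al-Cu-Li alloy studied.

```python
import math

R = 8.314  # J/(mol K)

def c_t(T, k1, k2, k3, k4, k5):
    """Critical time (s) to a fixed property loss at temperature T (K):
    one iso-property contour of the TTP C-curve (Evancho-Staley form)."""
    return (-k1 * k2
            * math.exp(k3 * k4**2 / (R * T * (k4 - T)**2))
            * math.exp(k5 / (R * T)))

def quench_factor(temps, dt, **k):
    """Accumulate incremental damage dt / C(T) along a cooling path,
    given as a sequence of temperatures sampled every dt seconds."""
    return sum(dt / c_t(T, **k) for T in temps)

def property_fraction(tau, k1):
    """Fraction of the maximum attainable property (e.g. yield strength)
    retained after a quench with quench factor tau (k1 = ln of the
    property fraction defining the TTP contour, so k1 < 0)."""
    return math.exp(k1 * tau)

# Illustrative constants and a linear cooling path (700 K -> 445 K at 50 K/s)
k = dict(k1=math.log(0.995), k2=1e-10, k3=1000.0, k4=750.0, k5=1e5)
temps = [700.0 - 5 * i for i in range(52)]
tau = quench_factor(temps, dt=0.1, **k)
retained = property_fraction(tau, k["k1"])
```

The paper's contribution, per the abstract, is to extend this strength-only formulation to a coupled description of toughness loss via solute depletion at grain boundaries; the mechanics of accumulating damage along a TTP contour are the same.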

8.
This paper investigates the social and economic circumstances of childhood that predict the probability of survival to age 85 among African-Americans. It uses a unique study design in which survivors are linked to their records in U.S. Censuses of 1900 and 1910. A control group of age and race-matched children is drawn from Public Use Samples for these censuses. It concludes that the factors most predictive of survival are farm background, having literate parents, and living in a two-parent household. Results support the interpretation that death risks are positively correlated over the life cycle.

9.
10.
End-stage liver disease secondary to hepatitis C virus (HCV) infection is the leading indication for liver transplantation in the United States. Recurrence of HCV infection is nearly universal. We studied the patients enrolled in the National Institute of Diabetes and Digestive and Kidney Diseases Liver Transplantation Database to determine whether pretransplantation patient or donor variables could identify a subset of HCV-infected recipients with poor patient survival. Between April 15, 1990, and June 30, 1994, 166 HCV-infected and 509 HCV-negative patients underwent liver transplantation at the participating institutions. Median follow-up was 5.0 years for HCV-infected and 5.2 years for HCV-negative recipients. Pretransplantation donor and recipient characteristics, and patient and graft survival, were prospectively collected and compared. Cumulative patient survival for HCV-infected recipients was similar to that of recipients transplanted for chronic non-B-C hepatitis, or alcoholic and metabolic liver disease, better than that of patients transplanted for malignancy or hepatitis B (P = .02 and P = .003, respectively), and significantly worse than that of patients transplanted for cholestatic liver disease (P = .001). Recipients who had a pretransplantation HCV-RNA titer of ≥ 1 × 10^6 vEq/mL had a cumulative 5-year survival of 57% versus 84% for those with HCV-RNA titers of < 1 × 10^6 vEq/mL (P = .0001). Patient and graft survival did not vary with recipient gender, HCV genotype, or induction immunosuppression regimen among the HCV-infected recipients. While long-term patient and graft survival following liver transplantation for end-stage liver disease secondary to HCV are generally comparable with that of most other indications, higher pretransplantation HCV-RNA titers are strongly associated with poor survival among HCV-infected recipients.

11.
The authors describe a new technique that used the donor common iliac vein and its bifurcation into the external iliac and internal iliac veins to replace the retrohepatic vena cava; this was used in a recipient who underwent her second reduced-size transplantation (segments II and III). Anastomosis of the donor hepatic vein to the internal iliac vein, using this segment of the venous graft to replace the retrohepatic vena cava, is suited to patients who have undergone more than one surgical procedure before liver transplantation.

12.
Ironmaking & Steelmaking, 2013, 40(1): 63-68
Abstract

Condition monitoring intervals are usually set at fixed lengths, typically chosen through a mixture of British Standards, manufacturers' recommendations, and personal experience. These rather ad hoc methods have little scientific basis. A recently developed condition based maintenance model is described which combines reliability data with condition monitoring measurements. This model provides the necessary basis to optimise condition monitoring intervals. Results obtained using artificial data, based on typical machines found in a hot strip mill, show how the model can be used as part of a condition based maintenance strategy.

13.
BACKGROUND: This study was performed to validate the prognostic significance of residual axillary lymph node metastases in patients with locally advanced breast cancer (LABC) treated with neoadjuvant chemotherapy and to analyze other clinicopathologic factors that might be independent predictors of disease-free survival (DFS) in an attempt to identify patients in whom axillary dissection might be omitted. METHODS: One hundred sixty-five assessable patients with LABC were treated in a prospective trial of neoadjuvant chemotherapy utilizing four cycles of 5-fluorouracil, doxorubicin, and cyclophosphamide. Responding patients were treated with segmental mastectomy and axillary dissection or modified radical mastectomy. Patients subsequently received additional chemotherapy followed by irradiation of the breast or chest wall and draining lymphatics. The median follow-up was 35 months. RESULTS: Clinical tumor response to neoadjuvant chemotherapy (P = 0.046) and the number of residual metastatic axillary lymph nodes found at axillary dissection (P = 0.05) were the only independent predictors of DFS. Patients with a complete clinical response had a predictably excellent DFS and those with no change or progressive disease had a poor DFS. In patients with a partial response, the number of residual metastatic lymph nodes further stratified patients with respect to DFS (P = 0.006). CONCLUSIONS: Clinical response and residual metastatic axillary lymph nodes following neoadjuvant chemotherapy are important predictors of DFS. Patients with a clinically positive axilla following neoadjuvant chemotherapy should undergo axillary dissection to ensure local control. However, the benefit of axillary dissection in patients with a clinically negative axilla may be minimal if the axilla will be irradiated, and histologic staging does not affect subsequent systemic treatment. 
A prospective randomized trial of axillary dissection versus axillary radiotherapy in patients with a clinically negative axilla following neoadjuvant chemotherapy is presently under way to evaluate this hypothesis.

14.
15.
16.
17.
18.
19.
20.
The second AIDS-defining condition diagnosed chronologically is referred to in this report as the secondary AIDS diagnosis. In this study, we examined survival following a secondary AIDS diagnosis using demographic and clinical factors known within 1 year before secondary AIDS diagnosis. In a prospective cohort of 2412 HIV-seropositive homosexual men observed in the Multicenter AIDS Cohort Study (MACS), 609 presented with a secondary AIDS diagnosis between January 1, 1988 and March 31, 1995. To analyze the data, we used survival analysis methods including the Kaplan-Meier product-limit estimator and extended Cox models that allow for nonproportional hazards. The median survival time after a secondary diagnosis was 10.3 months. Rapidity of progression from an initial AIDS diagnosis to a secondary diagnosis was not associated with survival. Drug treatment did not show a beneficial effect because of confounding by indication (i.e., selection bias) and limited efficacy on advanced disease of treatments available prior to 1995. However, a beneficial effect was captured by the use of calendar periods as a proxy measure for the relative exposure to drug treatments. Later calendar year of secondary diagnosis, secondary Kaposi's sarcoma, and higher CD4+ cell count were found to be significantly (p < .05) associated with longer survival time. However, secondary AIDS diagnosis was a significant factor only in the short term. Using secondary Pneumocystis carinii pneumonia as the reference diagnosis, the relative hazard of death 3 months after the time of secondary Kaposi's sarcoma diagnosis was 0.56 (95% confidence interval [CI] = 0.36-0.89) whereas the relative hazard after concurrently diagnosed multiple secondary illnesses was 2.06 (95% CI = 1.26-3.38). After approximately 1 year from the secondary diagnosis, the type of diagnosis was no longer significantly associated with survival.
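The Kaplan-Meier product-limit estimator used in this study handles right-censored follow-up (subjects alive at last contact) by updating the survival curve only at event times. A minimal hand-rolled version is sketched below; the follow-up times and censoring flags are toy values for illustration, not data from the MACS cohort.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times: observed follow-up times; events: 1 = death, 0 = censored.
    Returns [(t, S(t))] at each distinct event (death) time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, _ in data if tt == t)
        if deaths:
            # multiply by the conditional survival at this event time
            s *= 1.0 - deaths / n_at_risk
            curve.append((t, s))
        n_at_risk -= ties   # deaths and censorings both leave the risk set
        i += ties
    return curve

# toy follow-up in months after a secondary diagnosis (made-up data)
times  = [3, 5, 5, 8, 10, 10, 12, 15, 18, 24]
events = [1, 1, 0, 1,  1,  0,  1,  0,  1,  0]
curve = kaplan_meier(times, events)
```

Censored subjects contribute to the number at risk up to their last follow-up but never trigger a drop in the curve, which is why the estimator remains valid with incomplete follow-up such as the pre-1995 observation window described above.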


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号