Similar articles
20 similar articles found (search time: 31 ms)
1.
2.
3.
The Aksys PHD system is designed for short quotidian dialysis, employing a 52-liter batch of ultrapure dialysate and up to 30 in situ hot-water reuses of the entire extracorporeal circuit, including a 40-liter physical cleaning before each dialysis. Methods:  We studied the effect of the 52-liter tank during 108 long (5–8 hour) dialyses performed 3.5–6 times/week in 5 patients, and in one 50-liter patient simulator for 4 weeks. Phosphate (PO4), beta-2 microglobulin (b-2), urea (BUN), and creatinine (creat) were measured pre-, during, and post-dialysis 86 times, and in total dialysate 74 times during long dialyses. Tank saturation, Kt/V, and monthly chemistries were also measured. Results:  Patient weight was 76 ± 2 kg, QB 234 ± 23 ml/min, QD 498 ± 13 ml/min. Dialysate was recirculated 4.8 times during 8 hours.

4.
Regulation of phosphate (PO4) in hemodialysis patients is very difficult, and ideal levels are rarely maintained. High removal and a normal phosphate level are important, as both high and low levels are associated with morbidity and a very high mortality.
We studied phosphate dynamics and its relation to other small "uremic" molecules in 48 patients by measuring pre- and postdialysis levels and all removed phosphate, urea, and creatinine (creat) in all dialysate during 455 dialyses performed at different frequencies (freq: 3.7 ± 1.2, range 3–6 treatments per week) and durations of dialysis (t: mean 196 ± 95, range 80–560 min), with high-flux (HF) and low-flux membranes.
Kt/V‐PO4, Kt/V‐urea and Kt/V‐creat, volumes (Vr) for all solutes and their relationships to frequency and duration of dialysis, urea clearance and predialysis phosphate were calculated.  

5.
Background:  Hemodialyzers can be used once or reused after treatment with chemicals or hot water. Single use (SU) results in infusion of plastic compounds, particularly phthalic acid metabolites, into patients, and chemical reuse releases formaldehyde, glutaraldehyde, or peracetic acid into the blood during dialysis.
Methods:  We studied the increase in pulse rate (PR) and fall in systolic and diastolic blood pressure (BP) and patients' subjective overall quality evaluation (OE) of dialysis (1 worst, 5 best) during 3706 daily dialyses in 23 patients. Fall in blood pressure and rise in PR during dialysis and overall quality evaluation were compared as patients changed from SU or chemical reuse to hot water reuse. During SU and chemical reuse, dialysis time was shorter (121 vs. 148 min), urea clearance higher (241 vs. 175 ml/min) but ultrafiltration lower (1.5 vs. 1.7 kg/dialysis) than during hot water reuse.
Results:  The results are summarized in the table.  

6.
Acute renal failure with concomitant sepsis in the intensive care unit is associated with significant mortality. The purpose of this study was to determine whether the timing of initiation of renal replacement therapy (RRT) in septic patients affected 28-day mortality. Retrospective data on medical intensive care unit patients with sepsis and acute renal failure requiring RRT were included. RRT initiated at a blood urea nitrogen (BUN) <100 mg/dL was defined as "early," and initiation at a BUN ≥100 mg/dL was defined as "late." Multivariate logistic regression analysis with the primary outcome of death at 14, 28, and 365 days following the initiation of RRT was performed. One hundred thirty patients were studied. The early dialysis group (mean BUN 66 mg/dL) had 85 patients; the late group (mean BUN 137 mg/dL) had 62 patients. The mean Acute Physiology and Chronic Health Evaluation II score was 24.5 in both groups. The overall 14-, 28-, and 365-day survival rates were 58.1%, 41.9%, and 23.6%. Survival rates for the early group were 67%, 47.7%, and 30.7% at 14, 28, and 365 days; for the late group they were 46.7%, 31.7%, and 13.3%. On logistic regression analysis, initiating dialysis with a BUN >100 mg/dL predicted death at 14 days (odds ratio [OR] 3.6, 95% confidence interval [CI] 1.7–7.6, P = 0.001), 28 days (OR 2.6, 95% CI 1.2–5.7, P = 0.01), and 365 days (OR 3.5, 95% CI 1.2–10, P = 0.02). In this single-center, retrospective analysis, septic patients who started dialysis with a BUN <100 mg/dL had better survival up to 1 year after initiation of dialysis.

7.
Introduction:  Methanol poisoning can lead to complications including metabolic acidosis, visual impairment, and death. Treatment options include ethanol, fomepizole, and hemodialysis (HD). Objective:  To report the occurrence of post-dialysis methanol rebound during treatment. Methods:  A 40-year-old male with a history of schizophrenia and suicide attempts presented to the emergency room after reportedly ingesting 1 quart of windshield washer fluid. Preliminary blood chemistry showed methanol 390 mg/dL, ethanol 48 mg/dL, glucose 93 mg/dL, Na 138 meq/L, K 3.8 meq/L, Cl 98 mmol/L, CO2 26 mmol/L, urea 16 mg/dL, creatinine 1.2 mg/dL, and an anion gap of 14 mmol/L. The patient was started on 1360 mg of fomepizole (12:50 AM), followed by HD for 4 hours. A second dose of fomepizole (900 mg) was administered at 8:00 AM, and a second HD session was started at 12:00 PM and continued for 4 hours. A third dose of fomepizole (700 mg) was administered at 8:50 PM. Finally, a third HD session was started the next day at 3:05 PM and lasted 3 hours. Table 1 shows methanol levels in relation to each HD session. Findings:  Methanol concentration after the first HD increased from 100 mg/dL to 127 mg/dL (27%) over 5 h 20 min. It also increased from 35 mg/dL to 50 mg/dL (43%) 14 h 45 min after the second HD. Conclusions:  Close attention must be paid to the potential for post-dialysis methanol rebound. It is recommended that methanol levels continue to be monitored for several hours after HD.

  Table 1   Methanol levels before and after each hemodialysis
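The rebound percentages in the case report above are simply the relative rise from each post-dialysis methanol level; a minimal sketch reproducing that arithmetic (levels taken from the report):

```python
def rebound_percent(post_hd_mg_dl: float, later_mg_dl: float) -> float:
    """Relative rise in methanol concentration after a dialysis session, in percent."""
    return (later_mg_dl - post_hd_mg_dl) / post_hd_mg_dl * 100

# After the first HD session: 100 -> 127 mg/dL over 5 h 20 min
print(round(rebound_percent(100, 127)))  # 27
# After the second HD session: 35 -> 50 mg/dL over 14 h 45 min
print(round(rebound_percent(35, 50)))    # 43
```

The larger relative rebound after the second session reflects redistribution from tissue stores against a lower post-dialysis plasma level.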


8.
9.
The most common complication of tunneled-cuffed hemodialysis catheters is catheter-related bacteremia (CRB), which contributes to patient morbidity and loss of vascular access. Gram-positive microorganisms are the most common etiologic agents; coagulase-negative staphylococcus and corynebacterium species are the two most prevalent strains in our center. These are common inhabitants of the skin flora, suggesting that infection of catheters occurs through the exit site. The Biopatch is a chlorhexidine-impregnated dressing designed to keep the exit site free from colonization with skin flora, which may decrease the incidence of CRB due to organisms from the skin. Objective:  To investigate whether application of the Biopatch at the exit site has any effect on the incidence and etiology of CRB. Methods:  A chart review of 63 pediatric chronic hemodialysis patients dialyzed between January 1999 and December 2003 was performed. The mean age at start of hemodialysis was 13.9 ± 4.6 years. The pre-Biopatch era ran from January 1999 through June 2001, and the Biopatch era from July 2001 through December 2003. The Biopatch was applied at the beginning of every dialysis week after Betadine cleansing of the exit site, which was then covered with a transparent dressing. In the pre-Biopatch era, the exit site was cleansed with Betadine at every dialysis session and then covered with a transparent dressing. Results:  Use of the Biopatch at the exit site produced a significant decrease in exit-site infections. However, contrary to expectations, there was no decrease in the incidence of CRB.

10.
Purpose: Limited information exists on the use of any intravenous iron preparation in pediatric HD patients. This study was designed to describe the pharmacokinetic (PK) parameters of ferric gluconate (FG), now approved for use in children on HD. Methods: Iron-deficient pediatric HD pts (≤15 yr) were randomized to 2 doses of FG. Blood samples taken during a 1-hr infusion and at intervals over 48 hrs were analyzed for total iron, transferrin-bound iron (TBI), and FG-bound iron (FGI). Results: 49% of pts were male, 88% white, 57% age 6–12 yr; wt 16.3–63.2 kg, ht 100–177.5 cm. Mean serum iron concentrations (total iron and FGI) increased rapidly in a dose-dependent manner, approximately proportional to the FG dose administered. A rapid rise in total serum iron was followed by a slower, less prominent rise in TBI. Single-dose PK of FGI was adequately described using non-compartmental analytical methods. A standard 2-compartment NONMEM model successfully fit the data and accurately described the time course of FGI concentrations.

11.
Purpose: The arteriovenous fistula (AVF) is the preferred blood access for hemodialysis due to its longevity and resistance to infection. Little attention is given to the long-term hemodynamic consequences of large left-to-right shunts, particularly in patients with brachial artery fistulae. Materials and Methods:  We describe 9 patients (8 on dialysis, 1 post-transplant), aged 25–73 years, who developed clinical heart failure primarily due to large upper-arm AVFs. Results:  4/9 had access flows in excess of 2 liters/min, assessed by blood temperature monitoring. 6/9 had cardiac output measured by right heart catheterization before and after shunt compression; one also underwent left heart catheterization with ventriculography. 3/9 had surgical reduction of the fistula, either by banding or by serial interposition of a small-caliber GoreTex graft. In 2/9 the shunt was ligated. One patient had heart failure in association with 2 large upper-arm AVFs, one of which was ligated. After years of improved cardiac symptoms, heart failure recurred in association with marked hypertrophy of his remaining AVF; resting cardiac output in this patient exceeded 11 liters/min. 2/9 experienced acute onset of heart failure within 1–3 days of angioplasty of a venous stenosis; one of these, with very poor baseline cardiac function, expired. Surgical revision or ligation was accompanied by clinical improvement in the 5 patients so treated. One of these expired of a stroke after two months of cardiac improvement. Conclusion:  High-output heart failure is under-diagnosed in dialysis patients. Patients with large upper-arm shunts are particularly at risk. Access flow should be assessed regularly, and those with flows >1.5 liters/min should be monitored closely for development of heart failure. Surgical correction is beneficial and indicated in symptomatic patients.

12.
Hyperphosphatemia and poor uremia control are established cardiovascular risk factors in patients with end-stage renal disease (ESRD) and are associated with impaired endothelium-dependent and -independent vasodilation (EDV and EIV). Nocturnal hemodialysis (NHD) [6 × 8 h/week] augments dialysis dose and offers normal phosphate (Pi) balance. We hypothesized that NHD would restore EDV (endothelial function) and EIV (vascular smooth muscle cell function) by simultaneously improving uremia and Pi control. Two groups of ESRD patients (mean age 41 ± 2 years), stratified according to baseline plasma Pi levels (normal Pi <1.8 mM, high Pi >1.8 mM), were studied. Dialysis dose (Kt/V per session), plasma Pi, blood pressure (BP), and brachial artery responses to reactive hyperemia (EDV) and sublingual nitroglycerin (EIV) were examined before, and 1 and 2 months after, conversion from conventional hemodialysis (CHD) [3 × 4 h/week] to NHD. After 2 months, NHD had increased dialysis dose (from 1.24 ± 0.06 to 2.04 ± 0.08; p = 0.02) and lowered BP (from 140 ± 5/82 ± 3 to 119 ± 1/71 ± 3 mmHg; p = 0.01) in all patients. In patients with adequate Pi control during CHD, EDV was normalized after 1 month of NHD. In contrast, in the high-Pi group, 1 month of NHD was sufficient to reduce plasma phosphate levels, but 2 months of NHD were required for EDV to improve.

13.
A decrease of HCV viral load and an increase of HGF plasma levels have been related to HD sessions, and beneficial effects of HGF stimulation during HD on the outcome of HCV liver disease have been described. The aim was to analyze potential differences between intermittent (3 ×/week) and short daily (6 ×/week) HD, examining differences between HCV-positive and HCV-negative pts. We studied 41 pts from 2 HD centers: 26 on intermittent HD (6 on online HF), of whom 8 were HCV+, and 15 on short-daily HD, of whom 4 were HCV+. 40 pts used synthetic HD membranes (low-flux and high-flux). Among HCV+ pts we determined viral load by Amplicor (Roche) pre- and post-HD. All pts were studied for HGF levels (ELISA) at baseline, 15 min, end of session, and the start of the following session. Viral load was significantly higher pre-HD and decreased over the session. High-flux membranes were more efficient in reducing viremia (67% vs. 45%); viral load was higher pre- and post-HD principally in patients using low-flux membranes. Viremia on short-daily HD was lower than on intermittent HD (470,067.3 ± 663,974.5 vs. 1,015,695.5 ± 1,202,679.0).

14.
Background:  Among patients receiving chronic HD, therapy with paricalcitol (Z) has been associated with reduced mortality and morbidity compared to treatment with calcitriol (C); furthermore, patients who did not receive any vitamin D therapy experienced the highest mortality and morbidity. This study examined the hypothesis of a relationship between the dose of Z and subsequent hospitalization, specifically that patients receiving lower doses of Z would have a higher risk of being hospitalized. Methods:  We performed a retrospective cohort study of patients new to HD who received treatment with Z or C between Jan 1999 and Dec 2001, using Poisson regression models. The primary exposure variable was the average dose/day, examined as categorical data by quintiles, during a 3-month and then a 12-month follow-up period, excluding days spent in hospital; this exclusion was made because of uncertainty about treatment during hospitalization. Additional covariates included vitamin D group (Z vs. C), age, gender, race, diabetes status, serum albumin, alkaline phosphatase, calcium, phosphorus, and iPTH. Results:  We first examined dose of Z and C over a 3-month period and then hospitalizations over the ensuing year. We did not find a dose-response relationship in these analyses; i.e., dose over the first three months of dialysis was not associated with increased or reduced risk of hospitalization during the ensuing 12 months. We then examined average dose of Z or C over the entire year and risk of hospitalization during the same year. Compared to total doses, average doses (total dose over the entire year divided by the number of dialysis sessions during the same year) are less prone to bias. Risk of hospitalization according to dose (lowest dose, Quintile 1) of injectable vitamin D is shown in the table below:

15.
Dialysis adequacy in hemodialysis (HD) patients, indexed by Kt/V, is recommended as a single-pool Kt/V of at least 1.2 per session thrice weekly, but many patients cannot achieve this target. Although dialysis time is the most important factor influencing Kt/V, it is difficult to prolong dialysis time in practice because of its economic impact and poor patient compliance.
Objective:  The aim of this study is to investigate the effect of increasing blood flow rate on dialysis adequacy in HD patients with low Kt/V.
Methods:  This study enrolled 36 HD patients with a single-pool Kt/V <1.2 per session thrice weekly, measured at a dialyzer blood flow rate of 230 mL/min. We increased the blood flow rate by 15% in patients <65 kg of body weight and by 20% in patients >65 kg, and then compared Kt/V and the urea reduction ratio (URR) before and after the increase.
Results:  The mean age was 48 ± 11 years (range 23–73 years), and 25 patients were male. Of the total, 24 patients had a dry weight <65 kg. Mean dialysis duration was 52 ± 50 months (range 3–216 months). Mean Kt/V rose from 1.02 ± 0.09 before the increase in blood flow rate to 1.14 ± 0.12 after it (p < 0.001). Of the 36 patients, 13 (36.1%) achieved the adequacy target (Kt/V ≥ 1.2). Mean URR also increased, from 56.9 ± 4.0% to 60.8 ± 4.1% (p < 0.001).
Conclusion:  Our data suggest that increasing the blood flow rate by 15–20% over the previous rate is effective in improving dialysis adequacy in HD patients with low Kt/V.
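Single-pool Kt/V of the kind reported here is conventionally estimated from pre- and post-dialysis BUN with the second-generation Daugirdas formula; a minimal sketch follows (the numeric inputs are illustrative assumptions chosen to roughly match the baseline URR of ~57% above, not the study's raw data):

```python
import math

def sp_ktv_daugirdas(pre_bun, post_bun, hours, uf_liters, post_weight_kg):
    """Second-generation Daugirdas estimate of single-pool Kt/V:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W,
    where R = post/pre BUN ratio, t = session length (h),
    UF = ultrafiltration volume (L), W = post-dialysis weight (kg)."""
    r = post_bun / pre_bun
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

def urr(pre_bun, post_bun):
    """Urea reduction ratio, as a fraction."""
    return 1 - post_bun / pre_bun

# Hypothetical session: pre-BUN 100, post-BUN 43 mg/dL, 4 h, 2 L UF, 60 kg
print(round(urr(100, 43), 2))                               # 0.57
print(round(sp_ktv_daugirdas(100, 43, 4.0, 2.0, 60.0), 2))  # ~1.0
```

Raising blood flow increases the dialyzer urea clearance K and hence Kt/V, though less than proportionally at higher flows, which is consistent with the modest gain observed.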

16.
Background:  Acute renal failure (ARF) after cardiac surgery is associated with significant morbidity and mortality, irrespective of the need for dialysis. Previous studies have attempted to identify predictors of ARF and develop risk stratification algorithms. This study aims to validate the algorithm in an independent cohort of patients that includes a significant proportion of female and black patients and compares two different definitions of renal outcome.
Methods:  A large single-center cardiac surgery database (n = 24,660; 1993–2000), including 29.9% female and 3.7% black patients, was examined. Post-operative ARF was defined as (a) ARF requiring dialysis, or (b) >50% reduction in creatinine clearance relative to baseline, or requiring dialysis. Clinical variables related to baseline renal function and cardiovascular disease were used in recursive partitioning analysis for both outcome definitions. Chi-square goodness-of-fit analysis was performed to validate the algorithm.
Results:  The frequency of post-operative ARF requiring dialysis ranged between 0.5% and 15.5% across the risk categories, with an area under the receiver operating characteristic (ROC) curve of 0.78. Using the more inclusive definition of ARF, the frequency was significantly higher, ranging from 2.6% to 25% (P < 0.001), with an area under the ROC curve of 0.65.
Conclusions:  The renal risk stratification algorithm is valid in predicting post-operative ARF in an independent cohort of patients well represented by differences in gender and race. Since the need for dialysis remains subjective, a more objective and inclusive definition of ARF may help identify a larger number of patients at risk.

17.
Acute renal failure requiring dialysis therapy after cardiac surgery occurs in 1–5% of patients; however, the optimal timing for the initiation of dialysis therapy remains undetermined. To assess the validity of an early start of dialysis therapy, we compared survival over 14 days between 14 patients who started dialysis therapy when urine volume decreased below 30 mL/h and 14 other patients in whom dialysis therapy was withheld until urine volume fell below 20 mL/h. Overall mortality was 50%. Twelve of the 14 patients who received the early intervention survived; in contrast, only 2 of the 14 patients in the other group survived, a significant difference (p < 0.01). Between the two groups there were no significant differences in age, sex ratio, APACHE (Acute Physiology and Chronic Health Evaluation) II score, or serum creatinine at the start of dialysis therapy (2.9 ± 0.2 vs. 3.1 ± 0.2 mg/dL) or at admission. The timing of treatment for acute renal failure following cardiac surgery would thus be determined by the decrease in urine volume rather than by serum creatinine levels, and an early start of dialysis therapy might be preferable for improving survival in patients with acute renal failure following cardiac surgery.

18.
Introduction:  The Kidney Disease Outcomes Quality Initiative (K/DOQI) has established a target hemoglobin (Hb) level of 11–12 g/dL for all dialysis patients. For patients who leave an inpatient hospitalization with an Hb below this target, it is hypothesized that several factors contribute to the length of time required to achieve an Hb of 11 g/dL after hospitalization.
Objective:  To identify factors contributing to a decreased likelihood of reaching this target Hb.
Methods:  Using the first hospitalization of patients who initiated HD in 1999 and who were regularly treated with EPO, we identified those with a mean Hb of less than 11 g/dL on EPO claims during the same month as their index hospitalization. Patients were then followed up to see how long it took them to achieve an Hb of 11 g/dL, censored at death, re-hospitalization, a switch of modality, or suspension of EPO treatment.
Results:  A total of 6050 HD patients were identified. Three months after hospitalization, 70% had achieved 11 g/dL and 12% had been censored. For the remaining patients who eventually reached 11 g/dL, the average number of additional months required was 2.69 (SE 0.09). In proportional hazards regression on the time (in months) to achieving an Hb of 11 g/dL, factors that significantly decreased the likelihood of reaching the target Hb included a diagnosis of CHF or hepatic disease on the index hospitalization, a low Hb prior to the hospitalization, a high dose of EPO prior to the hospitalization, and a longer hospital stay.
Conclusions:  Patients with anemia after hospitalization are at high risk of both persistent anemia and rehospitalization. It is important to address patient comorbidities, ensure adequate medication usage, and monitor patient progress to prevent hospitalizations and their potential impact on Hb levels.

19.
Bacteremia is a common complication for hemodialysis (HD) patients (pts) with an indwelling central venous catheter (CVC). We reviewed our experience with catheter-associated bacteremia (CAB) and noted that CAB occurred at an average of 96 ± 98 days after CVC insertion. We wondered what percentage of CAB occurred in the first 21 days after CVC insertion, and what the spectrum of organisms was. We prospectively collected data on all HD pts from 3 centers with a CVC who developed bacteremia between 1/1/03 and 8/31/04. Pts with an identifiable source of bacteremia were excluded. 131 episodes of CAB were identified; 34 (25.95%) occurred within 21 days. The mean ± SD age of the pts developing CAB >21 days and ≤21 days after insertion was 63 ± 17 and 61.5 ± 15.4 years, respectively. Table 1 outlines the spectrum of organisms. There was a significantly greater incidence of CAB with Staphylococcus aureus (SA), and a significantly lower incidence of Staphylococcus epidermidis (SE), in pts in whom CAB developed ≤21 days after insertion.

  Table 1


20.
Background and Purpose:  Colloid osmotic pressure (COP) in plasma rises with ultrafiltration during hemodialysis, and this consequently causes plasma refilling, in which water moves from interstitial tissue to the capillary space. Although hemodynamic stability is one of the important factors for a good dialysis outcome, no informative and convenient indicators are available other than blood pressure monitoring. We therefore measured COP during hemodialysis to examine whether it can be used as an indicator of hemodynamic status, in comparison with hematocrit (Ht). Plasma osmolality, ultrafiltration volume, and changes in blood pressure were also measured to examine whether COP is associated with them.
Methods:  Sixteen patients hospitalized at our institution were examined. Of these, 10 patients underwent both dialysis and ultrafiltration, 4 patients received dialysis only, and 2 patients underwent ultrafiltration only, by the extracorporeal ultrafiltration method. Ultrafiltration was performed at a constant rate down to dry weight over 4 h. COP, plasma osmolality, Ht, and blood pressure were measured at 30 min (12.5% of total water removal), 1 h (25%), 2 h (50%), and 3 h (75%) after the start of hemodialysis, and at the end of dialysis (100%).
Results:  COP rose markedly, by 26.0% (±13.3%), in the patients who received both dialysis and ultrafiltration, whereas Ht rose by only 13.6% (±5.21%). The curve of the COP increase was sigmoid in shape, whereas that of Ht was linear. In patients with low Ht levels, however, the curves for COP and Ht showed a similar pattern.
Conclusion:  These results suggest that COP is a more sensitive indicator of hemodynamic status than Ht during hemodialysis.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号