Accuracy of Hemodialysis Bloodstream Infection Pathogen Reporting to the National Healthcare Safety Network: Results of an Academic Dialysis Program Audit.
Project description:Little is known about patients exiting home hemodialysis. We sought to characterize the reasons for exit, the clinical characteristics, and the pre-exit health care team interactions of patients on home hemodialysis who died or underwent modality conversion (negative disposition) compared with prevalent patients and those who were transplanted (positive disposition). We conducted an audit of all consecutive patients incident to home hemodialysis from January of 2010 to December of 2014 as part of ongoing quality assurance. Records were reviewed for the 6 months before exit, and vital statistics were assessed up to 90 days postexit. Ninety-four patients completed training; 25 (27%) received a transplant, 11 (12%) died, and 23 (25%) were transferred to in-center hemodialysis. Compared with the positive disposition group, patients in the negative disposition group had a longer mean dialysis vintage (3.15 [SD=4.98] versus 1.06 [SD=1.16] years; P=0.003) and were more often performing a conventional rather than a more intensive hemodialysis prescription (23 of 34 versus 23 of 60; P<0.01). In the 6 months before exit, the negative disposition group had significantly more in-center respite dialysis sessions, had more and longer hospitalizations, and required more on-call care team support in terms of phone calls and drop-in visits (each P<0.05). The most common reason for modality conversion was medical instability (15 of 23; 65%), followed by caregiver burnout and care partner burnout (three of 23; 13% each). The 90-day mortality among patients undergoing modality conversion was 26%. Over a 6-year period, approximately one third of patients exited the program because of death or modality conversion. Patients who die or transfer to another modality have significantly higher health care resource utilization (e.g., hospitalizations, respite treatments, and nursing time).
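The prescription comparison above (23 of 34 versus 23 of 60 patients on a conventional prescription; P<0.01) is a standard 2×2 contingency analysis. As a minimal sketch, assuming only the counts from the abstract and a chi-square test (the abstract does not state which test the authors used, and the continuity-correction choice affects the exact P value):

```python
# Re-checking the conventional-vs-intensive prescription comparison from
# the abstract: 23/34 negative-disposition vs 23/60 positive-disposition
# patients on a conventional prescription (reported P < 0.01).
from scipy.stats import chi2_contingency

table = [
    [23, 34 - 23],  # negative disposition: conventional, intensive
    [23, 60 - 23],  # positive disposition: conventional, intensive
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # p on the order of 0.01
```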
Project description:Objectives: Determine the economic cost or benefit of expanding electronic case reporting (eCR) to 29 reportable conditions beyond the initial eCR implementation for COVID-19 at an academic health center. Materials and methods: The return on investment (ROI) framework was used to quantify the economic impact of the eCR expansion from the perspective of an academic health system over a 5-year time horizon. Sensitivity analyses were performed to assess key factors such as personnel cost, inflation, and the number of expanded conditions. Results: Total implementation costs for the implementation year were estimated at $5031.46. The 5-year ROI for expanding eCR to the 29 conditions is expected to be 142% (net present value of savings: $7166). Based on the annual ROI, estimates suggest that the savings from the expansion will cover implementation costs in approximately 4.8 years. All sensitivity analyses yielded a strong ROI for the expansion of eCR. Discussion and conclusion: Our findings suggest a strong ROI for the expansion of eCR at UHealth, with the largest cost savings observed when eCR was implemented for all reportable conditions. An early effort to ensure data quality is recommended to expedite the transition from parallel reporting to production and thereby improve the ROI for healthcare organizations. This study demonstrates a positive ROI for expanding eCR to reportable conditions beyond COVID-19 in an academic health setting such as UHealth. While this evaluation focuses on a 5-year time horizon, the potential benefit could extend further.
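As a minimal sketch of the ROI arithmetic, assuming the abstract's 142% figure is defined as the net present value of savings divided by implementation costs (ROI definitions vary, and the underlying annual cash flows and discount rate are not reported in the abstract):

```python
# ROI and implied payback from the figures reported in the abstract.
implementation_cost = 5031.46  # implementation-year cost (USD)
npv_of_savings = 7166.00       # 5-year net present value of savings (USD)

roi = npv_of_savings / implementation_cost
print(f"5-year ROI: {roi:.0%}")  # ~142%, matching the reported value

# The ~4.8-year payback implies average annual savings of roughly:
payback_years = 4.8
print(f"implied annual savings: ${implementation_cost / payback_years:.0f}")
```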
Project description:Objective: Pediatric patients spend significant time on maintenance hemodialysis (HD) and traveling to and from treatment, and they are often unable to participate in sports activities. To assess the effects of exercise training during HD on dialysis efficacy in children and adolescents, we set up a multi-center randomized controlled trial (RCT). Methods: Patients on HD, aged 6 to 18 years, were randomized either to 3× weekly bicycle ergometer training or to no training during HD for 12 weeks. Change in single-pool Kt/V (spKt/V) was the primary outcome parameter. Results: We randomized 54 patients, of whom 45 qualified for the intention-to-treat (ITT) population (23 in the intervention group and 22 in the waiting control group; mean age 14.5 ± 3.01 years; 32 male, 13 female). Only 26 patients finished the study per protocol (PP). Training was performed for an average of 11.96 weeks (range 0.14-13.14) at 2.08 ± 0.76 sessions per week and for a weekly mean of 55.52 ± 27.26 minutes. Single-pool Kt/V was similar in the intervention and control groups (1.70 [0.33] vs. 1.79 [0.55] at V0 and 1.70 [0.36] vs. 1.71 [0.51] at V1); secondary endpoints likewise showed no difference in both the ITT and PP analyses. No significant adverse events were reported, and no bleeding or needle dislocation occurred in 1670 training sessions. Conclusions: Intradialytic bicycle training is safe but did not improve dialysis efficacy or physical fitness. However, the study can be considered underpowered, particularly because of high dropout rates. Future studies need better strategies to increase motivation and compliance, and other more effective or intensive exercise measures should be evaluated. Trial registration: The trial was registered at ClinicalTrials.gov (identifier: NCT01561118) on March 22, 2012.
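The abstract does not state how spKt/V was computed; a common choice is the second-generation Daugirdas equation, sketched below with illustrative (made-up) inputs:

```python
import math

def sp_ktv(bun_pre: float, bun_post: float, hours: float,
           uf_liters: float, post_weight_kg: float) -> float:
    """Second-generation Daugirdas estimate of single-pool Kt/V:
    spKt/V = -ln(R - 0.008*t) + (4 - 3.5*R) * UF/W, with R = post/pre BUN."""
    r = bun_post / bun_pre
    return -math.log(r - 0.008 * hours) + (4 - 3.5 * r) * uf_liters / post_weight_kg

# Illustrative values only: 4-h session, 70% urea reduction, 2 L UF, 40 kg.
print(round(sp_ktv(bun_pre=60, bun_post=18, hours=4.0,
                   uf_liters=2.0, post_weight_kg=40.0), 2))  # ~1.46
```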
Project description:AIM: To evaluate the prevalence and clinical features of Fabry disease in patients with end-stage renal disease (ESRD) undergoing chronic hemodialysis. METHODS: α-Galactosidase A activity was measured in dried blood spots by tandem mass spectrometry in 5,572 dialysis patients (63.7% males). The diagnosis of Fabry disease was confirmed by sequencing of the GLA gene and by evaluating the globotriaosylsphingosine level in dried blood spots. RESULTS: Fabry disease was diagnosed in 20 (0.36%) patients at a median age of 43 years (28; 58): 19 males and 1 female. The prevalence of Fabry disease in dialysis patients was 0.53% in males and 0.05% in females, and it was higher among males aged 30-49 years. Seventeen different GLA mutations were identified; 5 of them were novel. The median age at initiation of hemodialysis was similar between patients with missense and nonsense mutations. Sixteen patients (80.0%) had presented with typical symptoms of Fabry disease from childhood (neuropathic pain in 16, angiokeratoma in 7, and hypohidrosis/anhidrosis in 16). All patients had left ventricular hypertrophy, and 8 patients (40%) had a history of ischemic stroke. Two patients died (recurrent stroke in one and sudden cardiac death in the other). CONCLUSIONS: Screening of at-risk patients remains the feasible approach to diagnosing Fabry disease in patients with ESRD and their family members, given the low awareness of Fabry disease among Russian nephrologists.
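For context on the reported prevalence (20 of 5,572; 0.36%), a minimal sketch of the point estimate with a Wilson 95% confidence interval; the abstract reports point estimates only, and statsmodels is assumed:

```python
# Prevalence of Fabry disease in the screened cohort, with a Wilson 95% CI.
from statsmodels.stats.proportion import proportion_confint

cases, screened = 20, 5572
prevalence = cases / screened
lo, hi = proportion_confint(cases, screened, alpha=0.05, method="wilson")
print(f"prevalence = {prevalence:.2%} (95% CI {lo:.2%}-{hi:.2%})")  # ~0.36%
```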
Project description:Automated reporting of estimated GFR (eGFR) with serum creatinine measurement is now common. We surveyed nephrologists in four countries to determine whether eGFR reporting influences nephrologists' recommendations for dialysis initiation. Respondents were randomly allocated to receive a survey of four clinical vignettes that included either serum creatinine concentration only or serum creatinine and the corresponding eGFR. For each scenario, the respondent was asked to rank his or her likelihood of recommending dialysis initiation on a modified 8-point Likert scale, ranging from 1 ("definitely not") to 8 ("definitely would"). Analysis of the 822 eligible responses received showed that the predicted likelihood of recommending dialysis increased by 0.55 points when eGFR was reported (95% confidence interval, 0.33 to 0.76), and this effect was larger for eGFRs >5 ml/min per 1.73 m² (P<0.001). Subgroup analyses suggested that physicians who had been in practice ≥13 years were more affected by eGFR reporting (P=0.03). These results indicate that eGFR reporting modestly increases the likelihood that dialysis is recommended, and physicians should be aware of this effect when assessing patients with severe CKD.
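The survey does not specify which equation respondents' laboratories used for automated eGFR reporting; the 4-variable MDRD study equation, widely used for such reporting, is sketched below with illustrative inputs:

```python
def egfr_mdrd(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """4-variable MDRD study equation (ml/min per 1.73 m^2).

    One common equation behind automated eGFR reporting; the abstract does
    not state which equation was used in the respondents' laboratories.
    """
    egfr = 175 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Illustrative value near the dialysis-decision range discussed above.
print(round(egfr_mdrd(scr_mg_dl=6.0, age=65, female=False, black=False), 1))  # ~9.5
```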
Project description:Background: Emerging data suggest that sodium disarrays, including hyponatremia, are potential risk factors for infection ensuing from impairments in host immunity, which may be exacerbated by coexisting conditions (i.e., mucosal membrane and cellular edema leading to breakdown of microbial barrier function). While dysnatremia and infection-related mortality are common in dialysis patients, little is known about the association between serum sodium levels and the risk of bloodstream infection in this population. Methods: Among 823 dialysis patients from the national Biospecimen Registry Grant Program who underwent serum sodium testing over the period January 2008-December 2014, we examined the relationship between baseline serum sodium levels and the subsequent rate of bloodstream infection. Bloodstream infection events were directly ascertained using laboratory blood culture data. Associations between serum sodium level and the incidence of bloodstream infection were estimated using expanded case mix-adjusted Poisson regression models. Results: In the overall cohort, ∼10% of all patients experienced one or more bloodstream infection events during the follow-up period. Patients with both lower sodium levels <134 mEq/l and higher sodium levels ≥140 mEq/l had higher incidence rate ratios (IRRs) of bloodstream infection in expanded case mix analyses (reference 136-<138 mEq/l), with adjusted IRRs of 2.30 [95% confidence interval (CI) 1.19-4.44], 0.77 (95% CI 0.32-1.84), 1.39 (95% CI 0.78-2.47), 1.88 (95% CI 1.08-3.28) and 1.96 (95% CI 1.08-3.55) for sodium levels <134, 134-<136, 138-<140, 140-<142 and ≥142 mEq/l, respectively. Conclusions: Both lower and higher baseline serum sodium levels were associated with a higher rate of subsequent bloodstream infections in dialysis patients. Further studies are needed to determine whether correction of dysnatremia ameliorates infection risk in this population.
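As a minimal sketch of the modeling approach, the snippet below fits a Poisson regression with a person-time offset to synthetic data and exponentiates the coefficients to obtain IRRs against the 136-<138 mEq/l reference. The variable names and single case-mix adjuster are assumptions; the study's expanded case-mix model includes far more covariates:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic illustration only: higher event rates in the extreme bins.
rng = np.random.default_rng(0)
n = 800
df = pd.DataFrame({
    "na_cat": rng.choice(["<134", "134-<136", "136-<138", "138-<140", ">=140"], size=n),
    "age": rng.normal(63, 12, size=n),
    "person_years": rng.uniform(0.5, 3.0, size=n),
})
rate = np.where(df["na_cat"].isin(["<134", ">=140"]), 0.10, 0.05)  # events/year
df["events"] = rng.poisson(rate * df["person_years"])

model = smf.glm(
    "events ~ C(na_cat, Treatment(reference='136-<138')) + age",
    data=df, family=sm.families.Poisson(), offset=np.log(df["person_years"]),
)
print(np.exp(model.fit().params))  # exponentiated coefficients are the IRRs
```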
Project description:Objective: Dialysis efficacy is one of the important issues in patients undergoing hemodialysis. This study aimed to determine the association of dialysis adequacy with mortality and hospital admission in patients undergoing hemodialysis. Methods: This retrospective cohort study was conducted on patients who underwent dialysis. Dialysis adequacy was measured using the Kt/V criterion. Age, sex, disease etiology, duration of dialysis, and dialysis access were evaluated. Results: 128 patients with a mean age of 61.48 ± 13.36 years were included in the study; 8 had a history of kidney transplantation. The mean dialysis duration was 4.30 ± 3.39 years, and the mean Kt/V was 1.40 ± 1.8. Of the 128 patients, 53 were hospitalized for cardiac or renal reasons, and 9 died; the cause of death in all cases was cardiac. Dialysis adequacy in terms of Kt/V was significantly associated with mortality but not with hospitalization. Conclusion: Inadequate dialysis in terms of Kt/V is likely to increase mortality among dialysis patients.
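The abstract does not report the Kt/V-by-mortality cross-tabulation. As an illustration only, the sketch below dichotomizes adequacy at the KDOQI single-pool Kt/V target of 1.2 and applies Fisher's exact test to made-up counts that match the reported totals (128 patients, 9 deaths):

```python
# Hypothetical 2x2 split; only the margins (n = 128, 9 deaths) come from
# the abstract, which reports a significant Kt/V-mortality association.
from scipy.stats import fisher_exact

#              died  survived
table = [[7,  25],   # Kt/V < 1.2 (inadequate)
         [2,  94]]   # Kt/V >= 1.2 (adequate)
odds_ratio, p = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p:.3f}")
```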
Project description:Background: Vascular calcification is seen in most patients on dialysis and is strongly associated with cardiovascular mortality. Vascular calcification is promoted by phosphate, which generally reaches higher levels in hemodialysis than in peritoneal dialysis. However, whether vascular calcification develops less in peritoneal dialysis than in hemodialysis is currently unknown. We therefore compared coronary artery calcification (CAC), its progression, and calcification biomarkers between patients on hemodialysis and peritoneal dialysis. Methods: We measured CAC in 134 patients who had been treated exclusively with hemodialysis (n = 94) or peritoneal dialysis (n = 40) and were transplantation candidates. In 57 of them (34 on hemodialysis and 23 on peritoneal dialysis), we also measured CAC progression annually for up to 3 years, as well as the inactive species of matrix Gla protein (desphospho-uncarboxylated MGP; dp-ucMGP), fetuin-A, and osteoprotegerin. We compared CAC cross-sectionally with Tobit regression. CAC progression was compared in two ways: with linear mixed models as the difference in square root-transformed volume score per year (ΔCAC SQRV) and with Tobit mixed models. We adjusted for potential confounders. Results: In the cross-sectional cohort, CAC volume scores were 92 mm3 in hemodialysis and 492 mm3 in peritoneal dialysis (adjusted difference 436 mm3; 95% CI -47 to 919; p = 0.08). In the longitudinal cohort, peritoneal dialysis was associated with significantly more CAC progression defined as ΔCAC SQRV (adjusted difference 1.20; 95% CI 0.09 to 2.31; p = 0.03), but not in the Tobit mixed models (adjusted difference in CAC score increase per year 106 mm3; 95% CI -140 to 352; p = 0.40). Peritoneal dialysis was associated with higher osteoprotegerin (adjusted p = 0.02) but not with dp-ucMGP or fetuin-A. Conclusions: Peritoneal dialysis is not associated with less CAC or CAC progression than hemodialysis, and perhaps with even more progression. This indicates that vascular calcification does not develop less in peritoneal dialysis than in hemodialysis.
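The progression measure the abstract calls ΔCAC SQRV is the difference in square root-transformed CAC volume score per unit time; a minimal sketch with made-up follow-up values:

```python
import math

def delta_cac_sqrv(volume_start_mm3: float, volume_end_mm3: float,
                   years: float) -> float:
    """Difference in square root-transformed CAC volume score per year,
    the progression measure referred to above as ΔCAC SQRV."""
    return (math.sqrt(volume_end_mm3) - math.sqrt(volume_start_mm3)) / years

# Illustrative (made-up) follow-up: 200 mm^3 -> 350 mm^3 over 2 years.
print(round(delta_cac_sqrv(200, 350, 2.0), 2))  # ~2.28 per year
```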