Project description:<h4>Background</h4>Disrupted vital-sign circadian rhythms in the intensive care unit (ICU) are associated with complications such as immune system disruption, delirium and increased patient mortality. However, the prevalence and extent of this disruption are not well understood. Tools for its detection are currently limited.<h4>Methods</h4>This paper evaluated and compared vital-sign circadian rhythms in systolic blood pressure, heart rate, respiratory rate and temperature. Comparisons were made between the cohort of patients who recovered from the ICU and those who did not, across three large, publicly available clinical databases. This comparison included a qualitative assessment of rhythm profiles, as well as quantitative metrics such as peak-nadir excursions and correlation to a demographically matched 'recovered' profile.<h4>Results</h4>Circadian rhythms were present at the cohort level in all vital signs throughout an ICU stay. Peak-nadir excursions and correlation to a 'recovered' profile were typically greater throughout an ICU stay in the cohort of patients who recovered, compared to the cohort of patients who did not.<h4>Conclusions</h4>These results suggest that vital-sign circadian rhythms are typically present at the cohort level throughout an ICU stay and that quantitative assessment of these rhythms may provide information of prognostic use in the ICU.
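The two quantitative rhythm metrics named above can be sketched in a few lines. This is a minimal illustration, not the study's implementation: the hourly binning, the cosine-shaped toy heart-rate profile and the variable names are all assumptions for demonstration.

```python
# Minimal sketch (illustrative assumptions, not the study's code) of the two
# metrics: peak-nadir excursion and Pearson correlation to a reference
# 'recovered' 24-hour profile.
import numpy as np

def peak_nadir_excursion(hourly_profile):
    """Difference between the peak and nadir of a 24-value hourly profile."""
    profile = np.asarray(hourly_profile, dtype=float)
    return profile.max() - profile.min()

def correlation_to_reference(hourly_profile, reference_profile):
    """Pearson correlation between a cohort profile and a reference profile."""
    return np.corrcoef(hourly_profile, reference_profile)[0, 1]

# Toy example: a cosine-shaped heart-rate rhythm peaking in the afternoon,
# compared with a reference profile of the same phase but smaller amplitude.
hours = np.arange(24)
hr = 75 + 5 * np.cos(2 * np.pi * (hours - 16) / 24)   # simulated cohort profile
ref = 75 + 4 * np.cos(2 * np.pi * (hours - 16) / 24)  # simulated 'recovered' profile

print(round(peak_nadir_excursion(hr), 2))            # 10.0 (peak minus nadir)
print(round(correlation_to_reference(hr, ref), 3))   # 1.0 (identical phase)
```

Because the correlation is amplitude-invariant, the two metrics are complementary: the excursion captures rhythm strength, while the correlation captures whether the rhythm's shape and phase match the reference.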
Project description:The circadian clock drives extensive temporal gene expression programs controlling daily changes in behavior and physiology. In mouse liver, transcription factor dynamics, chromatin modifications, and RNA Polymerase II (PolII) activity oscillate throughout the 24-hour (24h) day, regulating the rhythmic synthesis of thousands of transcripts. Also, 24h rhythms in gene promoter-enhancer chromatin looping accompany rhythmic mRNA synthesis. However, how chromatin organization impinges on temporal transcription and liver physiology remains unclear. Here, we applied time-resolved chromosome conformation capture (4C-seq) in livers of WT and arrhythmic Bmal1 knockout mice. In WT, we observed 24h oscillations in promoter-enhancer loops at multiple loci including the core-clock genes Period1, Period2 and Bmal1. In addition, we detected rhythmic PolII activity, chromatin modifications and transcription involving stable chromatin loops at clock-output gene promoters representing key liver functions such as glucose metabolism and detoxification. Intriguingly, these contacts persisted in clock-impaired mice in which both PolII activity and chromatin marks no longer oscillated. Finally, we observed chromatin interaction hubs connecting neighbouring genes showing coherent transcription regulation across genotypes. Thus, both clock-controlled and clock-independent chromatin topology underlie rhythmic regulation of liver physiology.
Project description:<h4>Objective</h4>The current standard for hospital glucose management is point-of-care (POC) testing. We conducted a randomized controlled trial of real-time continuous glucose monitoring (RT-CGM) compared with POC in a non-intensive care unit (ICU) hospital setting.<h4>Research design and methods</h4>A total of 110 adults with type 2 diabetes on a non-ICU floor received RT-CGM with Dexcom G6 versus usual care (UC). RT-CGM data were wirelessly transmitted from the bedside. Hospital telemetry monitored RT-CGM data and notified bedside nursing of glucose alerts and trends. Standardized protocols were used for interventions.<h4>Results</h4>The RT-CGM group demonstrated significantly lower mean glucose (M∆ = -18.5 mg/dL) and percentage of time in hyperglycemia >250 mg/dL (-11.41%) and higher time in range 70-250 mg/dL (+11.26%) compared with UC (<i>P</i> values <0.05). Percentage of time in hypoglycemia was very low.<h4>Conclusions</h4>RT-CGM can be used successfully in community-based hospital non-ICU settings to improve glucose management. Continuously streaming glucose readings may truly be the fifth vital sign.
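The CGM summary statistics reported above (mean glucose, time above 250 mg/dL, time in range 70-250 mg/dL, time below 70 mg/dL) reduce to simple percentages over a series of glucose readings. The sketch below is a hedged illustration with invented readings; thresholds follow the abstract, but the function and data are not from the study.

```python
# Hedged sketch of CGM summary metrics from a list of glucose readings
# (mg/dL). Thresholds follow the abstract; the readings are invented.
def cgm_summary(readings_mg_dl):
    """Return mean glucose and % of time >250, in 70-250, and <70 mg/dL."""
    n = len(readings_mg_dl)
    mean_glucose = sum(readings_mg_dl) / n
    pct_above_250 = 100 * sum(g > 250 for g in readings_mg_dl) / n
    pct_in_range = 100 * sum(70 <= g <= 250 for g in readings_mg_dl) / n
    pct_below_70 = 100 * sum(g < 70 for g in readings_mg_dl) / n
    return mean_glucose, pct_above_250, pct_in_range, pct_below_70

readings = [95, 140, 260, 180, 65, 210, 300, 120]  # simulated readings
mean_g, hi, tir, lo = cgm_summary(readings)
print(mean_g, hi, tir, lo)  # 171.25 25.0 62.5 12.5
```

With readings at a fixed sampling interval (e.g. every 5 minutes for the Dexcom G6), the fraction of readings in a band equals the fraction of time spent in that band.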
Project description:OBJECTIVES:We validate a machine learning-based sepsis-prediction algorithm (InSight) for the detection and prediction of three sepsis-related gold standards, using only six vital signs. We evaluate robustness to missing data, customisation to site-specific data using transfer learning and generalisability to new settings. DESIGN:A machine-learning algorithm with gradient tree boosting. Features for prediction were created from combinations of six vital sign measurements and their changes over time. SETTING:A mixed-ward retrospective dataset from the University of California, San Francisco (UCSF) Medical Center (San Francisco, California, USA) as the primary source, an intensive care unit dataset from the Beth Israel Deaconess Medical Center (Boston, Massachusetts, USA) as a transfer-learning source and four additional institutions' datasets to evaluate generalisability. PARTICIPANTS:684,443 total encounters, with 90,353 encounters from June 2011 to March 2016 at UCSF. INTERVENTIONS:None. PRIMARY AND SECONDARY OUTCOME MEASURES:Area under the receiver operating characteristic (AUROC) curve for detection and prediction of sepsis, severe sepsis and septic shock. RESULTS:For detection of sepsis and severe sepsis, InSight achieves an AUROC curve of 0.92 (95% CI 0.90 to 0.93) and 0.87 (95% CI 0.86 to 0.88), respectively. Four hours before onset, InSight predicts septic shock with an AUROC of 0.96 (95% CI 0.94 to 0.98) and severe sepsis with an AUROC of 0.85 (95% CI 0.79 to 0.91). CONCLUSIONS:InSight outperforms existing sepsis scoring systems in identifying and predicting sepsis, severe sepsis and septic shock. This is the first sepsis screening system to exceed an AUROC of 0.90 using only vital sign inputs. InSight is robust to missing data, can be customised to novel hospital data using a small fraction of site data and retains strong discrimination across all institutions.
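The headline metric throughout the abstract above is the AUROC. A minimal sketch of how it can be computed, via the rank-sum (Mann-Whitney) identity, is shown below on toy risk scores; this illustrates the metric only, not the InSight algorithm or its data.

```python
# Illustrative AUROC via the Mann-Whitney identity: the probability that a
# randomly chosen positive case scores above a randomly chosen negative one
# (ties count as half). Labels and scores below are invented toy values.
def auroc(labels, scores):
    """AUROC for binary labels (1 = event) and continuous risk scores."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 0]                     # toy sepsis outcomes
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2, 0.1]       # toy model risk scores
print(round(auroc(labels, scores), 4))  # 0.9167
```

An AUROC of 0.5 corresponds to chance-level discrimination and 1.0 to perfect ranking, which is why the abstract's values of 0.92-0.96 indicate strong discrimination.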
Project description:Vital sign instability on discharge could be a clinically objective means of assessing readiness and safety for discharge; however, the association between vital sign instability on discharge and post-hospital outcomes is unclear. To assess the association between vital sign instability at hospital discharge and post-discharge adverse outcomes. Multi-center observational cohort study using electronic health record data. Abnormalities in temperature, heart rate, blood pressure, respiratory rate, and oxygen saturation were assessed within 24 hours of discharge. We used logistic regression adjusted for predictors of 30-day death and readmission. Adults (≥18 years) with a hospitalization to any medicine service in 2009-2010 at six hospitals (safety-net, community, teaching, and non-teaching) in north Texas. Death or non-elective readmission within 30 days after discharge. Of 32,835 individuals, 18.7 % were discharged with one or more vital sign instabilities. Overall, 12.8 % of individuals with no instabilities on discharge died or were readmitted, compared to 16.9 % with one instability, 21.2 % with two instabilities, and 26.0 % with three or more instabilities (p < 0.001). The presence of any (≥1) instability was associated with higher risk-adjusted odds of either death or readmission (AOR 1.36, 95 % CI 1.26-1.48), and was more strongly associated with death (AOR 2.31, 95 % CI 1.91-2.79). Individuals with three or more instabilities had nearly fourfold increased odds of death (AOR 3.91, 95 % CI 1.69-9.06) and increased odds of 30-day readmission (AOR 1.36, 95 % CI 0.81-2.30) compared to individuals with no instabilities. Having two or more vital sign instabilities at discharge had a positive predictive value of 22 % and a positive likelihood ratio of 1.8 for 30-day death or readmission. Vital sign instability on discharge is associated with increased risk-adjusted rates of 30-day mortality and readmission.
These simple vital sign criteria could be used to assess safety for discharge, and to reduce 30-day mortality and readmissions.
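The screening arithmetic at the end of the abstract (PPV of 22 % and positive likelihood ratio of 1.8 for the "two or more instabilities" flag) follows from a standard 2x2 confusion table. The sketch below is a hedged illustration: the cell counts are invented, chosen only so that the resulting PPV and LR+ match the reported values.

```python
# Hedged sketch of screening-test arithmetic for a binary flag
# (">=2 vital sign instabilities at discharge"). The 2x2 counts are
# invented for illustration; they are not the study's data.
def ppv_and_lr_plus(tp, fp, fn, tn):
    """Positive predictive value and positive likelihood ratio from a 2x2 table."""
    ppv = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_plus = sensitivity / (1 - specificity)
    return ppv, lr_plus

# tp/fp: flagged patients with/without 30-day death or readmission;
# fn/tn: unflagged patients with/without the outcome.
ppv, lr_plus = ppv_and_lr_plus(tp=22, fp=78, fn=100, tn=700)
print(round(ppv, 2), round(lr_plus, 2))  # 0.22 1.8
```

The modest LR+ of 1.8 means the flag shifts the pre-test odds of an adverse outcome only slightly, which is consistent with the abstract's framing of vital sign instability as one input to a discharge-safety assessment rather than a standalone test.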
Project description:Increase in mortality and in recurrent infections in the year following ICU discharge continues in survivors of septic shock, even after total clinical recovery from the initial septic event and its complications. This supports the hypothesis that sepsis could induce persistent long-term immune dysfunctions. To date, there are almost no data on ICU discharge and long-term evolution of sepsis-induced immunosuppression in septic shock survivors. The aim of this study was to assess the persistence of sepsis-induced immunosuppression by measuring expression of human leukocyte antigen DR on monocytes (mHLA-DR), CD4+ T cells, and regulatory T cells (Treg) at ICU discharge and 6 months after ICU discharge in patients admitted to the ICU for septic shock. In this prospective observational study, septic shock survivors with no preexisting immune suppression or treatment interfering with the immune system were included. mHLA-DR, CD4+ T cells, and Treg expression were assessed on day 1-2, 3-4, and 6-8 after ICU admission, at ICU discharge, and 6 months after ICU discharge. A total of 40 patients were enrolled during their ICU stay: 21 males (52.5%) and 19 females, median age 68 years (IQR 58-77), median SOFA score on day 1-2 was 8 (IQR 7-9), and median ICU length of stay was 11 days (IQR 7-24). Among these 40 patients, 33 were studied at ICU discharge and 15 were available for blood sampling 6 months after ICU discharge. On day 1-2, mHLA-DR expression was abnormally low for all patients [median 4212 (IQR 2640-6047) AB/C] and remained abnormally low at ICU discharge for 75% of them [median 10,281 (IQR 7719-13,035) AB/C]. On day 3-4, 46% of patients presented CD4+ lymphopenia [median 515 (IQR 343-724) mm⁻³] versus 34% at ICU discharge [median 642 (IQR 459-846) mm⁻³].
Among patients with a 6-month blood sample, normal mHLA-DR values were found in all patients [median 32,616 (IQR 24,918-38,738) AB/C] except one, and only one other patient presented CD4+ lymphopenia. While immune alterations persist at ICU discharge, no persistent immune alterations were found at the cellular level among septic shock survivors analyzed 6 months after ICU discharge.
Project description:<h4>Objective</h4>To determine whether a small, wearable multisensor device can discriminate between progressive versus relapsing multiple sclerosis (MS) and capture limb progression over a short interval, using finger and foot tap data.<h4>Methods</h4>Patients with MS were followed prospectively during routine clinic visits approximately every 6 months. At each visit, participants performed finger and foot taps wearing the MYO-band, which includes accelerometer, gyroscope, and surface electromyogram sensors. Metrics of within-patient limb progression were created by combining the change in signal waveform features over time. The resulting upper (UE) and lower (LE) extremity metrics' discrimination of progressive versus relapsing MS was evaluated by calculating the AUROC. Comparisons with Expanded Disability Status Scale (EDSS) scores were made with Pearson correlation.<h4>Results</h4>Participants included 53 relapsing and 15 progressive MS (72% female, baseline mean age 48 years, median disease duration 11 years, median EDSS 2.5, median 10 months follow-up). The final summary metrics differentiated relapsing from secondary progressive MS with AUROC UE 0.93 and LE 0.96. The metrics were associated with baseline EDSS (UE P = 0.0003, LE P = 0.0007). While most had no change in EDSS during the short follow-up, several had evidence of progression by the multisensor metrics.<h4>Interpretation</h4>Within a short follow-up interval, this novel multisensor algorithm distinguished progressive from relapsing MS and captured changes in limb function. Inexpensive, noninvasive and easy to use, this novel outcome is readily adaptable to clinical practice and trials as an MS vital sign. This approach also holds promise to monitor limb dysfunction in other neurological diseases.
Project description:Ameloblasts, the cells responsible for making enamel, modify their morphological features in response to specialized functions necessary for synchronized ameloblast differentiation and enamel formation. Secretory and maturation ameloblasts are characterized by the expression of stage-specific genes which follows strictly controlled repetitive patterns. Circadian rhythms are recognized as key regulators of the development and diseases of many tissues including bone. Our aim was to gain novel insights on the role of clock genes in enamel formation and to explore the potential links between circadian rhythms and amelogenesis. Our data show definitive evidence that the main clock genes (Bmal1, Clock, Per1 and Per2) oscillate in ameloblasts at regular circadian (24 h) intervals both at RNA and protein levels. This study also reveals that the two markers of ameloblast differentiation, i.e., amelogenin (Amelx; a marker of secretory stage ameloblasts) and kallikrein-related peptidase 4 (Klk4, a marker of maturation stage ameloblasts), are downstream targets of clock genes. Both Amelx and Klk4 show 24 h oscillatory expression patterns and their expression levels are up-regulated after Bmal1 over-expression in HAT-7 ameloblast cells. Taken together, these data suggest that both the secretory and the maturation stages of amelogenesis might be under circadian control. Changes in clock gene expression patterns might result in significant alterations of enamel apposition and mineralization.
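A common way to quantify the 24 h oscillations described above is a cosinor fit: regress the expression series on a cosine and sine of the 24 h period and read off the mesor, amplitude and acrophase (peak time). The sketch below uses simulated expression values sampled every 4 h; the sampling scheme and data are illustrative assumptions, not the study's measurements.

```python
# Illustrative cosinor fit for detecting a 24 h rhythm in an expression
# time series: fit y = mesor + a*cos(wt) + b*sin(wt) by least squares.
# The sampling times and values below are simulated, not study data.
import numpy as np

def cosinor_fit(t_hours, y, period=24.0):
    """Return (mesor, amplitude, acrophase_hours) of a fitted cosine."""
    w = 2 * np.pi / period
    X = np.column_stack([np.ones_like(t_hours),
                         np.cos(w * t_hours),
                         np.sin(w * t_hours)])
    mesor, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
    amplitude = np.hypot(a, b)
    acrophase = (np.arctan2(b, a) / w) % period  # peak time within the cycle
    return mesor, amplitude, acrophase

t = np.arange(0, 48, 4.0)                        # samples every 4 h over 2 days
y = 10 + 3 * np.cos(2 * np.pi * (t - 8) / 24)    # simulated rhythm peaking at 8 h
mesor, amp, peak = cosinor_fit(t, y)
print(round(mesor, 2), round(amp, 2), round(peak, 2))  # 10.0 3.0 8.0
```

On noisy data the same fit still recovers the rhythm parameters in the least-squares sense, and the fitted amplitude relative to residual variance is a natural test statistic for whether an oscillation is present at all.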
Project description:Using large-scale interaction data from a virtual world, we show that people's propensity to socialize (forming new social connections) varies by hour of the day. We arrive at our results by longitudinally tracking people's friend-adding activities in a virtual world. Specifically, we find that people are most likely to socialize during the evening, at approximately 8 p.m. and 12 a.m., and are least likely to do so in the morning, at approximately 8 a.m. Such patterns prevail on weekdays and weekends and are robust to variations in individual characteristics and geographical conditions.
Project description:The mechanistic basis of eukaryotic circadian oscillators in model systems as diverse as Neurospora, Drosophila, and mammalian cells is thought to be a transcription-and-translation-based negative feedback loop, wherein progressive and controlled phosphorylation of one or more negative elements ultimately elicits their own proteasome-mediated degradation, thereby releasing negative feedback and determining circadian period length. The Neurospora crassa circadian negative element FREQUENCY (FRQ) exemplifies such proteins; it is progressively phosphorylated at more than 100 sites, and strains bearing alleles of frq with anomalous phosphorylation display abnormal stability of FRQ that is well correlated with altered periods or apparent arrhythmicity. Unexpectedly, we unveiled normal circadian oscillations that reflect the allelic state of frq but that persist in the absence of typical degradation of FRQ. This manifest uncoupling of negative element turnover from circadian period length determination is not consistent with the consensus eukaryotic circadian model.