Project description: Parametric time-to-event analysis is an important pharmacometric method for predicting the probability of an event up to a certain time as a function of covariates and/or drug exposure. Modeling is performed at the level of the hazard function, which describes the instantaneous rate at which an event occurs at a given timepoint. We give an overview of parametric time-to-event analysis, starting with graphical exploration: Kaplan-Meier plots of the event data (including censoring) and nonparametric estimators of the underlying hazard, such as the kernel-based visual hazard comparison. The most common hazard functions, including the exponential, Gompertz, Weibull, log-normal, log-logistic, and circadian functions, are described in detail. A Shiny application was developed to show the modeler graphically which of the common hazard functions has a shape similar to the data, and thereby to guide which hazard functions to test in the parametric time-to-event analysis. For the chosen hazard function(s), the Shiny application can additionally be used to explore corresponding parameter values, informing suitable initial estimates for parametric modeling as well as possible covariate or treatment relationships for certain parameters. It can also be used for the dissemination of results and for communication, training, and workshops on time-to-event analysis. By guiding the modeler on which functions and parameter values to test and compare, and by assisting with dissemination, the Shiny application developed here greatly supports the modeler in complex parametric time-to-event modeling.
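As an illustration of the hazard-to-survival relationship these functions rely on, the following is a minimal Python sketch (not the Shiny application itself) evaluating three of the common hazard functions and the survival curves they imply via S(t) = exp(-H(t)); all parameter values are illustrative.

```python
# Minimal sketch (not the paper's Shiny app): common parametric hazard
# functions and the survival curves they imply, S(t) = exp(-H(t)), where
# H(t) is the cumulative hazard. Parameter values are illustrative.
import numpy as np

t = np.linspace(0.01, 10, 200)

def hazard_exponential(t, lam=0.3):
    return np.full_like(t, lam)              # constant hazard

def hazard_weibull(t, lam=0.3, k=1.5):
    return lam * k * (lam * t) ** (k - 1)    # increasing for k > 1

def hazard_gompertz(t, lam=0.1, gamma=0.2):
    return lam * np.exp(gamma * t)           # exponentially increasing

def survival_from_hazard(t, h):
    # Numerically integrate the hazard (trapezoid rule) to get S(t) = exp(-H(t)).
    H = np.concatenate([[0.0], np.cumsum(np.diff(t) * 0.5 * (h[1:] + h[:-1]))])
    return np.exp(-H)

for name, h in [("exponential", hazard_exponential(t)),
                ("Weibull", hazard_weibull(t)),
                ("Gompertz", hazard_gompertz(t))]:
    S = survival_from_hazard(t, h)
    print(f"{name}: S(5) ~ {S[np.searchsorted(t, 5)]:.3f}")
```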
Project description: Survival analysis (also referred to as time-to-event analysis) is the study of the time elapsed from a starting date to some event of interest. In practice, these analyses can be challenging and, if methodological errors are to be avoided, require the application of appropriate techniques. Using simulations and real-life data from the French national registry of patients with primary immunodeficiencies (CEREDIH), we sought to highlight the basic elements that need to be handled correctly when performing the initial steps of a survival analysis. We focused on non-parametric methods for dealing with right censoring, left truncation, competing risks, and recurrent events. Our simulations show that ignoring these aspects biases the results; we then explain how to analyze the data correctly in these situations using non-parametric methods. Rare disease registries are extremely valuable in medical research. We discuss the application of appropriate methods for the analysis of time-to-event data from the CEREDIH registry. The objective of this tutorial article is to provide clinicians and healthcare professionals with a better knowledge of the issues they face when analyzing time-to-event data.
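For illustration, here is a minimal Python sketch of two of the issues discussed (right censoring and left truncation, i.e. delayed entry), assuming the lifelines package; the toy dataset and column names are invented, not CEREDIH data.

```python
# Minimal sketch, assuming the Python 'lifelines' package and a toy dataset
# (not CEREDIH data): a Kaplan-Meier estimate that handles right censoring
# via the event indicator and left truncation via delayed entry.
import pandas as pd
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "entry_age": [30, 45, 10, 0, 25],    # age at registry entry (left truncation)
    "exit_age":  [60, 50, 70, 40, 55],   # age at event or censoring
    "event":     [1, 0, 1, 1, 0],        # 1 = event observed, 0 = right-censored
})

kmf = KaplanMeierFitter()
# Omitting 'entry' would treat subjects as at risk from time zero and bias
# the survival estimate upward; passing it corrects the risk sets.
kmf.fit(durations=df["exit_age"], event_observed=df["event"],
        entry=df["entry_age"])
print(kmf.survival_function_)
```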
Project description: Event-related potentials (ERPs) and magnetic fields (ERFs) are typically analyzed via ANOVAs on mean activity in a priori windows. Advances in computing power and statistics have produced an alternative: mass univariate analyses consisting of thousands of statistical tests and powerful corrections for multiple comparisons. Such analyses are most useful when one has little a priori knowledge of effect locations or latencies, and for delineating effect boundaries. Mass univariate analyses complement and, at times, obviate traditional analyses. Here we review this approach as applied to ERP/ERF data and four methods for multiple comparison correction: strong control of the familywise error rate (FWER) via permutation tests, weak control of FWER via cluster-based permutation tests, false discovery rate control, and control of the generalized FWER. We end with recommendations for their use and introduce free MATLAB software for their implementation.
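As a sketch of the first of these corrections, the following Python example implements strong FWER control via a max-statistic (tmax) permutation test on synthetic within-subject ERP difference data; it is a simplified stand-in for, not a copy of, the MATLAB software introduced in the paper.

```python
# Minimal sketch of strong FWER control via a max-statistic (tmax)
# permutation test for a two-condition, within-subject ERP contrast.
# Data are synthetic, shaped subjects x timepoints.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subj, n_time = 20, 100
diff = rng.normal(0, 1, (n_subj, n_time))   # condition difference waves
diff[:, 40:60] += 0.8                       # true effect in one latency window

t_obs = stats.ttest_1samp(diff, 0, axis=0).statistic

n_perm = 2000
t_max = np.empty(n_perm)
for i in range(n_perm):
    signs = rng.choice([-1.0, 1.0], size=(n_subj, 1))  # sign-flip permutation
    t_perm = stats.ttest_1samp(diff * signs, 0, axis=0).statistic
    t_max[i] = np.abs(t_perm).max()         # max statistic over all tests

crit = np.quantile(t_max, 0.95)             # FWER-corrected critical value
print("significant timepoints:", np.where(np.abs(t_obs) > crit)[0])
```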
Project description: Background: Most patients with congenital heart disease survive into adulthood; however, residual abnormalities remain, and management of these patients is life-long and personalized. Patients with surgical repair of transposition of the great arteries, for example, face the risk of developing neoaortic valve regurgitation. Cardiologists intuitively update the patient's prognosis with new information on cardiovascular status, for instance from echocardiographic imaging. Methods: Usually a time-dependent version of the Cox model is used to analyze repeated measurements with a time-to-event outcome. New statistical methods have been developed with multiple advantages, the most prominent being the joint model for longitudinal and time-to-event outcomes. In this tutorial, the joint modeling framework is introduced and applied to patients with transposition of the great arteries after surgery with long-term follow-up, where repeated echocardiographic values of the neoaortic root are evaluated against the risk of neoaortic valve regurgitation. Results: The data are analyzed with the time-dependent Cox model as the benchmark method, and the results are compared with a joint model, leading to different conclusions. The flexibility of the joint model is shown by adding the growth rate of the neoaortic root to the model and by adding repeated values of body surface area to obtain a multimarker model. Lastly, it is demonstrated how the joint model can be used to obtain personalized dynamic predictions of the event. Conclusions: The joint model for longitudinal and time-to-event data is an attractive method for analyzing data in follow-up studies with repeated measurements. Benefits of the method include use of the estimated natural trajectory of the longitudinal outcome, great flexibility through multiple extensions, and dynamic individualized predictions.
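For illustration, here is a minimal sketch of the benchmark method (the time-dependent Cox model), not the joint model itself, assuming the Python lifelines package; the long-format toy data and column names are invented.

```python
# Minimal sketch of the benchmark method (a time-dependent Cox model),
# not the joint model itself, assuming the Python 'lifelines' package.
# The long-format toy data and column names below are invented.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per follow-up interval per patient; 'root_diam' carries the most
# recent echocardiographic neoaortic root value forward over each interval.
long_df = pd.DataFrame({
    "id":        [1, 1, 2, 2, 3, 3],
    "start":     [0, 2, 0, 3, 0, 4],
    "stop":      [2, 5, 3, 7, 4, 9],
    "root_diam": [30.0, 34.0, 28.0, 29.0, 31.0, 36.0],
    "event":     [0, 1, 0, 0, 0, 1],   # regurgitation at the interval's end?
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()
```

A joint model would instead couple a mixed-effects submodel for the root diameter with the hazard, replacing the carried-forward, error-prone covariate values that this benchmark relies on.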
Project description: Acute graft-versus-host disease (GVHD) is a frequent complication following hematopoietic cell transplantation (HCT). Research on risk factors for acute GVHD has tended to ignore two important clinical issues. First, post-transplant mortality is high; in our motivating data, 100-day post-HCT mortality was 15.4%. Second, acute GVHD in its classic form is only diagnosed within 100 days of the transplant; beyond 100 days, a patient may be diagnosed with late-onset acute or chronic GVHD. Standard modeling of time-to-event outcomes, however, generally conceives of patients as being able to experience the event at any point on the time scale. In this paper, we propose a novel multi-state model that simultaneously: (i) accounts for mortality through joint modeling of acute GVHD and death, and (ii) explicitly acknowledges the finite time interval during which the event of interest can take place. The observed data likelihood is derived, with estimation and inference via maximum likelihood. Additionally, we provide methods for simultaneously estimating the absolute risks of acute GVHD and death. The proposed framework is compared via comprehensive simulations to a number of alternative approaches that each acknowledge some but not all aspects of acute GVHD, and is illustrated with an analysis of the HCT data that motivated this work.
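The following is a minimal Python sketch of jointly handling acute GVHD and death on the finite interval [0, 100] days, under a strong simplifying assumption the paper does not make (constant cause-specific hazards); data are synthetic and estimation uses maximum likelihood via scipy.

```python
# Minimal sketch, under simplifying assumptions that are NOT the paper's
# full multi-state model: constant cause-specific hazards for acute GVHD
# (cause 1) and death (cause 2) on [0, 100] days, fit by maximum likelihood.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 500
lam_gvhd_true, lam_death_true = 0.008, 0.002
t_all = rng.exponential(1.0 / (lam_gvhd_true + lam_death_true), n)
cause = rng.choice([1, 2], size=n,
                   p=[lam_gvhd_true / (lam_gvhd_true + lam_death_true),
                      lam_death_true / (lam_gvhd_true + lam_death_true)])
# Administrative end of the interval at day 100.
obs_t = np.minimum(t_all, 100.0)
obs_cause = np.where(t_all <= 100.0, cause, 0)   # 0 = censored at day 100

def neg_loglik(log_lams):
    l1, l2 = np.exp(log_lams)          # log-parameterized to stay positive
    tot = l1 + l2
    ll = -tot * obs_t                  # survival up to obs_t, all subjects
    ll += np.where(obs_cause == 1, np.log(l1), 0.0)   # GVHD events
    ll += np.where(obs_cause == 2, np.log(l2), 0.0)   # deaths
    return -np.sum(ll)

fit = minimize(neg_loglik, x0=np.log([0.01, 0.01]), method="BFGS")
l1_hat, l2_hat = np.exp(fit.x)
print(f"GVHD hazard ~ {l1_hat:.4f}/day, death hazard ~ {l2_hat:.4f}/day")
# Absolute risk of acute GVHD by day 100 (cumulative incidence under
# competing risks): integral of l1 * exp(-(l1 + l2) * u) over [0, 100].
tot = l1_hat + l2_hat
print("P(GVHD by day 100) ~", l1_hat / tot * (1 - np.exp(-tot * 100)))
```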
Project description: Modern health data science applications leverage abundant molecular and electronic health data, providing opportunities for machine learning to build statistical models to support clinical practice. Time-to-event analysis, also called survival analysis, stands as one of the most representative examples of such statistical models. We present a deep-network-based approach that leverages adversarial learning to address a key challenge in modern time-to-event modeling: nonparametric estimation of event-time distributions. We also introduce a principled cost function to exploit information from censored events (events that occur subsequent to the observation window). Unlike most time-to-event models, we focus on the estimation of time-to-event distributions, rather than time ordering. We validate our model on both benchmark and real datasets, demonstrating that the proposed formulation yields significant performance gains relative to a parametric alternative, which we also propose.
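As a sketch of the general idea of exploiting censored observations (not the paper's adversarial formulation or its specific cost function), the following Python/PyTorch loss penalizes a predicted event time that falls before a subject's censoring time, since the true event time is known to lie beyond it.

```python
# Minimal sketch of a censoring-aware training loss, assuming PyTorch.
# This illustrates the general idea of exploiting censored observations,
# NOT the paper's adversarial formulation.
import torch

def censored_loss(t_pred, t_obs, event):
    """t_pred: predicted event times; t_obs: event or censoring times;
    event: 1.0 if the event was observed, 0.0 if right-censored."""
    # Observed events: penalize any deviation from the recorded time.
    loss_event = event * torch.abs(t_pred - t_obs)
    # Censored: penalize only predictions *earlier* than the censoring time,
    # because the true event time is known to lie beyond it.
    loss_cens = (1.0 - event) * torch.clamp(t_obs - t_pred, min=0.0)
    return (loss_event + loss_cens).mean()

t_pred = torch.tensor([4.0, 9.0, 2.0])
t_obs  = torch.tensor([5.0, 7.0, 6.0])
event  = torch.tensor([1.0, 0.0, 0.0])
print(censored_loss(t_pred, t_obs, event))   # tensor(1.6667)
```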
Project description: Time-resolved multivariate pattern analysis (MVPA), a popular technique for analyzing magneto- and electro-encephalography (M/EEG) neuroimaging data, quantifies the extent and time-course by which neural representations support the discrimination of relevant stimulus dimensions. As EEG is widely used for infant neuroimaging, time-resolved MVPA of infant EEG data is a particularly promising tool for infant cognitive neuroscience. MVPA has recently been applied to common infant imaging methods such as EEG and fNIRS. In this tutorial, we provide and describe code to implement time-resolved, within-subject MVPA with infant EEG data. An example implementation of time-resolved MVPA based on linear SVM classification is described, with accompanying code in Matlab and Python. Results from a test dataset indicated that, in both infants and adults, this method reliably produced above-chance accuracy for classifying stimulus images. Extensions of the classification analysis are presented, including both geometric- and accuracy-based representational similarity analysis, implemented in Python. Common implementation choices are presented and discussed. As the amount of artifact-free EEG data contributed by each participant is lower in studies of infants than in studies of children and adults, we also explore and discuss the impact of varying participant-level inclusion thresholds on the resulting MVPA findings in these datasets.
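Here is a minimal Python sketch of time-resolved, within-subject decoding with a linear SVM, using scikit-learn on synthetic trials x channels x time data; it illustrates the approach and is not the tutorial's released code.

```python
# Minimal sketch of time-resolved, within-subject decoding with a linear
# SVM (scikit-learn), on synthetic data shaped trials x channels x time.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_trials, n_chan, n_time = 80, 32, 50
X = rng.normal(0, 1, (n_trials, n_chan, n_time))
y = np.repeat([0, 1], n_trials // 2)        # two stimulus classes
X[y == 1, :, 20:35] += 0.5                  # class difference in one window

# Fit and score a classifier independently at each timepoint.
acc = np.empty(n_time)
for t in range(n_time):
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    acc[t] = cross_val_score(clf, X[:, :, t], y, cv=5).mean()

print("peak decoding accuracy:", acc.max(), "at timepoint", acc.argmax())
```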
Project description: Continuous-time event sequences represent discrete events occurring in continuous time. Such sequences arise frequently in real life. Usually we expect the sequences to follow some regular pattern over time. However, these patterns may sometimes be interrupted by unexpected absences or occurrences of events. Identifying these unexpected cases can be very important, as they may point to abnormal situations that need human attention. In this work, we study and develop methods for detecting outliers in continuous-time event sequences, including both unexpected absences and unexpected occurrences of events. Since the patterns that event sequences tend to follow may change in different contexts, we develop outlier detection methods based on point processes that can take context information into account. Our methods are based on Bayesian decision theory and hypothesis testing with theoretical guarantees. To test the performance of the methods, we conduct experiments on both synthetic data and real-world clinical data and show the effectiveness of the proposed methods.
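As a toy illustration of the hypothesis-testing flavor of such detectors, the following Python sketch flags an unexpectedly long event-free gap under a homogeneous Poisson process with a known rate; the paper's methods handle richer, context-dependent point processes.

```python
# Toy sketch of hypothesis-test-style detection of an *unexpected absence*
# of events, assuming a homogeneous Poisson process with a known rate.
import numpy as np
from scipy import stats

rate_per_day = 2.0   # event rate estimated from the sequence's history
gap_days = 4.0       # observed event-free gap being tested

# Under a Poisson process, P(no events in a window of length g) = exp(-rate * g).
p_value = np.exp(-rate_per_day * gap_days)
print(f"P(no events in {gap_days} days) = {p_value:.2e}")
if p_value < 0.01:
    print("flag: unexpectedly long absence of events")

# Equivalent check: the window count is Poisson(rate * g) and we observed 0.
assert np.isclose(stats.poisson.pmf(0, mu=rate_per_day * gap_days), p_value)
```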
Project description: Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. While the design shares many properties with regression-based approaches in other epidemiological studies, time series data have a range of unique features that require additional methodological consideration. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis, including the main segmented regression model. Finally, we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjusting for seasonal trends, and controlling for time-varying confounders; we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
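A minimal Python sketch of the main segmented regression model follows, assuming statsmodels and an invented monthly series; HAC (Newey-West) standard errors are one crude way to address the autocorrelation issue noted above.

```python
# Minimal sketch of a segmented (interrupted time series) regression with
# statsmodels on an invented monthly series; variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n, interruption = 48, 24                                 # intervention at month 24
df = pd.DataFrame({"time": np.arange(n)})
df["level"] = (df["time"] >= interruption).astype(int)   # step change
df["trend"] = np.maximum(0, df["time"] - interruption)   # slope change
df["rate"] = (50 + 0.2 * df["time"] - 8 * df["level"]
              - 0.5 * df["trend"] + rng.normal(0, 2, n))

# HAC (Newey-West) standard errors as a simple guard against autocorrelation.
model = smf.ols("rate ~ time + level + trend", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})
print(model.params)   # 'level' = immediate change, 'trend' = change in slope
```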
Project description: The case-crossover design is widely used in environmental epidemiology as an effective alternative to the conventional time-series regression design for estimating short-term associations of environmental exposures with a range of acute events. This tutorial illustrates the implementation of the time-stratified case-crossover design to study aggregated health outcomes and environmental exposures, such as particulate matter air pollution, focusing on adjusting for covariates and investigating effect modification using conditional Poisson regression. Time-varying confounders can be adjusted for directly in the conditional regression model, with an adequately lagged exposure-response function. Time-invariant covariates at the subpopulation level require reshaping the typical time-series data set into a long format and conditioning out the covariate in the expanded stratum set. When environmental exposure data are available at the level of geographical units, the stratum set should combine the time and spatial dimensions. Moreover, it is possible to examine effect modification using interaction models. The time-stratified case-crossover design offers a flexible framework for properly accounting for a wide range of covariates in environmental epidemiology studies.
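For illustration, a minimal Python sketch of a time-stratified case-crossover analysis on invented daily data, assuming statsmodels: the conditional Poisson model is emulated here with stratum fixed effects, whose exposure-coefficient estimate coincides with the conditional-likelihood estimate for Poisson models.

```python
# Minimal sketch of a time-stratified case-crossover analysis on invented
# daily data, assuming statsmodels. Conditional Poisson regression is
# emulated with stratum fixed effects (year-month x day-of-week strata).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
dates = pd.date_range("2019-01-01", periods=730, freq="D")
pm25 = rng.gamma(4.0, 5.0, len(dates))          # daily PM2.5 levels
y = rng.poisson(np.exp(2.0 + 0.004 * pm25))     # daily aggregated event counts

df = pd.DataFrame({"date": dates, "pm25": pm25, "y": y})
# Time-stratified design: each day is compared only with days sharing its
# year, month, and day of week.
df["stratum"] = (df["date"].dt.strftime("%Y-%m") + "-dow" +
                 df["date"].dt.dayofweek.astype(str))

fit = smf.glm("y ~ pm25 + C(stratum)", data=df,
              family=sm.families.Poisson()).fit()
print("log rate ratio per unit PM2.5:", fit.params["pm25"])
```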