Project description: The average environmental and occupational physiologist may find statistics difficult to interpret and use, since their formal training in statistics is limited. Unfortunately, poor statistical practices can generate erroneous, or at least misleading, results and distort the evidence in the scientific literature. These problems are exacerbated when statistics are treated as a thoughtless ritual performed after the data are collected. The situation is worsened when statistics are then reduced to strict judgements about the data (i.e., significant versus non-significant) without a thought given to how these statistics were calculated or their practical meaning. We propose that researchers should consider statistics at every step of the research process, whether that be designing experiments, collecting data, analysing the data or disseminating the results. When statistics are considered an integral part of the research process, from start to finish, several problematic practices can be mitigated. Further, proper practices in disseminating the results of a study can greatly improve the quality of the literature. Within this review, we have included a number of reminders and statistical questions researchers should answer throughout the scientific process. Rather than treat statistics as a strict rule-following procedure, we hope that readers will use this review to stimulate a discussion around their current practices and attempt to improve them. The code to reproduce all analyses and figures within the manuscript can be found at https://doi.org/10.17605/OSF.IO/BQGDH.
Project description: Background: Even though real-time PCR has been broadly applied in the biomedical sciences, data processing procedures for the analysis of quantitative real-time PCR are still lacking, specifically in the realm of appropriate statistical treatment. Confidence interval and statistical significance considerations are not explicit in many of the current data analysis approaches. Based on the standard curve method and other useful data analysis methods, we present and compare four statistical approaches and models for the analysis of real-time PCR data. Results: In the first approach, a multiple regression analysis model was developed to derive DeltaDeltaCt from estimation of the interaction of gene and treatment effects. In the second approach, an ANCOVA (analysis of covariance) model was proposed, from which DeltaDeltaCt can be derived by analysis of the effects of variables. The other two models involve calculation of DeltaCt followed by a two-group t-test and the analogous non-parametric Wilcoxon test, respectively. SAS programs were developed for all four models, and data output for the analysis of a sample set is presented. In addition, a data quality control model was developed and implemented using SAS. Conclusion: Practical statistical solutions with SAS programs were developed for real-time PCR data, and a sample dataset was analyzed with the SAS programs. The analyses using the various models and programs yielded similar results. The data quality control and analysis procedures presented here provide statistical elements for the estimation of the relative expression of genes using real-time PCR.
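As a concrete illustration of the DeltaCt model, the sketch below computes per-sample DeltaCt values, applies the two-group t-test and the Wilcoxon rank-sum test, and derives DeltaDeltaCt. The paper's implementations are SAS programs, not this code, and the Ct values here are invented.

```python
# Minimal sketch of the DeltaCt + two-group t-test model (illustrative data;
# the paper implements these analyses in SAS, not Python).
import numpy as np
from scipy import stats

# Ct values for the target and reference genes, per sample
ct_target_control = np.array([24.1, 24.5, 23.9, 24.3])
ct_ref_control    = np.array([16.2, 16.4, 16.1, 16.3])
ct_target_treated = np.array([22.0, 22.4, 21.8, 22.1])
ct_ref_treated    = np.array([16.3, 16.1, 16.2, 16.4])

# DeltaCt = Ct(target) - Ct(reference), computed per sample
dct_control = ct_target_control - ct_ref_control
dct_treated = ct_target_treated - ct_ref_treated

# Two-group t-test on DeltaCt; the Wilcoxon rank-sum test is the
# non-parametric analogue
t_stat, p_t = stats.ttest_ind(dct_treated, dct_control)
w_stat, p_w = stats.ranksums(dct_treated, dct_control)

# DeltaDeltaCt and the relative expression ratio 2^(-DeltaDeltaCt)
ddct = dct_treated.mean() - dct_control.mean()
print(f"DeltaDeltaCt = {ddct:.2f}, fold change = {2 ** -ddct:.2f}")
print(f"t-test p = {p_t:.4f}, Wilcoxon p = {p_w:.4f}")
```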
Project description: Randomized controlled trials (RCTs) are often underpowered for validating gene-treatment interactions. Using published data from the Diabetes Prevention Program (DPP), we examined power in conventional and genotype-based recall (GBR) trials. We calculated sample size and statistical power for gene-metformin interactions (vs. placebo) using the incidence rates, gene-drug interaction effect estimates and allele frequencies reported in the DPP for the rs8065082 SLC47A1 variant, a locus encoding a metformin transporter. We then calculated statistical power for interactions between genetic risk scores (GRS), metformin treatment and intensive lifestyle intervention (ILI) given a range of sampling frames, clinical trial sample sizes, interaction effect estimates, and allele frequencies; outcomes were type 2 diabetes incidence (time-to-event) and change in small LDL particles (continuous outcome). Thereafter, we compared two recruitment frameworks: GBR (participants recruited from the extremes of a GRS distribution) and conventional sampling (participants recruited without explicit emphasis on genetic characteristics). We further examined the influence of outcome measurement error on statistical power. Under most simulated scenarios, GBR trials have substantially higher power to observe gene-drug and gene-lifestyle interactions than same-sized conventional RCTs. GBR trials are becoming popular for the validation of gene-treatment interactions; our analyses illustrate the strengths and weaknesses of this design.
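As a rough illustration of the comparison, the following hypothetical Monte Carlo sketch (not the authors' code) contrasts power to detect a GRS-treatment interaction under conventional sampling versus recruitment from the extremes of the GRS distribution; the effect sizes, sample sizes and linear outcome model are all assumptions.

```python
# Hypothetical power simulation: conventional sampling vs. genotype-based
# recall (GBR) for a GRS-by-treatment interaction on a continuous outcome.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def simulate_power(n=500, beta_int=0.3, gbr=False, n_sims=1000, alpha=0.05):
    hits = 0
    for _ in range(n_sims):
        # Screen a larger pool for GBR, then recruit from the GRS extremes
        pool = rng.standard_normal(n * 5 if gbr else n)
        if gbr:
            pool.sort()
            grs = np.concatenate([pool[: n // 2], pool[-(n - n // 2):]])
        else:
            grs = pool
        treat = rng.integers(0, 2, n)           # 1:1 randomization
        y = (0.2 * grs - 0.4 * treat + beta_int * grs * treat
             + rng.standard_normal(n))
        X = sm.add_constant(np.column_stack([grs, treat, grs * treat]))
        p_interaction = sm.OLS(y, X).fit().pvalues[3]
        hits += p_interaction < alpha
    return hits / n_sims

print("conventional power:", simulate_power(gbr=False))
print("GBR power:         ", simulate_power(gbr=True))
```

Because the GBR arm recruits only from the tails, the variance of the GRS (and hence of the interaction term) is larger, which is what drives the power gain in this toy setup.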
Project description: Testing and verification of the interfaces between software components are particularly important owing to the large number of complex interactions involved, which requires traditional modeling languages to overcome their existing shortcomings in describing temporal information and controlling test inputs. This paper presents the real-time extended interface automaton (RTEIA), which adds clearer and more detailed temporal information through the use of timed words. We also establish an input interface automaton for every input, in order to handle input control and interface coverage flexibly when applied in software testing. Detailed definitions of the RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified by testing a real aircraft braking system.
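To make the idea concrete, here is a minimal, hypothetical Python sketch of an input interface automaton whose transitions are guarded by admissible time windows. The RTEIA formalism in the paper is substantially richer; all names and the braking-system example below are illustrative only.

```python
# Hypothetical sketch of a timed input-interface automaton: each transition
# accepts an action only within its time window (guard).
from dataclasses import dataclass, field

@dataclass
class Transition:
    src: str
    action: str        # input/output action label
    guard: tuple       # admissible time window (lo, hi) for the action
    dst: str

@dataclass
class InputInterfaceAutomaton:
    initial: str
    transitions: list = field(default_factory=list)

    def step(self, state: str, action: str, t: float):
        """Return the successor state if (action, t) satisfies a guard."""
        for tr in self.transitions:
            lo, hi = tr.guard
            if tr.src == state and tr.action == action and lo <= t <= hi:
                return tr.dst
        return None  # action rejected: outside its timed interface

# Illustrative example: a brake command must follow the request within 0.5 s
aut = InputInterfaceAutomaton("idle", [
    Transition("idle", "brake_req", (0.0, float("inf")), "waiting"),
    Transition("waiting", "brake_cmd", (0.0, 0.5), "braking"),
])
print(aut.step("waiting", "brake_cmd", 0.3))  # -> "braking"
print(aut.step("waiting", "brake_cmd", 0.9))  # -> None (too late)
```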
Project description: The physicochemical properties of molecular crystals, such as solubility, stability, compactability, melting behaviour and bioavailability, depend on their crystal form [1]. In silico crystal form selection has recently come much closer to realization because of the development of accurate and affordable free-energy calculations [2-4]. Here we redefine the state of the art, primarily by improving the accuracy of free-energy calculations, constructing a reliable experimental benchmark for solid-solid free-energy differences, quantifying statistical errors for the computed free energies and placing both hydrate crystal structures of different stoichiometries and anhydrate crystal structures on the same energy landscape, with defined error bars, as a function of temperature and relative humidity. The calculated free energies have standard errors of 1-2 kJ mol⁻¹ for industrially relevant compounds, and the method to place crystal structures with different hydrate stoichiometries on the same energy landscape can be extended to other multi-component systems, including solvates. These contributions reduce the gap between the needs of the experimentalist and the capabilities of modern computational tools, transforming crystal structure prediction into a more reliable and actionable procedure that can be used in combination with experimental evidence to direct crystal form selection and establish control [5].
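As a back-of-the-envelope illustration of how relative humidity enters such a landscape, the sketch below shifts a hydrate-anhydrate free-energy difference by the water chemical-potential term RT ln(RH/100). The single-term treatment and all numbers are assumptions for illustration, not the paper's method.

```python
# Illustrative only: humidity dependence of a hydrate-vs-anhydrate
# free-energy difference via the water chemical potential (ideal-gas form).
import numpy as np

R = 8.314e-3  # gas constant, kJ/(mol K)

def hydrate_vs_anhydrate(dG_solid, n_water, T, rh):
    """dG_solid: G(hydrate) - G(anhydrate) - n*G(water) at 100% RH, kJ/mol.
    Lower humidity destabilizes the hydrate by -n*R*T*ln(RH/100) > 0."""
    return dG_solid - n_water * R * T * np.log(rh / 100.0)

T = 298.15  # K
for rh in (10, 50, 90):
    dG = hydrate_vs_anhydrate(dG_solid=-1.0, n_water=1, T=T, rh=rh)
    print(f"{rh:3d}% RH: {dG:+.2f} kJ/mol")
```

In this toy picture the monohydrate (assumed 1 kJ/mol more stable at 100% RH) becomes unstable below roughly 70% RH, which is the kind of crossover such landscapes make visible.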
Project description: Monolithic quartz crystal microbalance (MQCM) has recently emerged as a very promising technology for biosensing applications. These devices consist of an array of miniaturized QCM sensors integrated within the same quartz substrate, capable of detecting multiple target analytes simultaneously. Their relevant benefits include high throughput, low cost per sensor unit, low sample/reagent consumption and fast sensing response. Despite the great potential of MQCM, unwanted environmental factors (e.g., temperature, humidity, vibrations, or pressure) and perturbations intrinsic to the sensor setup (e.g., mechanical stress exerted by the measurement cell or electronic noise of the characterization system) can affect sensor stability, masking the signal of interest and degrading the limit of detection (LoD). Here, we present a method based on the discrete wavelet transform (DWT) to improve the stability of the resonance frequency and dissipation signals in real time. The method takes advantage of the similarity among the noise patterns of the resonators integrated in an MQCM device to mitigate the disturbing factors that impact sensor response. The performance of the method is validated by studying the adsorption of proteins (neutravidin and biotinylated albumin) under externally controlled factors (temperature and pressure/flow rate) that simulate unwanted disturbances.
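A simplified sketch of the underlying idea, not the authors' implementation: decompose a sensing channel and a reference resonator with the DWT, then cancel the shared sub-band noise with a per-level least-squares gain. The signals, wavelet choice (db4) and decomposition depth below are all illustrative.

```python
# Illustrative DWT-based common-noise cancellation across two resonators
# sharing the same substrate (synthetic signals, not the paper's data).
import numpy as np
import pywt

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 1024)
common_noise = 0.5 * np.sin(2 * np.pi * 0.8 * t) + 0.2 * rng.standard_normal(1024)
signal = -1.0 / (1 + np.exp(-(t - 5) * 3))       # adsorption-like frequency shift
sensor = signal + common_noise                    # functionalized resonator
reference = common_noise + 0.05 * rng.standard_normal(1024)  # bare resonator

cs = pywt.wavedec(sensor, "db4", level=5)
cr = pywt.wavedec(reference, "db4", level=5)
cleaned_coeffs = []
for a, b in zip(cs, cr):
    # Scale each reference sub-band by its least-squares fit to the sensing
    # channel, so only the shared noise component is removed at that level
    gain = np.dot(a, b) / np.dot(b, b)
    cleaned_coeffs.append(a - gain * b)
cleaned = pywt.waverec(cleaned_coeffs, "db4")

print("residual RMS:", np.sqrt(np.mean((cleaned[:1024] - signal) ** 2)))
```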
Project description: The unraveling and control of protein stability at different temperatures is a fundamental problem in biophysics that remains far from being quantitatively and accurately solved, as it requires precise knowledge of the temperature dependence of amino acid interactions. In this paper we attempt to gain insight into the thermal stability of proteins by designing a tool to predict the full stability curve as a function of temperature for a set of 45 proteins belonging to 11 homologous families, given their sequence and structure, as well as the melting temperature (Tm) and the change in heat capacity (ΔCP) of proteins belonging to the same family. Stability curves constitute a fundamental instrument for analyzing thermal stability in detail, relating it to thermodynamic stability, and estimating the enthalpic and entropic contributions to the folding free energy. In summary, our approach for predicting protein stability curves relies on temperature-dependent statistical potentials derived from three datasets of protein structures with targeted thermal stability properties. Using these potentials, the folding free energies (ΔG) at three different temperatures were computed for each protein. The Gibbs-Helmholtz equation was then used to predict the protein's stability curve as the curve that best fits these three points. The results are quite encouraging: the standard deviations between the experimental and predicted values of Tm, ΔCP and the room-temperature folding free energy (ΔG25) are 13 °C, 1.3 kcal/(mol °C) and 4.1 kcal/mol, respectively, in cross-validation. The main sources of error and some possible improvements and perspectives are briefly discussed.
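A minimal sketch of the curve-fitting step the abstract describes: solve the Gibbs-Helmholtz form for ΔHm, Tm and ΔCP from three folding free energies. The input values below are invented; note that the folding (rather than unfolding) convention makes ΔHm and ΔCP negative.

```python
# Sketch of the Gibbs-Helmholtz fitting step: determine dHm, Tm and dCp so
# the stability curve passes through three computed folding free energies.
import numpy as np
from scipy.optimize import fsolve

def gibbs_helmholtz(T, dHm, Tm, dCp):
    """dG_fold(T) = dHm*(1 - T/Tm) + dCp*(T - Tm - T*ln(T/Tm)), T in kelvin."""
    return dHm * (1 - T / Tm) + dCp * (T - Tm - T * np.log(T / Tm))

# Folding free energies (kcal/mol) computed at three temperatures (K);
# invented values for illustration
T_pts = np.array([278.15, 298.15, 318.15])
dG_pts = np.array([-9.3, -7.3, -3.3])

# Three unknowns, three points: solve the system exactly
dHm, Tm, dCp = fsolve(lambda p: gibbs_helmholtz(T_pts, *p) - dG_pts,
                      x0=(-80.0, 320.0, -1.0))
print(f"Tm = {Tm - 273.15:.1f} C, dHm = {dHm:.1f} kcal/mol, "
      f"dCp = {dCp:.2f} kcal/(mol K)")
```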
Project description: Introduction: Estimating treatment effects as time savings in disease progression may be more easily interpretable than assessing the absolute difference or a percentage reduction. In this study, we investigate the statistical considerations of the existing method for estimating time savings and propose alternative, complementary methods. Methods: We propose five alternative methods that estimate the time savings from different perspectives. These methods are applied to simulated clinical trial data that mimic or modify the Clinical Dementia Rating Sum of Boxes progression trajectories observed in the Clarity AD lecanemab trial. Results: Our study demonstrates that the proposed methods can generate more precise estimates by considering two crucial factors: (1) the absolute difference between treatment arms, and (2) the observed progression rate in the treatment arm. Discussion: Quantifying treatment effects as time savings in disease progression offers distinct advantages. To provide comprehensive estimations, it is important to use various methods. Highlights: We explore the statistical considerations of the current method for estimating time savings. We propose alternative methods that provide time savings estimations based on the observed absolute differences. By using various methods, a more comprehensive estimation of time savings can be achieved.
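For intuition, the simplest estimator in this spirit divides the observed absolute between-arm difference by the progression rate observed in the treatment arm. The sketch below is purely illustrative, with hypothetical numbers, and is not one of the paper's five methods.

```python
# Illustrative sketch: time saved = absolute between-arm difference divided
# by the treated arm's observed progression rate (hypothetical numbers).
def time_saved_months(between_arm_diff, treated_rate_per_month):
    """Months of disease progression 'saved' by treatment."""
    return between_arm_diff / treated_rate_per_month

# e.g., a 0.45-point CDR-SB difference at 18 months, with the treated arm
# progressing 1.21 points over those 18 months
print(f"{time_saved_months(0.45, 1.21 / 18.0):.1f} months")  # ~6.7
```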
Project description: We study the relationships between the real-time psychophysiological activity of professional traders, their financial transactions, and market fluctuations. We collected multiple physiological signals, such as heart rate, blood volume pulse, and electrodermal activity, from 55 traders at a leading global financial institution during their normal working hours over a five-day period. Using their physiological measurements, we implemented a novel metric of a trader's “psychophysiological activation” to capture affect such as excitement, stress and irritation. We find statistically significant relations between traders' psychophysiological activation levels and factors such as their financial transactions, market fluctuations, the type of financial products they traded, and their trading experience. We conducted post-measurement interviews with traders who participated in this study to obtain additional insights into the key factors driving their psychophysiological activation during financial risk processing. Our work illustrates that psychophysiological activation plays a prominent role in financial risk processing for professional traders.
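As a purely hypothetical illustration of a composite activation index (the study defines its own metric from these signals), one might z-score each physiological channel within a trader's session and average them:

```python
# Hypothetical composite "activation" index; the paper's metric differs,
# so this per-session z-score average is only illustrative.
import numpy as np

def activation_index(heart_rate, bvp_amplitude, eda):
    """Z-score each channel within a session and average them (flipping
    BVP amplitude, which decreases under sympathetic arousal)."""
    z = lambda x: (x - np.mean(x)) / np.std(x)
    return (z(heart_rate) - z(bvp_amplitude) + z(eda)) / 3.0

hr  = np.array([62.0, 64.0, 71.0, 80.0, 76.0])   # beats/min
bvp = np.array([1.00, 0.90, 0.70, 0.50, 0.60])   # relative amplitude
eda = np.array([0.20, 0.25, 0.50, 0.90, 0.70])   # microsiemens
print(activation_index(hr, bvp, eda))
```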
Project description: Gene expression is tightly regulated in space and time. To dissect this process with high temporal resolution, we introduce an optogenetic tool termed BLInCR (Blue Light-Induced Chromatin Recruitment) that combines rapid and reversible light-dependent recruitment of effector proteins with a real-time readout for transcription. We used BLInCR to control the activity of a reporter gene cluster in the human osteosarcoma cell line U2OS by reversibly recruiting the viral transactivator VP16. RNA production was detectable ~2 minutes after VP16 recruitment and readily decreased when VP16 dissociated from the cluster in the absence of light. Quantitative assessment of the activation process revealed biphasic activation kinetics with a pronounced early phase in cells treated with the histone deacetylase inhibitor SAHA. Comparison with kinetic models for transcription activation suggests that the gene cluster undergoes a maturation process when activated. We anticipate that BLInCR will facilitate the study of transcription dynamics in living cells.
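For readers who want a concrete starting point, the sketch below fits a generic biphasic (two-exponential) activation curve of the kind such kinetic comparisons involve; the functional form, parameters and synthetic data are assumptions, not the study's model.

```python
# Illustrative fit of a generic biphasic activation curve (fast + slow
# saturating phases) to synthetic data; not the study's kinetic model.
import numpy as np
from scipy.optimize import curve_fit

def biphasic(t, a1, k1, a2, k2):
    """Sum of a fast and a slow saturating activation phase."""
    return a1 * (1 - np.exp(-k1 * t)) + a2 * (1 - np.exp(-k2 * t))

t = np.linspace(0, 60, 30)                 # minutes after recruitment
rng = np.random.default_rng(2)
y = biphasic(t, 0.6, 0.8, 0.4, 0.05) + 0.02 * rng.standard_normal(t.size)

popt, _ = curve_fit(biphasic, t, y, p0=(0.5, 0.5, 0.5, 0.1))
print("fast phase rate:", round(popt[1], 2), "per min;",
      "slow phase rate:", round(popt[3], 3), "per min")
```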