Project description: Few if any natural resource systems are completely understood and fully observed. Instead, there is almost always uncertainty about the way a system works and its status at any given time, which can limit effective management. A natural response to uncertainty is to allocate time and effort to the collection of additional data, on the reasonable assumption that more information will facilitate better understanding and lead to better management. But collecting more data, whether through observation or investigation, takes time and effort that could often be put to other conservation activities. An important question is whether the use of limited resources to improve understanding is justified by the resulting potential for improved management. In this paper we directly address the change in management value expected from new information collected through investigation. We frame the value of information in terms of learning through the management process itself, as well as learning through investigations that are external to the management process but add to our base of understanding. We provide a conceptual framework and metrics for this issue, and illustrate them with examples involving Florida scrub-jays (Aphelocoma coerulescens).
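The trade-off this abstract describes, spending limited resources on learning versus acting now, can be made concrete with a toy expected-value-of-perfect-information (EVPI) calculation. The two-action setup and all numbers below are hypothetical illustrations, not the paper's scrub-jay examples or its specific metrics.

```python
import numpy as np

# Hypothetical belief over two competing models of how the system works.
p_state = np.array([0.6, 0.4])

# payoff[action, state]: conservation benefit of each action under each model.
payoff = np.array([[10.0, 2.0],   # action A
                   [4.0, 8.0]])   # action B

# Value of acting now under current uncertainty: pick the action with the
# best expected payoff.
v_current = (payoff @ p_state).max()

# Value with perfect information: learn which model is true first, then pick
# the best action for that model, averaged over models.
v_perfect = (payoff.max(axis=0) * p_state).sum()

# EVPI is an upper bound on what any investigation could be worth here.
evpi = v_perfect - v_current
```

If resolving the uncertainty would cost more than `evpi` in diverted conservation effort, the investigation is not worthwhile even in the best case.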
Project description: We consider how a signalling system can act as an information hub by multiplexing information arising from multiple signals. We formally define multiplexing, and mathematically characterise which systems can multiplex and how well they can do so. While the results of this paper are theoretical, to motivate the idea of multiplexing we provide experimental evidence tentatively suggesting that the NF-κB transcription factor can multiplex information about changes in multiple signals. We believe that our theoretical results may resolve the apparent paradox of how a system like NF-κB, which regulates cell fate and inflammatory signalling in response to diverse stimuli, can appear to have the low information-carrying capacity suggested by recent studies on scalar signals. In carrying out our study, we introduce new methods for the analysis of large, nonlinear stochastic dynamic models, develop computational algorithms that facilitate the calculation of fundamental constructs of information theory such as Kullback-Leibler divergences and sensitivity matrices, and link these methods to a new theory about multiplexing information. We show that many current models, such as those of the NF-κB system, cannot multiplex effectively, and we provide models that overcome this limitation using post-transcriptional modifications.
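A basic building block behind Kullback-Leibler calculations of this kind is the closed-form KL divergence between two multivariate Gaussians, which applies whenever a stochastic model's output distribution is approximated as Gaussian (e.g., via a linear-noise approximation). The sketch below is a generic illustration of that formula, not the paper's algorithms; the example distributions are made up.

```python
import numpy as np

def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ) in nats, via the closed form."""
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(cov1_inv @ cov0)          # tr(Sigma1^-1 Sigma0)
    term_quad = diff @ cov1_inv @ diff               # Mahalanobis term
    term_logdet = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (term_trace + term_quad - k + term_logdet)

# Hypothetical 2-D output distributions of two model variants.
mu0 = np.zeros(2)
cov0 = np.eye(2)
mu1 = np.array([1.0, 0.0])
cov1 = 2.0 * np.eye(2)
kl = gaussian_kl(mu0, cov0, mu1, cov1)
```

Larger KL between the response distributions induced by different inputs corresponds to more distinguishable, and hence more informative, responses.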
Project description: Background: The expected value of sample information (EVSI) calculates the value of collecting additional information through a research study with a given design. However, standard EVSI analyses do not account for the slow and often incomplete implementation of the treatment recommendations that follow research; thus, they do not correctly capture the value of the study. Previous research has developed measures of research value that adjust for implementation challenges, but estimating these measures is itself a challenge.
Methods: Based on a method that assumes the implementation level is related to the strength of evidence in favor of the treatment, 2 implementation-adjusted EVSI calculation methods are developed. These novel methods circumvent the need for analytical calculations, which were restricted to settings in which normality could be assumed. The first method uses computationally demanding nested simulations, based on the definition of the implementation-adjusted EVSI. The second adapts the moment matching method, a recently developed efficient EVSI computation method, to adjust for imperfect implementation. The implementation-adjusted EVSI is then calculated with the 2 methods across 3 examples.
Results: The maximum difference between the 2 methods is at most 6% in all examples. The efficient computation method is between 6 and 60 times faster than the nested simulation method in this case study and could be used in practice.
Conclusions: This article permits the calculation of an implementation-adjusted EVSI using realistic assumptions. The efficient estimation method is accurate and can estimate the implementation-adjusted EVSI in practice. By adapting standard EVSI estimation methods, adjustments for imperfect implementation can be made at the same computational cost as a standard EVSI analysis.
Highlights:
- Standard expected value of sample information (EVSI) analyses do not account for the fact that treatment implementation following research is often slow and incomplete, meaning they incorrectly capture the value of the study.
- Two methods, based on nested Monte Carlo sampling and on the moment matching EVSI calculation method, are developed to adjust EVSI calculations for imperfect implementation when the speed and level of implementation of a new treatment depend on the strength of evidence in favor of the treatment.
- The 2 methods we develop provide similar estimates for the implementation-adjusted EVSI.
- Our methods extend current EVSI calculation algorithms and thus require limited additional computational complexity.
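To make the nested Monte Carlo idea concrete, here is a deliberately simplified sketch for a conjugate-normal model of our own devising, not the article's case studies. We take theta to be the incremental net benefit (INB) of a new treatment and, echoing the article's premise that implementation tracks strength of evidence, assume (our assumption) that the post-study implementation level equals the posterior probability that theta > 0.

```python
import math
import numpy as np

def p_positive(mu, sd):
    """P(theta > 0) for theta ~ N(mu, sd^2)."""
    return 0.5 * (1.0 + math.erf(mu / (sd * math.sqrt(2.0))))

rng = np.random.default_rng(1)
mu0, sd0 = 0.5, 2.0        # prior on INB (hypothetical)
s, n = 4.0, 50             # sampling sd and proposed study size
n_outer = 5000             # outer loop: number of simulated study results

prec0, prec_lik = 1.0 / sd0**2, n / s**2
prec_post = prec0 + prec_lik
sd_post = 1.0 / math.sqrt(prec_post)

# Value of deciding now, implementing in proportion to current evidence.
v_now = p_positive(mu0, sd0) * mu0

# Outer loop: draw theta from the prior, simulate the study mean
# xbar ~ N(theta, s^2/n), and weight the posterior-mean INB by the
# implementation level it induces. (In this conjugate model the inner
# expectation is analytic, so no inner sampling loop is needed.)
theta = rng.normal(mu0, sd0, n_outer)
xbar = rng.normal(theta, s / math.sqrt(n))
mu_post = (prec0 * mu0 + prec_lik * xbar) / prec_post
rho_post = np.array([p_positive(m, sd_post) for m in mu_post])
evsi_impl = np.mean(rho_post * mu_post) - v_now
```

In non-conjugate models the posterior quantities would themselves require an inner simulation per outer draw, which is exactly the computational burden the moment matching adaptation is designed to avoid.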
Project description: Objective: This systematic review aimed to: (1) determine the prevalence and scope of existing research on human resource information systems (HRIS) in health organizations; (2) analyze, classify, and synthesize evidence on the processes and impacts of HRIS development, implementation, and adoption; and (3) generate recommendations for HRIS research, practice, and policy, with reference to the needs of different stakeholders.
Methods: A structured search strategy was used to interrogate 10 electronic databases indexing research from the health, social, management, technology, and interdisciplinary sciences, alongside gray literature sources and reference lists of qualifying studies. There were no restrictions on language or publication year. Two reviewers screened publications, extracted data, and coded findings according to the innovation stages covered in the studies. The Critical Appraisal Skills Program checklist was adopted to assess study quality. The process of study selection was charted using a Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) diagram.
Results: Of the 6824 publications identified by the search strategy, 68, covering 42 studies, were included in the final analysis. Research on HRIS in health was interdisciplinary, often atheoretical, conducted primarily in the hospital sector of high-income economies, and largely focused uncritically on use and realized benefits.
Discussion and conclusions: While studies of HRIS in health exist, the overall lack of evaluative research leaves unanswered questions about the capacity of HRIS to improve quality and efficiency and to enable learning health systems, as well as about how sociotechnical complexity influences implementation and effectiveness. We offer this analysis to decision makers and managers considering or currently implementing an HRIS, and make recommendations for further research.
Trial registration: International Prospective Register of Systematic Reviews (PROSPERO): CRD42015023581, http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42015023581#.VYu1BPlVjDU
Project description: Background: The purpose of external validation of a risk prediction model is to evaluate its performance before recommending it for use in a new population. Sample size calculations for such validation studies are currently based on classical inferential statistics around metrics of discrimination, calibration, and net benefit (NB). For NB as a measure of clinical utility, the relevance of inferential statistics is doubtful. Value-of-information methodology enables quantifying the value of collecting validation data in terms of expected gain in clinical utility.
Methods: We define the validation expected value of sample information (EVSI) as the expected gain in NB from procuring a validation sample of a given size. We propose 3 algorithms for EVSI computation and compare their face validity and computation time in simulation studies. In a case study, we use the non-US subset of a clinical trial to create a risk prediction model for short-term mortality after myocardial infarction and calculate validation EVSI at a range of sample sizes for the US population.
Results: The computation methods generated similar EVSI values in simulation studies, although they differed in numerical accuracy and computation times. At a 2% risk threshold, procuring 1,000 observations for external validation had an EVSI of 0.00101 in true-positive units or 0.04938 in false-positive units. Scaled by heart attack incidence in the United States, the population EVSI was 806 true positives gained, or 39,500 false positives averted, annually. Validation studies with >4,000 observations had diminishing returns, as the EVSIs approached their maximum possible value.
Conclusion: Value-of-information methodology quantifies the return on investment from conducting an external validation study and can provide a value-based perspective when designing such studies.
Highlights:
- In external validation studies of risk prediction models, the finite size of the validation sample leads to uncertain conclusions about the performance of the model. This uncertainty has hitherto been approached from a classical inferential perspective (e.g., a confidence interval around the c-statistic).
- Correspondingly, sample size calculations for validation studies have been based on classical inferential statistics. For measures of clinical utility such as net benefit, the relevance of this approach is doubtful.
- This article defines the expected value of sample information (EVSI) for model validation and suggests algorithms for its computation. Validation EVSI quantifies the return on investment from conducting a validation study.
- Value-based approaches rooted in decision theory can complement contemporary study design and sample size calculation methods in predictive analytics.
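The quantity validation EVSI is built on is net benefit at a risk threshold t, NB = TP/N - FP/N * t/(1-t). The sketch below computes NB for a model on one validation sample; the data and the model's predicted risks are simulated and purely illustrative (the article's case study uses trial data).

```python
import numpy as np

rng = np.random.default_rng(7)
N, t = 5000, 0.02                      # validation size, 2% risk threshold

# Simulated outcomes at roughly 3% prevalence, and hypothetical predicted
# risks that are modestly higher for events than for non-events.
y = rng.binomial(1, 0.03, N)
risk = np.clip(0.03 + 0.02 * y + rng.normal(0.0, 0.01, N), 1e-4, 1 - 1e-4)

treat = risk >= t                       # model-guided treatment decisions
tp = np.sum(treat & (y == 1))           # true positives
fp = np.sum(treat & (y == 0))           # false positives
nb_model = tp / N - fp / N * t / (1 - t)

# Default strategies the model must beat: treat all, or treat none (NB = 0).
nb_treat_all = y.mean() - (1 - y.mean()) * t / (1 - t)
```

Sampling variability in `nb_model` across hypothetical validation samples is what validation EVSI prices: the expected gain in NB from resolving that uncertainty with a sample of a given size.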
Project description: Stochasticity inherent to biochemical reactions (intrinsic noise) and variability in cellular states (extrinsic noise) degrade information transmitted through signaling networks. We analyzed the ability of temporal signal modulation, that is, dynamics, to reduce noise-induced information loss. In the extracellular signal-regulated kinase (ERK), calcium (Ca2+), and nuclear factor kappa-B (NF-κB) pathways, response dynamics resulted in significantly greater information transmission capacities than nondynamic responses. Theoretical analysis demonstrated that signaling dynamics has a key role in overcoming extrinsic noise. Experimental measurements of information transmission in the ERK network under varying signal-to-noise levels confirmed our predictions and showed that signaling dynamics mitigate, and can potentially eliminate, extrinsic noise-induced information loss. By curbing the information-degrading effects of cell-to-cell variability, dynamic responses substantially increase the accuracy of biochemical signaling networks.
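Statements about "information transmission capacity" rest on estimating the mutual information between stimulus and response. A common baseline is the plug-in estimator on discretized responses, I(S;R) = H(R) - H(R|S); the two-stimulus Gaussian channel below is a toy stand-in for real single-cell data, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per = 20000
stim = np.repeat([0, 1], n_per)            # two equiprobable stimuli
resp = rng.normal(2.0 * stim, 1.0)         # noisy scalar response (toy)

bins = np.linspace(-4.0, 6.0, 41)

def entropy_bits(counts):
    """Shannon entropy (bits) of a histogram's empirical distribution."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

h_r = entropy_bits(np.histogram(resp, bins)[0])
h_r_given_s = 0.5 * (entropy_bits(np.histogram(resp[stim == 0], bins)[0])
                     + entropy_bits(np.histogram(resp[stim == 1], bins)[0]))
mi_bits = h_r - h_r_given_s                # plug-in mutual information
```

Replacing the scalar `resp` with a vector of time points (a response trajectory) is, in essence, how dynamic responses can carry more information than static ones, at the cost of a harder estimation problem.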
Project description: The expected value of sample information (EVSI) can be used to prioritize avenues for future research and design studies that support medical decision making and offer value for money spent. EVSI is calculated based on 3 key elements. Two of these, a probabilistic model-based economic evaluation and updating model uncertainty based on simulated data, have been frequently discussed in the literature. By contrast, the third element, simulating data from the proposed studies, has received little attention. This tutorial contributes to bridging this gap by providing a step-by-step guide to simulating study data for EVSI calculations. We discuss a general-purpose algorithm for simulating data and demonstrate its use to simulate 3 different outcome types. We then discuss how to induce correlations in the generated data, how to adjust for common issues in study implementation such as missingness and censoring, and how individual patient data from previous studies can be leveraged to undertake EVSI calculations. For all examples, we provide comprehensive code written in the R language and, where possible, Excel spreadsheets in the supplementary materials. This tutorial facilitates practical EVSI calculations and allows EVSI to be used to prioritize research and design studies.
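The data-simulation element can be illustrated with a minimal beta-binomial example of our own (the tutorial itself covers richer outcome types, in R): draw the response probability from its prior for each probabilistic-analysis iteration, simulate the proposed study's data, then update by conjugacy.

```python
import numpy as np

rng = np.random.default_rng(42)
a0, b0 = 4, 16          # Beta prior on the response probability p (toy values)
n_trial = 100           # proposed study size
n_sim = 1000            # probabilistic-analysis iterations

p = rng.beta(a0, b0, n_sim)                   # 1: draw parameters from the prior
x = rng.binomial(n_trial, p)                  # 2: simulate study data per draw
post_mean = (a0 + x) / (a0 + b0 + n_trial)    # 3: conjugate Bayesian update
```

Each `post_mean[i]` summarizes what would be believed after one possible realization of the study; feeding these updated beliefs into the economic model and averaging is what turns simulated data into an EVSI estimate.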
Project description: Humans shift their gaze to a new location several times per second. It is still unclear what determines where they look next. Fixation behavior is influenced by the low-level salience of the visual stimulus, such as luminance, contrast, and color, but also by high-level task demands and prior knowledge. Under natural conditions, different sources of information might conflict with each other and have to be combined. In our paradigm, we trade off visual salience against expected value. We show that both salience and value information influence the saccadic end point within an object, but with different time courses. The relative weights of salience and value are not constant but vary from eye movement to eye movement, depending critically on the availability of the value information at the time when the saccade is programmed. Short-latency saccades are determined mainly by salience, but value information is taken into account for long-latency saccades. We present a model that describes these data by dynamically weighting and integrating detailed topographic maps of visual salience and value. These results support the notion of independent neural pathways for the processing of visual information and value.
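The dynamic-weighting idea can be sketched in one dimension: read the saccade end point off a weighted sum of a salience map and a value map, with the value weight growing with saccade latency. This is our own toy rendering of the scheme; the map shapes, time constant, and latencies are illustrative, not the paper's fitted model.

```python
import numpy as np

x = np.linspace(-1.0, 1.0, 201)                # position within the object
salience = np.exp(-((x + 0.3) ** 2) / 0.02)    # salient feature left of center
value = np.exp(-((x - 0.4) ** 2) / 0.02)       # valuable region right of center

def endpoint(latency_ms, tau=150.0):
    """Saccade end point for a given latency (hypothetical weighting rule)."""
    w_value = 1.0 - np.exp(-latency_ms / tau)  # value info accrues over time
    combined = (1.0 - w_value) * salience + w_value * value
    return x[np.argmax(combined)]

early = endpoint(60)    # short latency: dominated by salience
late = endpoint(400)    # long latency: dominated by value
```

With these toy maps, short-latency saccades land near the salient peak and long-latency saccades near the valuable one, mirroring the time-course dissociation reported in the abstract.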
Project description: In dynamic environments, split-second sensorimotor decisions must be prioritized according to potential payoffs to maximize overall rewards. The impact of relative value on deliberative perceptual judgments has been examined extensively [1-6], but relatively little is known about value-biasing mechanisms in the common situation where physical evidence is strong but the time to act is severely limited. In prominent decision models, a noisy but statistically stationary representation of sensory evidence is integrated over time to an action-triggering bound, and value biases are implemented by starting the integrator closer to the more valuable bound. Here, we show significant departures from this account for humans making rapid sensory-instructed action choices. Behavior was best explained by a simple model in which the evidence representation, and hence the rate of accumulation, is itself biased by value and is non-stationary, increasing over the short decision time frame. Because the value bias initially dominates, the model uniquely predicts a dynamic "turn-around" effect on low-value cues, where the accumulator first launches toward the incorrect action but is then re-routed to the correct one. This effect was clearly exhibited in electrophysiological signals reflecting motor preparation and evidence accumulation. Finally, we construct an extended model that implements this dynamic effect through plausible sensory neural response modulations and demonstrate the correspondence between decision signal dynamics simulated from a behavioral fit of that model and the empirical decision signals. Our findings suggest that value and sensory information can exert simultaneous and dynamically countervailing influences on the trajectory of the accumulation-to-bound process, driving rapid, sensory-guided actions.
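The "turn-around" prediction falls out of a drift rate that starts value-dominated and grows with sensory evidence. The noise-free sketch below shows the mean accumulator trajectory on a low-value-cue trial; all parameter values are illustrative, not fitted values from the paper.

```python
import numpy as np

dt, T = 0.001, 0.4                    # time step and trial duration (s)
t = np.arange(0.0, T, dt)

value_bias = -0.8                     # low-value cue pushes toward the wrong bound
ramp_rate = 20.0                      # sensory evidence strength grows over time
drift = value_bias + ramp_rate * t    # momentary drift, signed toward correct

# Mean (noise-free) accumulator path: initially heads toward the incorrect
# action, then reverses once sensory evidence overtakes the value bias.
mean_path = np.cumsum(drift * dt)
turn_idx = np.argmin(mean_path)       # index of the "turn-around" point
```

The mean path dips below zero (toward the incorrect bound) before recovering and ending positive, which is the qualitative signature reported in the motor-preparation signals.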