Project description:In vaccine trials, the vaccination of one person might prevent the infection of another; a distinction can be drawn between the ways such a protective effect might arise. Consider a setting with two persons per household in which one of the two is vaccinated. Vaccinating the first person may protect the second by preventing the first from being infected and passing the infection on. Alternatively, vaccinating the first person may protect the second by rendering the infection less contagious even if the first is infected. This latter mechanism is sometimes referred to as an "infectiousness effect" of the vaccine. Crude estimators of the infectiousness effect are subject to selection bias because they stratify on a postvaccination event, namely the infection status of the first person. Using theory on causal inference under interference together with a principal-stratification framework, we show that, although the crude estimator is biased, under plausible assumptions it is conservative for what one might define as a causal infectiousness effect. This holds both for selection bias arising from which persons enter the comparison and for selection related to pathogen virulence. We illustrate our results with an example from the literature.
Project description:Estimation of treatment effects in randomized studies is often hampered by possible selection bias induced by conditioning on or adjusting for a variable measured post-randomization. One approach to obviate such selection bias is to consider inference about treatment effects within principal strata, that is, principal effects. A challenge with this approach is that, without strong assumptions, principal effects are not identifiable from the observable data. In settings where such assumptions are dubious, identifiable large sample bounds may be the preferred target of inference. In practice these bounds may be wide and not particularly informative. In this work we consider whether bounds on principal effects can be improved by adjusting for a categorical baseline covariate. We consider adjusted bounds and show that they are never wider than the unadjusted bounds. Necessary and sufficient conditions are given under which the adjusted bounds are sharper (i.e., narrower) than the unadjusted bounds. The methods are illustrated using data from a recent, large study of interventions to prevent mother-to-child transmission of HIV through breastfeeding. Using a baseline covariate indicating low birth weight, the estimated adjusted bounds for the principal effect of interest are 63% narrower than the estimated unadjusted bounds.
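As one concrete instance of the mechanics described above, the sketch below uses Lee-type trimming bounds on a principal effect under randomization and selection-monotonicity assumptions, and mixes stratum-specific bounds over a categorical baseline covariate. This is an illustration only: the paper's estimands, estimators, and weighting scheme may differ, and all function names and data here are hypothetical.

```python
import numpy as np

def lee_bounds(z, s, y):
    """Large-sample trimming bounds (Lee/Zhang-Rubin style) on the principal
    effect E[Y(1) - Y(0) | S(0) = S(1) = 1], assuming randomized treatment z
    and monotonicity of the selection indicator s. Illustrative sketch only."""
    s1 = s[z == 1].mean()                   # P(S=1 | Z=1)
    s0 = s[z == 0].mean()                   # P(S=1 | Z=0)
    p = s0 / s1                             # always-selected share among treated-selected
    y1 = np.sort(y[(z == 1) & (s == 1)])    # treated, selected outcomes
    k = max(1, int(np.floor(p * len(y1))))  # observations kept after trimming
    y0_mean = y[(z == 0) & (s == 1)].mean()
    # Lower bound trims the top of the treated distribution, upper trims the bottom.
    return y1[:k].mean() - y0_mean, y1[-k:].mean() - y0_mean

def adjusted_bounds(z, s, y, x):
    """Mix stratum-specific bounds over a categorical baseline covariate x,
    weighting (up to normalization) by P(X=x) * P(S=1 | Z=0, X=x), i.e., the
    chance of landing in the always-selected stratum. In the population, such
    adjusted bounds are never wider than the unadjusted ones."""
    lows, highs, weights = [], [], []
    for v in np.unique(x):
        m = x == v
        lo, hi = lee_bounds(z[m], s[m], y[m])
        lows.append(lo)
        highs.append(hi)
        weights.append(m.mean() * s[m & (z == 0)].mean())
    w = np.array(weights) / np.sum(weights)
    return float(w @ np.array(lows)), float(w @ np.array(highs))
```

With all units selected (s identically 1) the trimming is vacuous and both bounds collapse to the simple difference in means; as the selection rates diverge between arms, the interval widens.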
Project description:SARS-CoV-2 breakthrough infections in vaccinated individuals and in those who had a prior infection have been observed globally, but the transmission potential of these infections is unknown. The RT-qPCR cycle threshold (Ct) value is inversely correlated with viral load and culturable virus. Here, we investigate differences in RT-qPCR Ct values across Qatar's national cohorts of primary infections, reinfections, BNT162b2 (Pfizer-BioNTech) breakthrough infections, and mRNA-1273 (Moderna) breakthrough infections. Our matched-cohort analyses of the randomly diagnosed infections show a higher mean Ct value in all cohorts of breakthrough infections than in the cohort of primary infections in unvaccinated individuals. The Ct value is 1.3 (95% CI: 0.9-1.8) cycles higher for BNT162b2 breakthrough infections, 3.2 (95% CI: 1.9-4.5) cycles higher for mRNA-1273 breakthrough infections, and 4.0 (95% CI: 3.5-4.5) cycles higher for reinfections in unvaccinated individuals. Since Ct value correlates inversely with SARS-CoV-2 infectiousness, these differences imply that vaccine breakthrough infections and reinfections are less infectious than primary infections in unvaccinated individuals. Public health benefits of vaccination may have been underestimated, as COVID-19 vaccines not only protect against acquisition of infection, but also appear to protect against transmission of infection.
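The matched-cohort contrast above can be sketched in miniature as a paired mean difference in Ct values with a bootstrap confidence interval. The data below are synthetic and purely illustrative (the 1.3-cycle shift echoes the BNT162b2 estimate); this is not the study's data, matching procedure, or estimator.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic matched pairs (illustrative only): Ct value for a breakthrough
# infection and for its matched unvaccinated primary infection.
ct_breakthrough = rng.normal(26.0, 5.0, size=300)
ct_primary = ct_breakthrough - rng.normal(1.3, 2.0, size=300)  # paired shift

diff = ct_breakthrough - ct_primary  # within-pair Ct differences
# Percentile bootstrap over pairs for a 95% CI on the mean difference.
boot = rng.choice(diff, size=(2000, diff.size), replace=True).mean(axis=1)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"Mean Ct difference {diff.mean():.2f} (95% bootstrap CI {lo:.2f} to {hi:.2f})")
```

Resampling pairs (rather than individuals) respects the matched design, which is the point of the sketch.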
Project description:While many results from the treatment-effect and related literatures are familiar and have been applied productively in health economics evaluations, other potentially useful results from those literatures have had little influence on health economics practice. With the intent of demonstrating the value and use of some of these results in health economics applications, this paper focuses on one particular class of parameters that describe probabilities that one outcome is larger or smaller than other outcomes ("inequality probabilities"). While the properties of such parameters have been exposited in the technical literature, they have scarcely been considered in informing practical questions in health evaluations. This paper shows how such probabilities can be used informatively, and describes how they might be identified or bounded informatively given standard sampling assumptions and information only on marginal distributions of outcomes. The logic of these results and their empirical implementation (sampling, estimation, and inference) are straightforward. Derivations are provided and several health-related applications are presented.
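As a minimal illustration of one such inequality probability: under independent sampling from the two marginal distributions, P(X > Y) can be estimated by the Mann-Whitney U-statistic, the fraction of cross-sample pairs with x above y (ties counted one-half). The data, names, and distributions below are hypothetical, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical outcomes in two arms, e.g., health expenditures (illustrative).
y_treat = rng.gamma(shape=2.0, scale=1.5, size=500)
y_ctrl = rng.gamma(shape=2.0, scale=1.0, size=500)

def inequality_probability(x, y):
    """Estimate P(X > Y) for independent X, Y via the Mann-Whitney
    U-statistic: the fraction of cross-arm pairs (x_i, y_j) with
    x_i > y_j, counting ties as 1/2."""
    x = np.asarray(x)[:, None]   # column vector: pairs via broadcasting
    y = np.asarray(y)[None, :]   # row vector
    return np.mean((x > y) + 0.5 * (x == y))

p = inequality_probability(y_treat, y_ctrl)
print(f"Estimated P(Y_treat > Y_ctrl) = {p:.3f}")
```

Note that this identifies P(X > Y) only under independence of the two draws; for dependent outcomes, the marginals alone generally yield bounds rather than a point value.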
Project description:Originally developed as a theory of consciousness, integrated information theory provides a mathematical framework to quantify the causal irreducibility of systems and subsets of units in the system. Specifically, mechanism integrated information quantifies how much of the causal powers of a subset of units in a state, also referred to as a mechanism, cannot be accounted for by its parts. If the causal powers of the mechanism can be fully explained by its parts, it is reducible and its integrated information is zero. Here, we study the upper bound of this measure and how it is achieved. We study mechanisms in isolation, groups of mechanisms, and groups of causal relations among mechanisms. We put forward new theoretical results showing that mechanisms that share parts cannot all achieve their maxima simultaneously. We also introduce techniques to design systems that maximize the integrated information of a subset of their mechanisms or relations. Our results can potentially be used to exploit symmetries and constraints to reduce the computations significantly and to compare different connectivity profiles in terms of their maximal achievable integrated information.
Project description:Uncertain data are observations that cannot be uniquely mapped to a referent. In the case of uncertainty due to incompleteness, possibility theory can be used as an appropriate model for processing such data. In particular, granular counting is a way to count data in the presence of uncertainty represented by possibility distributions. Two algorithms were proposed in the literature to compute granular counting: exact granular counting, with quadratic time complexity, and approximate granular counting, with linear time complexity. This paper extends approximate granular counting by computing bounds for the exact granular count. In this way, the efficiency of approximate granular counting is combined with certified bounds whose width can be adjusted in accordance with user needs.
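A minimal possibilistic sketch of the underlying idea (not the paper's exact or approximate granular-counting algorithms): each uncertain record carries a possibility distribution over candidate referents, and tallying the records that certainly versus possibly refer to a given value yields lower and upper bounds on the true count.

```python
def count_bounds(records, value):
    """Bounds on how many uncertain records refer to `value`.

    Each record is a possibility distribution: a dict mapping candidate
    referents to possibility degrees in [0, 1]. A record *certainly* refers
    to `value` when `value` is possible and every other referent has
    possibility 0; it *possibly* refers to `value` when `value` itself has
    possibility > 0. Minimal illustration, not the paper's algorithm.
    """
    certain = sum(
        1 for r in records
        if r.get(value, 0) > 0 and all(d == 0 for v, d in r.items() if v != value)
    )
    possible = sum(1 for r in records if r.get(value, 0) > 0)
    return certain, possible

# Illustrative records: diagnoses with incomplete information.
records = [{"flu": 1.0}, {"flu": 0.7, "cold": 1.0}, {"cold": 1.0}]
print(count_bounds(records, "flu"))   # one certain "flu", one more possible
```

The gap between the two tallies is exactly the set of ambiguous records, which is what a granular count resolves with graded (rather than crisp) membership.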
Project description:Quantum multiparameter estimation involves estimating multiple parameters simultaneously and can be more precise than estimating them individually. Our interest here is to determine fundamental quantum limits to the achievable multiparameter estimation precision in the presence of noise. We first present a lower bound to the estimation error covariance for a noisy initial probe state evolving through a noiseless quantum channel. We then present a lower bound to the estimation error covariance in the most general form for a noisy initial probe state evolving through a noisy quantum channel. We show conditions and accordingly measurements to attain these estimation precision limits for noisy systems. We see that the Heisenberg precision scaling of 1/N can be achieved with a probe comprising N particles even in the presence of noise. In fact, some noise in the initial probe state or the quantum channel can serve as a feature rather than a bug, since the estimation precision scaling achievable in the presence of noise in the initial state or the channel in some situations is impossible in the absence of noise in the initial state or the channel. However, a lot of noise harms the quantum advantage achievable with N parallel resources, and allows for a best precision scaling of [Formula: see text]. Moreover, the Heisenberg precision limit can be beaten with noise in the channel, and we present a super-Heisenberg precision limit with scaling of 1/N^2 for an optimal amount of noise in the channel, characterized by one-particle evolution operators. Furthermore, using γ-particle evolution operators for the noisy channel, where γ > 1, the best precision scaling attainable is 1/N^(2γ), which is otherwise known to be only possible using 2γ-particle evolution operators for a noiseless channel.
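The gap between these scalings is easy to quantify; the toy loop below simply tabulates the standard-quantum-limit (1/sqrt(N)), Heisenberg (1/N), and super-Heisenberg (1/N^2) error scalings discussed above for a few probe sizes N. It is a numerical illustration of the scaling laws only, not of the bounds themselves.

```python
# Compare precision (error) scalings for a probe of N particles:
# standard quantum limit 1/sqrt(N), Heisenberg 1/N, super-Heisenberg 1/N^2.
for n in (10, 100, 1000):
    print(f"N={n:5d}  SQL={n ** -0.5:.2e}  "
          f"Heisenberg={1 / n:.2e}  super-Heisenberg={n ** -2:.2e}")
```

Even at N = 100, the Heisenberg scaling improves on the standard quantum limit by an order of magnitude, and the super-Heisenberg scaling by two more.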
Project description:A recently developed lower bound theory for Coulombic problems (E. Pollak, R. Martinazzo, J. Chem. Theory Comput. 2021, 17, 1535) is further developed and applied to the highly accurate calculation of the ground-state energy of two- (He, Li+, and H-) and three- (Li) electron atoms. The method has been implemented with explicitly correlated many-particle basis sets of Gaussian type, on the basis of the highly accurate (Ritz) upper bounds they can provide with relatively small numbers of functions. The use of explicitly correlated Gaussians is developed further for computing the variances, and the necessary modifications are discussed here. The computed lower bounds are of submilli-Hartree (parts per million relative) precision and for Li represent the best lower bounds ever obtained. Although not yet as accurate as the corresponding (Ritz) upper bounds, the computed bounds are orders of magnitude tighter than those obtained with other lower bound methods, thereby demonstrating that the proposed method is viable for lower bound calculations in quantum chemistry applications. Among other aspects, the optimization of the wave function is shown to play a key role both for the optimal solution of the lower bound problem and for the internal check of the theory.