Adaptive designs are increasingly used in clinical trials and online experiments to improve participant outcomes by dynamically updating treatment allocation based on accumulating data. In practice, however, experimenters often consider multiple candidate designs, each with distinct trade-offs, yet only one can be implemented at a time, leaving the benefits and costs of the alternatives unobserved and unquantified. To address this, we propose a novel meta-level adaptive design framework that enables real-time, data-driven evaluation of, and selection among, candidate adaptive designs. Specifically, we define a new class of causal estimands for evaluating adaptive designs, estimate them with Targeted Maximum Likelihood Estimation (TMLE), which yields an asymptotically normal estimator that accommodates the dependence in adaptive-design data without parametric assumptions, and use the resulting inference to support online design selection. We further apply this framework to a motivating example in which multiple surrogates of a long-term primary outcome are considered for updating randomization probabilities in adaptive experiments. Unlike existing surrogate evaluation methods, our approach comprehensively quantifies the utility of surrogates for accelerating detection of heterogeneous treatment effects, expediting updates to treatment randomization, and improving participant outcomes, thereby facilitating dynamic selection among surrogate-guided adaptive designs. Overall, our framework provides a unified tool for evaluating the opportunities and costs of alternative adaptive designs and for guiding real-time decision-making in adaptive experiments.
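To make the meta-level selection step concrete, the following is a minimal, hypothetical Python sketch. It uses inverse-probability weighting (IPW) as a simple stand-in for the paper's TMLE estimator, and a toy two-arm setting with a constant-propensity design and a response-adaptive design as the candidates; all names (design_a, design_b, ipw_value) and parameters are illustrative assumptions, not the paper's actual method or API.

```python
"""Sketch: evaluating candidate adaptive designs from logged data,
then selecting one online. Hypothetical illustration only; IPW stands
in for the TMLE estimator described in the abstract."""
import numpy as np

rng = np.random.default_rng(0)

def design_a(history):
    """Candidate design A: fixed 50/50 randomization."""
    return 0.5

def design_b(history):
    """Candidate design B: shift probability toward the empirically
    better arm, truncated to [0.1, 0.9] for stability."""
    if not history:
        return 0.5
    arms = np.array([a for a, _ in history])
    ys = np.array([y for _, y in history])
    m1 = ys[arms == 1].mean() if (arms == 1).any() else 0.5
    m0 = ys[arms == 0].mean() if (arms == 0).any() else 0.5
    return float(np.clip(0.5 + (m1 - m0), 0.1, 0.9))

def run_batch(design, history, n=200, effect=0.2):
    """Deploy `design` for one batch; outcomes are Bernoulli with a
    true treatment effect. Returns logged (arm, outcome, propensity)."""
    logs = []
    for _ in range(n):
        p = design(history)
        a = rng.binomial(1, p)
        y = rng.binomial(1, 0.4 + effect * a)
        logs.append((a, y, p))
        history.append((a, y))
    return logs

def ipw_value(logs, design, history_so_far):
    """IPW estimate (and SE) of the mean outcome a candidate design
    would achieve, reweighting logged data by its current propensity
    (a simplification: the target propensity is held fixed here)."""
    tp = design(history_so_far)
    w = np.array([(tp if a == 1 else 1 - tp) / (p if a == 1 else 1 - p)
                  for a, y, p in logs])
    y = np.array([yy for _, yy, _ in logs])
    est = np.mean(w * y)
    se = np.std(w * y, ddof=1) / np.sqrt(len(y))
    return est, se

# Deploy design A, then evaluate both candidates from the logged data;
# the next batch would switch to whichever design scores higher.
history = []
logs = run_batch(design_a, history)
for name, d in [("A", design_a), ("B", design_b)]:
    est, se = ipw_value(logs, d, history)
    print(f"design {name}: estimated value {est:.3f} (SE {se:.3f})")
```

In the paper's framework, the IPW step would be replaced by the TMLE-based estimator of the proposed causal estimands, whose asymptotic normality under adaptive-design dependence is what justifies this kind of confidence-based comparison between designs.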