Methods based on deep learning have recently been applied to astrophysical parameter recovery thanks to their ability to capture information from complex data. One such method is approximate Bayesian Neural Networks (BNNs), which have been shown to yield consistent posterior distributions over the parameter space, useful for uncertainty quantification. However, like any modern neural network, they tend to produce overly confident uncertainty estimates and can introduce bias when applied to data. In this work, we implement multiplicative normalizing flows (MNFs), a family of approximate posteriors for the parameters of BNNs, with the purpose of enhancing the flexibility of the variational posterior distribution, to extract $\Omega_m$, $h$, and $\sigma_8$ from the QUIJOTE simulations. We compare this method with standard BNNs and with the flipout estimator. We find that MNFs combined with BNNs outperform the other models, achieving a predictive performance almost one order of magnitude better than that of standard BNNs, extracting $\sigma_8$ with high accuracy ($r^2 = 0.99$), and producing precise uncertainty estimates. The latter implies that MNFs provide a more realistic predictive distribution, closer to the true posterior, mitigating the bias introduced by the variational approximation and allowing us to work with well-calibrated networks.
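To make the MNF construction concrete, the sketch below illustrates the core idea (Louizos & Welling, 2017): the variational posterior over a layer's weights is a fully factorized Gaussian conditioned on a multiplicative auxiliary variable $z$, whose distribution is made flexible by pushing a standard Gaussian sample through a normalizing flow. This is a minimal NumPy illustration with a single planar flow step and hypothetical layer sizes; it is not the paper's implementation, which stacks several flow steps inside a full training loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes, for illustration only (not from the paper).
n_in, n_out = 4, 3

# Variational parameters of the factorized Gaussian q(W | z):
#   q(W | z) = prod_ij N(w_ij ; z_i * mu_ij, sigma_ij^2)
mu = rng.normal(0.0, 0.1, size=(n_in, n_out))
log_sigma = np.full((n_in, n_out), -3.0)

# One planar normalizing-flow step for the auxiliary variable z
# (MNFs stack several such steps to enrich q(z)):
#   f(z) = z + u * tanh(w . z + b)
u = rng.normal(size=n_in)
w = rng.normal(size=n_in)
b = 0.0

def planar_flow(z):
    """Apply a single planar flow transformation to z."""
    return z + u * np.tanh(np.dot(w, z) + b)

# Sample z0 ~ N(0, I) and push it through the flow.
z0 = rng.normal(size=n_in)
z = planar_flow(z0)

# Reparameterized weight sample: W = (z * mu) + sigma * eps,
# where z multiplies each input row of the mean.
eps = rng.normal(size=(n_in, n_out))
W = z[:, None] * mu + np.exp(log_sigma) * eps

print(W.shape)
```

Because $z$ multiplies the rows of the weight mean, a flexible $q(z)$ induces a multimodal, correlated marginal posterior over the weights while keeping the per-weight Gaussian reparameterization cheap to sample.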