We train graph neural networks on halo catalogues from Gadget N-body simulations to perform field-level likelihood-free inference of cosmological parameters. The catalogues contain $\lesssim$5,000 halos with masses $\gtrsim 10^{10}~h^{-1}M_\odot$ in a periodic volume of $(25~h^{-1}{\rm Mpc})^3$; every halo in the catalogue is characterized by several properties, such as position, mass, velocity, concentration, and maximum circular velocity. Our models, built to be permutationally, translationally, and rotationally invariant, do not impose a minimum scale on which to extract information, and are able to infer the values of $\Omega_{\rm m}$ and $\sigma_8$ with a mean relative error of $\sim6\%$ when using positions plus velocities and positions plus masses, respectively. More importantly, we find that our models are very robust: they can infer the values of $\Omega_{\rm m}$ and $\sigma_8$ when tested on halo catalogues from thousands of N-body simulations run with five different N-body codes: Abacus, CUBEP$^3$M, Enzo, PKDGrav3, and Ramses. Surprisingly, the model trained to infer $\Omega_{\rm m}$ also works when tested on thousands of state-of-the-art CAMELS hydrodynamic simulations run with four different codes and subgrid physics implementations. Using halo properties such as concentration and maximum circular velocity allows our models to extract more information, at the expense of breaking their robustness. This may happen because the different N-body codes are not converged on the small scales to which these properties are sensitive.
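To make the symmetry requirements concrete, the sketch below shows one way to build a permutation-, translation-, and rotation-invariant graph network over a halo catalogue. This is a minimal illustration, not the paper's exact architecture: the edge construction, the scalar invariants (pair distance under the minimum-image convention, relative-velocity norm, and radial velocity component), and all layer sizes are assumptions chosen for clarity.

```python
# Minimal sketch of an invariant GNN over a halo catalogue (illustrative,
# NOT the authors' exact model). Invariances: translation via relative
# positions with periodic wrapping, rotation via norms and dot products,
# permutation via sum aggregation and global sum pooling.
import torch
import torch.nn as nn

def invariant_edge_features(pos, vel, edge_index, box=25.0):
    """Scalar edge features for halo pairs.

    pos, vel   : (N, 3) positions [h^-1 Mpc] and peculiar velocities.
    edge_index : (2, E) (sender, receiver) halo indices.
    box        : periodic box size, here 25 h^-1 Mpc as in the catalogues.
    """
    s, r = edge_index
    d = pos[s] - pos[r]
    d = d - box * torch.round(d / box)            # minimum-image wrap
    dist = d.norm(dim=-1, keepdim=True)
    dv = vel[s] - vel[r]
    # rotation-invariant scalars: |r_ij|, |v_ij|, radial velocity component
    vrad = (d * dv).sum(-1, keepdim=True) / (dist + 1e-8)
    return torch.cat([dist, dv.norm(dim=-1, keepdim=True), vrad], dim=-1)

class InvariantGNN(nn.Module):
    """Message passing plus a global readout predicting (mu, sigma)."""
    def __init__(self, hidden=64):
        super().__init__()
        self.edge_mlp = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                      nn.Linear(hidden, hidden))
        self.node_mlp = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU())
        self.readout = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2))  # e.g. Omega_m

    def forward(self, pos, vel, edge_index):
        e = self.edge_mlp(invariant_edge_features(pos, vel, edge_index))
        h = e.new_zeros(pos.shape[0], e.shape[1])
        h.index_add_(0, edge_index[1], e)          # permutation-invariant sum
        h = self.node_mlp(h)
        return self.readout(h.sum(0))              # global sum pooling
```

With a two-component output head like this, a standard choice for likelihood-free marginal inference is a moment-network loss that trains $\mu$ and $\sigma$ to approximate the mean and standard deviation of the marginal posterior of the target parameter; the mean relative errors quoted above are then computed from $\mu$ against the true parameter values.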