Neural networks (NNs) have been shown to be competitive against state-of-the-art feature engineering and random forest (RF) classification of periodic variable stars. Although previous work utilising NNs has made use of periodicity by period folding multiple-cycle time-series into a single cycle -- from time-space to phase-space -- no approach to date has taken advantage of the fact that network predictions should be invariant to the initial phase of the period-folded sequence. Initial phase is exogenous to the physical origin of the variability and should thus be factored out. Here, we present cyclic-permutation invariant networks, a novel class of NNs for which invariance to phase shifts is guaranteed through polar coordinate convolutions, which we implement by means of "Symmetry Padding." Across three different datasets of variable star light curves, we show that two implementations of the cyclic-permutation invariant network, the iTCN and the iResNet, consistently outperform non-invariant baselines and reduce overall error rates by between 4% and 22%. Over a 10-class OGLE-III sample, the iTCN/iResNet achieves an average per-class accuracy of 93.4%/93.3%, compared to RNN/RF accuracies of 70.5%/89.5% in a recent study using the same data. Finding improvement on a non-astronomy benchmark, we suggest that the methodology introduced here should also be applicable to a wide range of science domains where periodic data abounds due to physical symmetries.
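To make the core idea concrete, the following is a minimal PyTorch sketch (not the authors' iTCN/iResNet code) of how wrap-around padding on the phase axis yields the invariance described above: a 1D convolution with circular padding is equivariant to cyclic shifts of the period-folded light curve, and a global pooling step then makes the output invariant to the initial phase. The module name `CyclicInvariantBlock` and the input layout are illustrative assumptions.

```python
# Minimal sketch: cyclic-shift invariance via circular padding + global pooling.
# Assumes an input of shape (batch, channels, phase_bins), e.g. magnitude and
# magnitude-error channels sampled on a regular phase grid.
import torch
import torch.nn as nn

class CyclicInvariantBlock(nn.Module):
    def __init__(self, in_channels=2, hidden=32, n_classes=10):
        super().__init__()
        # padding_mode='circular' wraps the phase axis, so the convolution
        # treats the folded light curve as living on a circle (polar coordinate).
        self.conv = nn.Conv1d(in_channels, hidden, kernel_size=3,
                              padding=1, padding_mode='circular')
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):             # x: (batch, channels, phase_bins)
        h = torch.relu(self.conv(x))  # equivariant to cyclic shifts
        h = h.mean(dim=-1)            # global pooling -> shift invariant
        return self.head(h)

# Invariance check: predictions are unchanged by an arbitrary phase roll.
x = torch.randn(4, 2, 200)
model = CyclicInvariantBlock()
assert torch.allclose(model(x), model(torch.roll(x, 37, dims=-1)), atol=1e-5)
```

In this sketch the invariance is exact (up to floating point) because circular padding commutes with cyclic shifts and the pooling discards position; the paper's "Symmetry Padding" plays the analogous role inside deeper temporal-convolution and residual architectures.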