Future surveys such as the Legacy Survey of Space and Time (LSST) of the Vera C. Rubin Observatory will observe an order of magnitude more astrophysical transient events than any previous survey. With this deluge of photometric data, it will be impossible for all such events to be classified by humans alone. Recent efforts have sought to leverage machine learning methods to tackle the challenge of astronomical transient classification, with ever-improving success. Transformers are a recently developed deep learning architecture, first proposed for natural language processing, that have since shown a great deal of success. In this work we develop a new transformer architecture, which uses multi-head self-attention at its core, for general multivariate time-series data. Furthermore, the proposed time-series transformer architecture supports the inclusion of an arbitrary number of additional features, while also offering interpretability. We apply the time-series transformer to the task of photometric classification, minimising the reliance on expert domain knowledge for feature selection, while achieving results comparable to state-of-the-art photometric classification methods. We achieve a weighted logarithmic loss of 0.507 on imbalanced data in a representative setting using data from the Photometric LSST Astronomical Time-Series Classification Challenge (PLAsTiCC). Moreover, we achieve a micro-averaged receiver operating characteristic area under the curve of 0.98 and a micro-averaged precision-recall area under the curve of 0.87.
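To make the core operation concrete, the following minimal sketch (assuming PyTorch; the shapes, hyper-parameters, and variable names are illustrative assumptions, not the authors' implementation) applies multi-head self-attention to a batch of toy multivariate light curves, returning both a contextualised sequence representation and the attention weights that can be inspected for interpretability.

```python
# Minimal sketch of multi-head self-attention over multivariate time series.
# Assumes PyTorch; all sizes below are illustrative, not taken from the paper.
import torch
import torch.nn as nn

batch, seq_len, n_passbands = 32, 100, 6   # e.g. fluxes in six passbands (assumed)
d_model, n_heads = 32, 4                   # illustrative model width and head count

x = torch.randn(batch, seq_len, n_passbands)     # toy multivariate light curves
embed = nn.Linear(n_passbands, d_model)          # project passbands to model width
attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

h = embed(x)
out, weights = attn(h, h, h)                     # self-attention: query = key = value = h
print(out.shape)      # (32, 100, 32): contextualised sequence representation
print(weights.shape)  # (32, 100, 100): attention maps, usable for interpretability
```

In a full classifier one would typically add positional information, pool the sequence output, and pass it (together with any additional features) to a classification head; the sketch above only isolates the self-attention step.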