We introduce the Momentum Transformer, an attention-based deep learning architecture that outperforms benchmark momentum and mean-reversion trading strategies. Unlike state-of-the-art Long Short-Term Memory (LSTM) architectures, which process inputs sequentially, our architecture uses an attention mechanism that gives it a direct connection to all previous time-steps. This enables it to learn longer-term dependencies, improves performance when considering returns net of transaction costs, and naturally adapts to new market regimes, such as during the SARS-CoV-2 crisis. The Momentum Transformer is inherently interpretable, providing us with greater insights into our deep learning momentum trading strategy, including how it blends different classical strategies and which past time-steps are of greatest significance to the model.
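To make the contrast with a sequential LSTM concrete, the sketch below shows causally masked scaled dot-product self-attention over a window of time-steps, so that each step attends directly to all earlier steps and the resulting attention weights are directly inspectable. This is a minimal illustration under assumed shapes and names (`causal_self_attention`, `T`, `d`, the random projections), not the paper's actual architecture.

```python
import numpy as np

def causal_self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over a time series.

    x:           (T, d) window of T time-steps with d features each.
    wq, wk, wv:  (d, d) projection matrices (learned in a real model).

    Each output step attends directly to ALL earlier steps, unlike an
    LSTM, which only sees the past through its recurrent hidden state.
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(x.shape[1])           # (T, T) pairwise scores
    future = np.triu(np.ones_like(scores), k=1)      # mask out future steps
    scores = np.where(future == 1, -np.inf, scores)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over past steps
    # The rows of `weights` show which past time-steps matter most,
    # which is the source of the interpretability noted in the abstract.
    return weights @ v, weights

# Toy example: a 63-step window (roughly one trading quarter) with 8 features.
rng = np.random.default_rng(0)
T, d = 63, 8
x = rng.standard_normal((T, d))
out, attn = causal_self_attention(x, *(rng.standard_normal((d, d)) for _ in range(3)))
print(out.shape, attn.shape)  # (63, 8) (63, 63)
```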