A Quantum Self-Attention Network (QSAN) that can be implemented on near-term quantum devices is investigated. First, the theoretical basis of QSAN, a linearized and reversible Quantum Self-Attention Mechanism (QSAM) comprising Quantum Logic Similarity (QLS) and the Quantum Bit Self-Attention Score Matrix (QBSASM), is explored to address the storage problem of the Self-Attention Mechanism (SAM) caused by its quadratic complexity. More importantly, QLS uses logical operations instead of inner product operations, enabling QSAN to be fully deployed on quantum computers while saving quantum bits by avoiding numerical operations; QBSASM is a by-product generated during the evolution of QSAN, reflecting the output attention distribution in the form of a density matrix. Then, the framework and quantum circuit of QSAN are designed with 9 execution steps and 5 special functional sub-modules, which can acquire QBSASM effectively in the intermediate process while reducing the number of measurements. In addition, a quantum coordinate prototype is proposed to describe the mathematical connection between the control and output bits, facilitating programming and model optimization. Finally, a miniaturized experiment demonstrates that QSAN can be trained faster with the quantum natural gradient descent method and produces the quantum characteristic attention distribution QBSASM. QSAN has great potential to be embedded in classical or quantum machine learning frameworks, laying a foundation for quantum-enhanced Natural Language Processing (NLP).
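The core idea of replacing inner-product similarity with logical operations can be illustrated with a minimal classical analogue on binary vectors; this is a hedged sketch for intuition only, with illustrative function names not taken from the paper, and it does not reproduce the actual quantum circuit of QLS:

```python
import numpy as np

def inner_product_similarity(q, k):
    """Standard SAM-style similarity: numeric dot product."""
    return float(np.dot(q, k))

def logical_similarity(q, k):
    """Logic-based similarity on binary vectors: count the
    positions where q AND k are both 1 (bitwise AND, then sum).
    No multiplication is required, loosely mirroring how QLS
    avoids numerical operations."""
    return int(np.sum(np.logical_and(q, k)))

q = np.array([1, 0, 1, 1])
k = np.array([1, 1, 0, 1])
print(inner_product_similarity(q, k))  # 2.0
print(logical_similarity(q, k))        # 2
```

On binary inputs the two measures agree, but the logical version needs only AND gates and counting, which is the kind of operation that maps naturally onto reversible quantum logic.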