Variational quantum algorithms (VQAs) have emerged as a promising near-term technique for exploring practical quantum advantage on noisy intermediate-scale quantum (NISQ) devices. However, the parameter training process is inefficient, owing to the incompatibility with backpropagation and the cost of a large number of measurements, which poses a great challenge to the large-scale development of VQAs. Here, we propose a parameter-parallel distributed variational quantum algorithm (PPD-VQA) that accelerates the training process through parameter-parallel training across multiple quantum processors. To maintain the high performance of PPD-VQA in realistic noise scenarios, an alternate training strategy is proposed to alleviate the acceleration attenuation caused by noise differences among the quantum processors, an unavoidable and common problem in distributed VQA. In addition, gradient compression is employed to overcome potential communication bottlenecks. The achieved results suggest that PPD-VQA provides a practical solution for coordinating multiple quantum processors to handle large-scale real-world applications.
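As a rough illustration of the parameter-parallel idea, the sketch below estimates the gradient of a toy single-qubit ansatz with the parameter-shift rule, splits the parameter indices across "workers" (each chunk would run on its own quantum processor; here the loop is a sequential stand-in), and applies a simple top-k gradient compression. This is a minimal sketch assuming a three-rotation ansatz and invented function names, not the paper's implementation.

```python
import numpy as np

def ry(t):
    # single-qubit Y rotation
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    # single-qubit Z rotation
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

def expectation(theta):
    """<Z> of Ry(t3) Rz(t2) Ry(t1) |0>; a stand-in for a hardware run."""
    psi = ry(theta[2]) @ rz(theta[1]) @ ry(theta[0]) @ np.array([1, 0], dtype=complex)
    return float(np.abs(psi[0]) ** 2 - np.abs(psi[1]) ** 2)

def shift_gradient(theta, indices):
    """Parameter-shift gradient for the given parameter indices."""
    grad = {}
    for i in indices:
        plus, minus = theta.copy(), theta.copy()
        plus[i] += np.pi / 2
        minus[i] -= np.pi / 2
        grad[i] = 0.5 * (expectation(plus) - expectation(minus))
    return grad

def parallel_gradient(theta, n_workers):
    """Assemble the full gradient from per-worker parameter chunks."""
    chunks = np.array_split(np.arange(len(theta)), n_workers)
    full = np.zeros(len(theta))
    for chunk in chunks:  # conceptually, each chunk runs on a separate processor
        for i, g in shift_gradient(theta, chunk).items():
            full[i] = g
    return full

def top_k_compress(grad, k):
    """Gradient compression: keep the k largest-magnitude entries, zero the rest."""
    out = np.zeros_like(grad)
    keep = np.argsort(np.abs(grad))[-k:]
    out[keep] = grad[keep]
    return out
```

Because each parameter's shifted-circuit evaluations are independent, the per-chunk calls can be dispatched to different processors with no synchronization until the gradients are gathered; compressing the gathered gradient reduces the classical communication per training step.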