Efficient quantum control is necessary for practical quantum computing implementations with current technologies. Conventional algorithms for determining optimal control parameters are computationally expensive, largely excluding them from use outside of simulation. Existing hardware solutions structured as lookup tables are imprecise and costly. By designing a machine learning model to approximate the results of traditional tools, a more efficient method can be produced. Such a model can then be synthesized into a hardware accelerator for use in quantum systems. In this study, we demonstrate a machine learning algorithm for predicting optimal pulse parameters. This algorithm is lightweight enough to fit on a low-resource FPGA and perform inference with a latency of 175 ns and a pipeline interval of 5 ns while maintaining $>0.99$ gate fidelity. In the long term, such an accelerator could be used near quantum computing hardware, where traditional computers cannot operate, enabling low-latency quantum control at reasonable cost without incurring large data bandwidths outside of the cryogenic environment.
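As a rough illustration of the kind of model the abstract describes, the sketch below defines a small fully-connected network in PyTorch that maps a target-gate specification to predicted pulse parameters. The architecture, layer widths, input/output dimensions, and the name `PulseParameterNet` are hypothetical, not taken from this work; a network of roughly this size is the sort that synthesis flows such as hls4ml can translate into low-resource FPGA firmware.

```python
import torch
import torch.nn as nn

class PulseParameterNet(nn.Module):
    """Hypothetical lightweight regressor: gate specification -> pulse parameters.

    Sizes are illustrative only, chosen to keep the model small enough
    to plausibly fit on a low-resource FPGA after synthesis.
    """

    def __init__(self, n_inputs: int = 4, n_outputs: int = 2, hidden: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_inputs, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            # e.g. pulse amplitude and duration (assumed outputs)
            nn.Linear(hidden, n_outputs),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Training pairs would come from a conventional optimal-control solver
# (the "traditional tool" the model approximates):
# (gate specification, optimizer-derived pulse parameters).
model = PulseParameterNet()
example_gate = torch.randn(1, 4)       # placeholder gate description
predicted_pulse = model(example_gate)  # predicted pulse parameters
```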