Quantum information quantities play a substantial role in characterizing operational quantities in various quantum information-theoretic problems. We consider the numerical computation of four quantum information quantities: Petz-Augustin information, sandwiched Augustin information, conditional sandwiched Renyi entropy, and sandwiched Renyi information. Computing these quantities requires minimizing certain order-$\alpha$ quantum Renyi divergences over the set of quantum states. While these optimization problems are obviously convex, they violate the standard bounded gradient/Hessian conditions in the literature, so existing convex optimization methods and their convergence guarantees do not directly apply. In this paper, we propose a new class of convex optimization methods called mirror descent with the Polyak step size. We prove their convergence under a weak condition, showing that they provably converge when minimizing quantum Renyi divergences. Numerical experiments show that entropic mirror descent with the Polyak step size converges fast in minimizing quantum Renyi divergences.
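To illustrate the kind of method the abstract refers to, the following is a minimal sketch of entropic mirror descent with the Polyak step size on a classical probability simplex (a toy analogue of the quantum-state setting, not the paper's actual implementation). The toy objective, the Polyak rule $\eta_k = (f(x_k) - f^\star)/\|\nabla f(x_k)\|_\infty^2$ with the sup norm as the dual of $\ell_1$, and the iteration count are illustrative assumptions.

```python
import numpy as np

def entropic_md_polyak(f, grad_f, f_star, x0, iters=2000):
    """Entropic mirror descent with the Polyak step size on the simplex.

    The mirror map is negative entropy, so the multiplicative update is
    x <- x * exp(-eta * grad), followed by normalization, with the Polyak
    step eta_k = (f(x_k) - f*) / ||grad f(x_k)||_inf^2 (sup norm: the dual
    of the l1 norm natural to the entropic geometry).
    """
    x = x0.copy()
    for _ in range(iters):
        g = grad_f(x)
        gap = f(x) - f_star
        denom = np.max(np.abs(g)) ** 2
        if gap <= 1e-12 or denom == 0.0:  # already (numerically) optimal
            break
        eta = gap / denom
        x = x * np.exp(-eta * g)
        x /= x.sum()  # project back onto the simplex
    return x

# Toy convex problem: minimize f(x) = KL(x || q) over the simplex.
# The minimizer is x* = q with optimal value f* = 0, so the Polyak
# step size is computable in closed form.
q = np.array([0.5, 0.3, 0.2])
f = lambda x: float(np.sum(x * np.log(x / q)))
grad_f = lambda x: np.log(x / q) + 1.0  # gradient of KL(x || q)

x = entropic_md_polyak(f, grad_f, f_star=0.0, x0=np.full(3, 1.0 / 3.0))
```

The Polyak rule needs the optimal value $f^\star$ (here known to be zero); when $f^\star$ is unknown, a lower bound or estimate is typically substituted.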