Differentiable programming is a new programming paradigm that enables large-scale optimization through the automatic calculation of gradients, also known as automatic differentiation. This concept emerged from deep learning and has since been generalized to tensor network optimization. Here, we extend differentiable programming to tensor networks with isometric constraints, with applications to the multiscale entanglement renormalization ansatz (MERA) and tensor network renormalization (TNR). By introducing several gradient-based optimization methods for isometric tensor networks and comparing them with the Evenbly-Vidal method, we show that automatic differentiation has better performance in both stability and accuracy. We numerically test our methods on the 1D critical quantum Ising spin chain and the 2D classical Ising model. We calculate the ground-state energy for the 1D quantum model, the internal energy for the 2D classical model, and the scaling dimensions of scaling operators, and find that they all agree well with theory.
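As a minimal illustration of the general idea (not the paper's actual code), the sketch below optimizes a single isometric tensor with automatic differentiation in JAX: the isometry is parametrized by the QR decomposition of an unconstrained matrix, so the constraint W^T W = I holds by construction, and the gradient flows through the QR map. The cost function `cost` and the symmetric matrix `h` are illustrative placeholders standing in for a MERA/TNR energy function.

```python
# A minimal sketch of gradient-based optimization of an isometric tensor
# W (satisfying W^T W = I) via automatic differentiation in JAX.
# The cost function and the matrix h are illustrative placeholders,
# not the MERA/TNR energy functions from the paper.
import jax
import jax.numpy as jnp

def isometry(x):
    # Parametrize an m-by-n isometry by the (reduced) QR decomposition of
    # an unconstrained m-by-n matrix, so W^T W = I holds by construction.
    q, _ = jnp.linalg.qr(x)
    return q

def cost(x, h):
    # Toy quadratic cost tr(W^T H W); a stand-in for an energy expectation.
    w = isometry(x)
    return jnp.trace(w.T @ h @ w)

m, n = 8, 4
h = jax.random.normal(jax.random.PRNGKey(0), (m, m))
h = (h + h.T) / 2                    # symmetric "Hamiltonian"-like matrix
x = jax.random.normal(jax.random.PRNGKey(1), (m, n))

grad = jax.grad(cost)                # AD gives the gradient through the QR map
lr = 0.1
for step in range(200):
    x = x - lr * grad(x, h)          # plain gradient descent on unconstrained x

w = isometry(x)
print("isometry check:", jnp.allclose(w.T @ w, jnp.eye(n), atol=1e-5))
print("final cost:", cost(x, h))
```

This toy problem is minimized by the isometry spanning the lowest eigenvectors of `h`; the paper's gradient-based methods operate on the full isometric tensor networks in the same spirit.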