Differentiable Programming Tensor Networks

Differentiable programming is a fresh programming paradigm that composes parameterized algorithmic components and optimizes them using gradient search. The concept emerges from deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating the tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using automatic differentiation. We present essential techniques to differentiate through tensor network contraction algorithms, including numerically stable differentiation for tensor decompositions and efficient backpropagation through fixed-point iterations. As a demonstration, we compute the specific heat of the Ising model directly by taking the second-order derivative of the free energy obtained in the tensor renormalization group calculation. Next, we perform gradient-based variational optimization of infinite projected entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain state-of-the-art variational energy and magnetization with moderate effort. Differentiable programming removes the laborious human effort of deriving and implementing analytical gradients for tensor network programs, which opens the door to more innovations in tensor network algorithms and applications.
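A note on the two techniques the abstract highlights. For the Ising demonstration, the specific heat per site follows from the free energy as C = -T d2F/dT2, so a second-order automatic derivative of the contraction code yields it directly. Below is a minimal sketch of that differentiation pattern in PyTorch; the toy two-level free energy stands in for the paper's tensor renormalization group contraction, so the function body (and the name free_energy) is an illustrative assumption, not the paper's code.

```python
# Second-order automatic differentiation of a free energy, in the spirit of
# the paper's Ising demonstration. The toy F(T) = -T ln(2 cosh(J/T)) stands
# in for a TRG contraction; only the differentiation pattern is faithful.
import torch

def free_energy(T, J=1.0):
    # Placeholder for a tensor network contraction returning F(T) per site.
    return -T * torch.log(2.0 * torch.cosh(J / T))

T = torch.tensor(1.5, requires_grad=True)
F = free_energy(T)

# First derivative (entropy S = -dF/dT); create_graph=True keeps the
# computation graph so we can differentiate a second time.
(dF_dT,) = torch.autograd.grad(F, T, create_graph=True)
(d2F_dT2,) = torch.autograd.grad(dF_dT, T)

C = -T * d2F_dT2  # specific heat C = -T d2F/dT2
print(f"T={T.item():.2f}  S={(-dF_dT).item():.4f}  C={C.item():.4f}")
```

For backpropagation through fixed-point iterations, the idea is implicit differentiation: if x* = f(x*, theta), then dL/dtheta = v (I - df/dx)^(-1) df/dtheta with v = dL/dx*, and the inverse can be accumulated as a geometric series of vector-Jacobian products rather than by unrolling the forward iterations. A sketch under the same assumptions (toy contractive map, PyTorch; not the paper's implementation):

```python
# Implicit differentiation through a fixed-point iteration x* = f(x*, theta).
import torch

def f(x, theta):
    # A contractive toy map; its fixed point x* depends smoothly on theta.
    return 0.5 * torch.tanh(theta * x) + 0.1

theta = torch.tensor(0.8, requires_grad=True)

# Forward pass: iterate to the fixed point without building a graph.
x = torch.tensor(0.0)
for _ in range(200):
    x = f(x, theta.detach())
x_star = x.detach().requires_grad_(True)

loss = x_star ** 2                       # any scalar loss on the fixed point
v = torch.autograd.grad(loss, x_star)[0]

# Backward pass: a = v * sum_k (df/dx)^k via repeated vector-Jacobian
# products, truncated once the terms are negligible.
a, term = v.clone(), v.clone()
for _ in range(200):
    y = f(x_star, theta)
    (term,) = torch.autograd.grad(y, x_star, grad_outputs=term)
    a = a + term
    if term.abs() < 1e-12:
        break

# One final vjp pushes the accumulated adjoint onto theta.
y = f(x_star, theta)
(grad_theta,) = torch.autograd.grad(y, theta, grad_outputs=a)
print(grad_theta)  # dL/dtheta through the fixed point
```

The memory cost of this backward pass is independent of the number of forward iterations, which is the point of differentiating through the fixed point rather than through the unrolled loop.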

Bibliographic Details
Main Authors: Hai-Jun Liao, Jin-Guo Liu, Lei Wang, Tao Xiang
Format: Article
Language: English
Published: American Physical Society, 2019-09-01
Series: Physical Review X
ISSN: 2160-3308
Collection: DOAJ
Online Access: http://doi.org/10.1103/PhysRevX.9.031041