##### Tutorial 5: Quantum States

Interpretation of a tensor as a quantum state. Interpretation of a tensor as a density matrix. Purifications. Expectation values and correlators (two different ways to evaluate them). Schmidt decomposition and the SVD. Entanglement and singular values. Topics include:

• Pure and mixed states as tensor networks

• Evaluation of expectation values from tensor networks

• Relationship between quantum entanglement and singular values

###### Python

The example snippets given throughout this tutorial require a small amount of initial code set-up.
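A minimal set-up sketch, assuming only NumPy (the original set-up box is not reproduced here, so these imports are an assumption sufficient for the snippets below):

```python
# Minimal set-up for the tutorial snippets (assumes only NumPy is installed).
import numpy as np
from numpy import linalg as LA
```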

##### T4.1: Multi-stage tensor decompositions

We begin by addressing an important problem: given a many-index tensor H, how can we accurately decompose it into a network T of tensors {A,B,C,…} according to some prescribed geometry, for instance, as depicted in Fig.4.1(a)? More precisely, we would like to find the choice of tensors {A,B,C,…} that minimizes the difference ‖H − T‖ from the original tensor, given some fixed dimension χ of the internal indices in T.

###### Fig.4.1(a):

In order to obtain the network approximation to a tensor we shall employ a multi-stage decomposition: a sequence of single tensor decompositions via the SVD. This is the opposite procedure to the contraction routine considered in Tutorial 1, where a network of multiple tensors was contracted to a single tensor via a sequence of binary tensor contractions. The results from Tutorial 3, in particular Corollary 3.4, already inform us of the correct way to perform a multi-stage decomposition: the tensor to be decomposed at each step should be a center of orthogonality, which will ensure that the global truncation error is minimized.

Fig.4.1(b) below illustrates a sequence of single tensor decompositions that take a single tensor H into the network T of Fig.4.1(a). At each step a tensor Hk is split using a truncated SVD (retaining the desired rank χ) into a product of three tensors {Uk, Sk, Vk} across the partition indicated by the dashed line, where we have colored isometric tensors orange. The matrix of singular values Sk is then absorbed into the tensor that is to be decomposed at the next step (indicated by the dashed ellipse), such that it becomes a center of orthogonality, since all other tensors in the network are isometric. This process is repeated until the desired network geometry is reached.

###### Fig.4.1(b):

Notes on multi-stage decompositions:

• Many different decomposition sequences are possible, some computationally cheaper than others

• In the example given, the desired center of orthogonality was created at each step by simply absorbing the singular weights into the next tensor. This may be more difficult with other decomposition orders, which can require one of the methods from Tutorial 3 to create the desired center of orthogonality at each intermediate step.

• The result minimizes the error at each step, but is not guaranteed to be globally optimal
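As a concrete illustration of the procedure, the following sketch decomposes a 4-index tensor into a chain of three tensors via two truncated SVDs, absorbing the singular values into the tensor decomposed at the next step. The tensor, its dimensions, and the chain geometry here are assumptions for the example, not the exact network of Fig.4.1(a):

```python
import numpy as np
from numpy import linalg as LA

# Example: split a 4-index tensor H into a chain A - B - C via two
# truncated SVDs, keeping at most chi singular values per internal index.
d = 4      # dimension of each external index
chi = 6    # maximum dimension of the internal indices
rng = np.random.default_rng(0)
H = rng.random((d, d, d, d))

# Step 1: split off A across the partition (i0 | i1 i2 i3).
U, S, Vh = LA.svd(H.reshape(d, d**3), full_matrices=False)
chi1 = min(chi, S.size)
A = U[:, :chi1]                                   # isometric: A^T A = I
# Absorb the singular weights so the remainder is the center of orthogonality.
H1 = (np.diag(S[:chi1]) @ Vh[:chi1, :]).reshape(chi1, d, d, d)

# Step 2: split H1 across the partition (chi1, i1 | i2, i3).
U, S, Vh = LA.svd(H1.reshape(chi1 * d, d**2), full_matrices=False)
chi2 = min(chi, S.size)
B = U[:, :chi2].reshape(chi1, d, chi2)            # isometric tensor
C = (np.diag(S[:chi2]) @ Vh[:chi2, :]).reshape(chi2, d, d)  # final center

# Contract the chain back together and measure the truncation error.
T = np.einsum('ia,ajb,bkl->ijkl', A, B, C)
err = LA.norm(H - T) / LA.norm(H)
```

Note that each truncation is locally optimal (by the Eckart–Young theorem for the SVD), which is what absorbing the singular values into the next tensor guarantees; the combined error of all steps need not be globally optimal.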

##### T4.2: Center of orthogonality (link centered)

• Center of orthogonality located at a link (rather than at a tensor)

• Relationship to the SVD

• Optimal truncation of a link index
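A hedged sketch of the idea for a minimal two-tensor network (the tensors and dimensions here are invented for illustration): gauging with QR decompositions followed by an SVD of the residual matrices places the singular values on the connecting link, making that link a center of orthogonality, from which optimal truncation of the link index follows by discarding the smallest singular values.

```python
import numpy as np
from numpy import linalg as LA

# Two tensors A and B joined by a single link of dimension chi.
d, chi = 4, 3
rng = np.random.default_rng(1)
A = rng.random((d, chi))
B = rng.random((chi, d))

# Orthogonalize A from the left and B from the right via QR,
# then SVD the product of the residual matrices: the singular
# values S now live on the link between the isometric tensors.
QA, RA = LA.qr(A)             # A = QA @ RA, with QA^T QA = I
QB, RB = LA.qr(B.T)           # B = RB^T @ QB^T
U, S, Vh = LA.svd(RA @ RB.T)
A_new = QA @ U                # left-orthogonal tensor
B_new = Vh @ QB.T             # right-orthogonal tensor

# The network is unchanged: A @ B == A_new @ diag(S) @ B_new,
# and truncating the link now just means discarding the smallest S.
```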

##### T4.3: Canonical forms

• Definition and method for obtaining a canonical form

• Straightforward reading of the correlations carried through each index
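One way the correlations through an index become transparent: the singular values across a bipartition are the Schmidt coefficients of the state, from which the entanglement entropy follows directly. A minimal sketch (the random state here is an invented example):

```python
import numpy as np
from numpy import linalg as LA

# A random normalized two-party pure state |psi> on a (4 x 4) bipartition.
dA, dB = 4, 4
rng = np.random.default_rng(2)
psi = rng.random((dA, dB)) + 1j * rng.random((dA, dB))
psi /= LA.norm(psi)                       # normalize the state

s = LA.svd(psi, compute_uv=False)         # Schmidt coefficients
p = s**2                                  # Schmidt probabilities, sum to 1
entropy = -np.sum(p * np.log(p))          # von Neumann entanglement entropy
```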
