# 1. Tensor Network Basics

The tensor network functionality in `quimb` aims to be easy to manipulate and interact with, without losing any generality or efficiency. Many of the manipulations try to mimic how one thinks graphically about tensor networks, which is helped by the ability to plot a graph of the network at any point.

The functionality can be found in the submodule `quimb.tensor`, and is not imported by default with the base `quimb`:

```
%config InlineBackend.figure_formats = ['svg']
```

```
import quimb as qu
import quimb.tensor as qtn
```

The core functions of note are `Tensor`, `TensorNetwork`, `tensor_contract()`, and `tensor_split()`.

## 1.1. Creating Tensors

Tensors are created using the class `Tensor`. Indices matching the shape of the data array must be supplied, and optionally tags, which can be used to group it with others and/or uniquely identify it.

For example, let’s create the singlet state in tensor form, i.e., an index for each qubit:

```
data = qu.bell_state('psi-').reshape(2, 2)
inds = 'k0', 'k1'
tags = 'KET'
ket = qtn.Tensor(data, inds, tags)
ket
```

```
Tensor(shape=(2, 2), inds=('k0', 'k1'), tags=oset(['KET']))
```

In lots of ways this is like a normal n-dimensional array, but it can also be manipulated without having to keep track of which axis is which. Some key methods are:

Note

These all have inplace versions with an underscore appended, so that `ket.transpose_('k1', 'k0')` would perform a transposition on `ket` directly, rather than making a new tensor. This convention is adopted elsewhere too.

Let’s also create some Pauli tensors, with indices that act on the Bell state and map the physical indices into two new ones:

```
X = qtn.Tensor(qu.pauli('X'), inds=('k0', 'b0'), tags=['PAULI', 'X', '0'])
Y = qtn.Tensor(qu.pauli('Y'), inds=('k1', 'b1'), tags=['PAULI', 'Y', '1'])
```

And finally, a random ‘bra’ to complete the inner product:

```
bra = qtn.Tensor(qu.rand_ket(4).reshape(2, 2), inds=('b0', 'b1'), tags={'BRA'})
```

Note how repeating an index is all that is required to define a contraction. If you want to join two tensors and have the index generated automatically you can use `connect()`.
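Conceptually, a repeated index plays exactly the same role as a repeated subscript in Einstein summation. A minimal `numpy` sketch of this idea (stand-in random arrays, not the actual `quimb` tensors above):

```python
import numpy as np

# hypothetical stand-ins for two tensors sharing an index
ket = np.random.rand(2, 2)   # inds ('k0', 'k1')
X = np.random.rand(2, 2)     # inds ('k0', 'b0')

# the shared index 'k0' is summed over, exactly like a repeated
# einsum subscript: here 'a' plays the role of 'k0'
result = np.einsum('ab,ac->bc', ket, X)
assert result.shape == (2, 2)
```

This is the same contraction that `quimb` infers automatically from the matching index names, without you ever writing a subscript string.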

Note

Indices and tags should be strings - though this is currently not enforced. A useful convention is to keep `inds` lower case and `tags` upper case.

## 1.2. Creating Tensor Networks

We can now combine these into a `TensorNetwork` using the `&` operator overload:

```
TN = ket.H & X & Y & bra
print(TN)
```

```
TensorNetwork([
    Tensor(shape=(2, 2), inds=('k0', 'k1'), tags=oset(['KET'])),
    Tensor(shape=(2, 2), inds=('k0', 'b0'), tags=oset(['PAULI', 'X', '0'])),
    Tensor(shape=(2, 2), inds=('k1', 'b1'), tags=oset(['PAULI', 'Y', '1'])),
    Tensor(shape=(2, 2), inds=('b0', 'b1'), tags=oset(['BRA'])),
])
```

(note that `.H` conjugates the data but leaves the indices). We could also use the `TensorNetwork` constructor, which takes any sequence of tensors and/or tensor networks, and has various advanced options.

The geometry of this network is completely defined by the repeated indices forming edges between tensors, as well as the arbitrary tags identifying them. The internal data of the tensor network allows efficient access to any tensors based on their `tags` or `inds`.

Note

In order to naturally maintain the network's geometry, bonds (repeated indices) can be mangled when two tensor networks are combined. As a result, only exterior indices are guaranteed to keep their names - since these define the overall object. The `bonds()` function can be used to find the names of indices connecting tensors if explicitly required.

Any network can also be graphed using `graph()`, which will pick a layout, represent bond size as edge thickness, and optionally color the nodes based on `tags`.

```
TN.graph(color=['KET', 'PAULI', 'BRA'], figsize=(4, 4))
```

Note the `'PAULI'` tag can be used to identify both Paulis at once. But they could also be uniquely identified using their `'X'` and `'Y'` tags respectively:

```
TN.graph(color=['KET', 'X', 'BRA', 'Y'], figsize=(4, 4))
```

## 1.3. Contracting Tensors

To completely contract a network we can use the `^` operator overload together with the builtin `all`:

```
TN ^ all
```

```
(-0.02011845521031984+0.1288000245144975j)
```

Or if we just want to contract the paulis:

```
print(TN ^ 'PAULI')
```

```
TensorNetwork([
    Tensor(shape=(2, 2), inds=('k0', 'k1'), tags=oset(['KET'])),
    Tensor(shape=(2, 2), inds=('b0', 'b1'), tags=oset(['BRA'])),
    Tensor(shape=(2, 2, 2, 2), inds=('k0', 'b0', 'k1', 'b1'), tags=oset(['PAULI', 'X', '0', 'Y', '1'])),
])
```

Notice how the `tags` of the Paulis have been combined on the new tensor.

The contraction order is optimized automatically using `opt_einsum`, is cached, and can easily handle hundreds of tensors (though it uses a greedy algorithm and is not guaranteed to find the optimal path).
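The flavour of greedy path-finding involved can be illustrated with `numpy`'s own einsum path optimizer (a simpler relative of what `opt_einsum` provides) - this is a sketch of the concept, not what `quimb` calls internally:

```python
import numpy as np

# three chained matrices: the order of pairwise contractions matters for cost
a = np.random.rand(8, 4)
b = np.random.rand(4, 16)
c = np.random.rand(16, 2)

# ask for a greedy contraction path rather than brute-force search
path, info = np.einsum_path('ij,jk,kl->il', a, b, c, optimize='greedy')

# path[0] is the literal string 'einsum_path', followed by pairwise steps
assert path[0] == 'einsum_path'
```

The `info` string summarizes the chosen path and its estimated FLOP count, which is useful when checking whether a contraction order is sensible.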

A cumulative contract allows a custom ‘bubbling’ order:

```
# "take KET, then contract X in, then contract BRA *and* Y in, etc..."
print(TN >> ['KET', 'X', ('BRA', 'Y')])
```

```
(-0.02011845521031984+0.1288000245144975j)
```

And a structured contract uses the tensor network's tagging structure (a string format specifier like `"I{}"`) to perform a cumulative contract automatically, e.g. grouping the tensors of an MPS/MPO into segments of 10 sites. This can be slightly quicker than finding the full contraction path.

When a TN has a structure, structured contractions can be performed by specifying either an `Ellipsis`:

```
TN ^ ...  # a full, structured contract
```

or a `slice`:

```
TN ^ slice(100, 200)  # a structured contract of those sites only
```
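To make the structure specifier concrete, here is a toy sketch of how a format string like `"I{}"` plus a slice could map onto a group of tags (`tags_for` is a hypothetical helper for illustration, not the `quimb` internals):

```python
# the structure specifier: one tag per site, generated from a format string
structure = "I{}"

def tags_for(sl, n):
    # expand a slice over an n-site network into explicit site tags
    return [structure.format(i) for i in range(*sl.indices(n))]

# e.g. sites 0-3 of a 10-site network
assert tags_for(slice(0, 4), 10) == ['I0', 'I1', 'I2', 'I3']
```

A structured contract then just contracts each such group of tagged tensors in turn.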

The full API of contraction methods is:

- `quimb.tensor.tensor_core.TensorNetwork.contract()` (which dispatches to one of the below)
- `quimb.tensor.tensor_core.TensorNetwork.contract_cumulative()` (overloaded on `>>`)
- `quimb.tensor.tensor_core.TensorNetwork.contract_structured()`
- `quimb.tensor.tensor_core.TensorNetwork.contract_between()` (explicitly contract between two tensors only)


## 1.4. Splitting Tensors

Tensors can be decomposed (‘split’) using many different methods, all accessed through `tensor_split()` or the `split()` method.

As an example, let’s split the tensor tagged `'KET'` in our TN:

```
# select any tensors matching the 'KET' tag - here only 1
Tk = TN['KET']
# now split it, creating a new tensor network of 2 tensors
Tk_s = Tk.split(left_inds=['k0'])
# note new index created
print(Tk_s)
```

```
TensorNetwork([
    Tensor(shape=(2, 2), inds=('k0', '_d0fe7aAAAAI'), tags=oset(['KET'])),
    Tensor(shape=(2, 2), inds=('_d0fe7aAAAAI', 'k1'), tags=oset(['KET'])),
])
```
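Under the hood, a split like this is essentially a matrix factorization: group the `left_inds` on the rows, decompose, and the shared dimension becomes the new bond index. A rough `numpy` sketch of the SVD case (the real `tensor_split()` supports many more methods and options):

```python
import numpy as np

T = np.random.rand(2, 2)              # stand-in for a tensor with inds ('k0', 'k1')

# with left_inds=['k0'], T is already the right matrix shape; SVD it
U, s, VH = np.linalg.svd(T, full_matrices=False)

# absorb the singular values into both factors evenly
left = U * np.sqrt(s)                 # inds ('k0', new_bond)
right = np.sqrt(s)[:, None] * VH      # inds (new_bond, 'k1')

# contracting over the new bond recovers the original tensor
assert np.allclose(left @ right, T)
```

The size of the new bond is the number of singular values kept, which is where truncation and compression options come in.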

Finally, let’s replace the original ‘KET’ tensor with its split version:

```
# remove the original KET tensor
del TN['KET']
# inplace add the split tensor network
TN &= Tk_s
# plot - should now have 5 tensors
TN.graph(color=['KET', 'PAULI', 'BRA'], figsize=(4, 4))
```

## 1.5. Graph Orientated Tensor Network Creation

Another way to create tensor networks is to define the tensors (nodes) first and make the indices (edges) afterwards. This is mostly enabled by the functions `new_ind()` and `new_bond()`. Take for example making a small periodic matrix product state with bond dimension 7:

```
L = 10
# create the nodes, by default just the scalar 1.0
tensors = [qtn.Tensor() for _ in range(L)]
for i in range(L):
    # add the physical indices, each of size 2
    tensors[i].new_ind(f'k{i}', size=2)
    # add bonds between neighbouring tensors, of size 7
    tensors[i].new_bond(tensors[(i + 1) % L], size=7)
mps = qtn.TensorNetwork(tensors)
mps.graph()
```
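For reference, the resulting periodic MPS is equivalent to a ring of plain arrays, each with one size-2 physical dimension and two size-7 bond dimensions. A quick `numpy` sketch of the same geometry:

```python
import numpy as np

L, d, D = 10, 2, 7

# one array per site: (left bond, right bond, physical index)
arrays = [np.random.rand(D, D, d) for _ in range(L)]

# joining two neighbours means contracting over their shared bond;
# the pair keeps its outer bonds and both physical indices
pair = np.tensordot(arrays[0], arrays[1], axes=[(1,), (0,)])
assert pair.shape == (D, d, D, d)
```

`new_bond()` does the equivalent bookkeeping for you: it enlarges both tensors with a matching new index of the requested size.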

## 1.6. Other Overloads

You can also add tensors/networks together using `|` or the inplace `|=`, which act like `&` and `&=` respectively, but are virtual, meaning that changes to the tensors propagate across all networks viewing them (see `TensorNetwork`).

The `@` symbol is overloaded to combine the objects into a network and then immediately contract them all. In this way it mimics the dense dot product. E.g.

```
# make a 5 qubit tensor state
dims = [2] * 5
data = qu.rand_ket(32).A.reshape(*dims)
inds = ['k0', 'k1', 'k2', 'k3', 'k4']
psi = qtn.Tensor(data, inds=inds)
# find the inner product with itself
psi.H @ psi
```

```
1.0000000000000002
```

In this case, the conjugated copy `psi.H` has the same outer indices as `psi` and so the inner product is naturally formed.
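In dense terms, `psi.H @ psi` just sums the conjugated amplitudes against the originals over every index. A `numpy` equivalent of the cell above (random data standing in for `rand_ket`):

```python
import numpy as np

dims = [2] * 5
psi = np.random.rand(*dims) + 1j * np.random.rand(*dims)
psi /= np.linalg.norm(psi)   # normalize, like a random ket

# contract conj(psi) with psi over all five shared indices
overlap = np.tensordot(psi.conj(), psi, axes=5)
assert np.allclose(overlap, 1.0)
```

Because every index is shared, the result is a scalar - the squared norm of the state.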

## 1.7. Other TensorNetwork Features

- Insert gauges between tensors with `insert_gauge()`
- Compress bonds between tensors with `compress_between()`
- Replace subnetworks with an SVD (or other decomposition) using `replace_with_svd()`
- Take the trace with `trace()`
- Convert to dense form with `to_dense()`
- Treat a TN as a `scipy` LinearOperator with `aslinearoperator()`
- Automatically iterate over ‘cut’ bonds with `cut_iter()` to trade memory for time / parallelization
- Impose unitary/isometric constraints at the tensor level with `left_inds` and `unitize()`
- Optimize tensors in arbitrary network geometries with respect to custom loss functions using `tensorflow` or `pytorch` - see Optimizing a Tensor Network using Tensorflow
- Use parametrized tensors - `PTensor` - to represent a tensor's data as a function acting on parameters

TNs are also picklable so they can be easily saved to and loaded from disk.

## 1.8. Internal Structure

A `TensorNetwork` stores its tensors in three dictionaries, which allow them to be selected in constant time, regardless of network size, based on their `tags` and `inds`. These are:

- `TensorNetwork.tensor_map`: a mapping of unique string ids (`tids`) to each tensor
- `TensorNetwork.tag_map`: a mapping of every tag in the network to the set of `tids` of tensors which have that tag
- `TensorNetwork.ind_map`: a mapping of every index in the network to the set of `tids` of tensors which have that index

Thus the tensors with tag `'HAM'` in network `tn` would be `(tn.tensor_map[tid] for tid in tn.tag_map['HAM'])` etc. The geometry of the network can thus be completely defined by which indices appear twice, and how you label the tensors with tags in order to select them.

This allows any tagging strategy/structure to be used to place/reference/remove tensors etc. For example, the default tags a 1D tensor network uses are `('I0', 'I1', 'I2', ...)` with physical inds `('k0', 'k1', 'k2', ...)`. A 2D network might use tags `('I0J0', 'I0J1', 'I0J2', 'I1J0', ...)` etc.

To select a subset of a network, or partition it into tensors that match any or all of a set of tags, see `select()` or `partition()`.

Finally, each `Tensor` also contains a `weakref.ref` to each `TensorNetwork` that is viewing it (its `owners`), so that these maps can be updated whenever the tensor is modified directly.

## 1.9. Contraction Backend & Strategy

Tensor contractions can be performed with any backend supported by `opt_einsum`, including several which use the GPU. These are specified with the `backend` argument to `tensor_contract()` and related functions, or by setting a default backend using `set_contract_backend()` and `set_tensor_linop_backend()`.

The strategy used to find contraction orders for tensor networks is specified using the `optimize` keyword - see the `opt_einsum` docs.

There are also context managers for temporarily setting the default contraction strategy, contraction backend, and TN linear operator backend. For example the following might be useful:

```
with qtn.contract_strategy('optimal'):
    # always find the optimal contraction sequence (exponentially slow for large networks!)
    ...

with qtn.contract_backend('cupy'):
    # use the GPU array library cupy to perform any contractions
    ...

with qtn.tensor_linop_backend('tensorflow'):
    # compile any tensor linear operators into tensorflow graphs before use
    ...
```

Note

Recent versions of `opt_einsum` have support for automatically detecting the correct backend library to use. Sometimes however you want to contract `numpy` tensors *via*, for example, `cupy`, in which case this must still be specified.