2. Contraction¶
Contraction is one of the core tasks of any tensor network algorithm: turning a collection of tensors into a single tensor. It is defined as a sum of products, $\sum_{\sigma} \prod_{i} T_i(\sigma_i)$, where $\sigma$ runs over every assignment of the summed indices. Performing this sum directly is exponentially slow, since we would have to evaluate the product for every single combination of index values.
Instead, it is always cheaper to build the result out of pairwise intermediates: contracting a pair of tensors can remove their shared indices from the rest of the contraction entirely. The sequence of pairwise contractions is specified by a "contraction tree". The cost is incredibly sensitive to the choice of tree, and the space of possible trees is very large, making this a tricky problem, but one that can be automated.
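As a minimal numpy illustration (nothing quimb specific assumed), compare summing a three matrix product directly with performing two pairwise matrix multiplications:

```python
import numpy as np

A, B, C = (np.random.rand(50, 50) for _ in range(3))

# direct sum of products over every assignment of (a, b, c, d):
# ~50**4 scalar multiplications
direct = np.einsum('ab,bc,cd->ad', A, B, C, optimize=False)

# pairwise intermediates: two matmuls, ~2 * 50**3 multiplications
pairwise = (A @ B) @ C

assert np.allclose(direct, pairwise)
```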
There is a tradeoff between the time spent finding a path and the time spent actually doing the contraction. Throughout quimb this is controlled by the optimize kwarg.
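A minimal sketch of passing the kwarg when contracting a whole network (here the norm of a random MPS):

```python
import quimb.tensor as qtn

mps = qtn.MPS_rand_state(20, bond_dim=8)
tn = mps.H & mps  # the overlap <psi|psi> as a tensor network

# optimize controls how the contraction path is found
print(tn.contract(all, optimize='auto'))
```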
2.1. Contraction interfaces¶
ta @ tb contracts two tensors directly over their shared indices. Tensor networks additionally support special forms of contraction, such as tn ^ all to contract a whole network, or tn ^ 'TAG' to contract only the tensors carrying a given tag.
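For example, a sketch of the operator form for two tensors sharing an index:

```python
import quimb.tensor as qtn

ta = qtn.rand_tensor((2, 3), inds=('a', 'b'))
tb = qtn.rand_tensor((3, 4), inds=('b', 'c'))

# '@' contracts over the shared index 'b', leaving inds ('a', 'c')
tab = ta @ tb
```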
And information methods:

- TensorNetwork.contraction_info — the opt_einsum.PathInfo object.
- TensorNetwork.contraction_tree — the cotengra.ContractionTree object.
- TensorNetwork.contraction_width — the log2 of the maximum size of any intermediate tensor.
- TensorNetwork.contraction_cost — the total number of scalar operations required to perform the contraction.
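For instance, to inspect how expensive contracting a small 2D network would be without actually performing it (a sketch using the TN2D_rand builder):

```python
import quimb.tensor as qtn

tn = qtn.TN2D_rand(4, 4, D=3)  # random 4x4 square lattice network

print(tn.contraction_width())  # log2 size of the largest intermediate
print(tn.contraction_cost())   # total number of scalar operations
```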
Contraction is called within many algorithms (wherever you see the optimize kwarg).
2.2. Things you can supply to the optimize kwarg¶
2.2.1. str preset¶
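Preset strings select a built-in path finder, trading search effort for path quality. A sketch with two common presets:

```python
import quimb.tensor as qtn

mps = qtn.MPS_rand_state(10, bond_dim=4)
tn = mps.H & mps

tn.contract(all, optimize='greedy')   # cheap to find, usually decent paths
tn.contract(all, optimize='optimal')  # exhaustive search, small networks only
```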
2.2.2. opt_einsum PathOptimizer¶
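Any opt_einsum PathOptimizer instance can be passed directly, for example the stochastic RandomGreedy optimizer:

```python
import opt_einsum as oe
import quimb.tensor as qtn

mps = qtn.MPS_rand_state(10, bond_dim=4)
tn = mps.H & mps

# sample many greedy paths, keeping the best found
opt = oe.RandomGreedy(max_repeats=32)
tn.contract(all, optimize=opt)
```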
2.2.3. cotengra HyperOptimizer¶
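Similarly, a cotengra HyperOptimizer instance performs a hyper-optimized search over many candidate contraction trees (a sketch):

```python
import cotengra as ctg
import quimb.tensor as qtn

tn = qtn.TN2D_rand(5, 5, D=2)

# sample and tune many contraction trees, keeping the best
opt = ctg.HyperOptimizer(max_repeats=16)
tn.contract(all, optimize=opt)
```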
2.2.4. Explicit contraction path¶
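An explicit path is a sequence of pairs of tensor positions in opt_einsum format, with each contracted intermediate appended to the end of the list. A sketch for a four tensor chain:

```python
import quimb.tensor as qtn

tn = qtn.TensorNetwork([
    qtn.rand_tensor((2, 2), inds=(f'k{i}', f'k{i + 1}'))
    for i in range(4)
])

# contract tensors 0 and 1, then the resulting intermediates pairwise
path = [(0, 1), (0, 1), (0, 1)]
tn.contract(all, optimize=path)
```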
2.2.5. Path caching¶
Contraction paths are cached keyed on a hash of the network geometry, so repeated contractions of identically structured networks reuse the same path. Caching can also be handled within cotengra itself, including persistently to disk, as sketched below.
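For example, cotengra's ReusableHyperOptimizer caches the paths it finds and can persist them to disk (a sketch; the directory name is arbitrary):

```python
import cotengra as ctg
import quimb.tensor as qtn

# cache found paths to disk so repeated runs skip the search entirely
opt = ctg.ReusableHyperOptimizer(
    max_repeats=16,
    directory='ctg_path_cache',  # arbitrary cache location
)

tn = qtn.TN2D_rand(5, 5, D=2)
tn.contract(all, optimize=opt)  # first call: searches and stores the path
tn.contract(all, optimize=opt)  # second call: cache hit, no search
```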
2.3. Hyper edges¶
Indices that appear on more than two tensors are "hyper edges"; supporting them allows the most general einsum-style equation to be expressed and contracted.
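A sketch of a small hyper edge network; since the output indices become ambiguous once an index appears more than twice, output_inds is given explicitly:

```python
import quimb.tensor as qtn

# index 'x' appears on three tensors -> a hyper edge
ta = qtn.rand_tensor((2, 3), inds=('a', 'x'))
tb = qtn.rand_tensor((3, 4), inds=('x', 'b'))
tc = qtn.rand_tensor((3,), inds=('x',))
tn = ta & tb & tc

t = tn.contract(all, output_inds=('a', 'b'))
```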
2.4. Structured Contractions¶
Networks with a regular geometry, e.g. a 1D chain, can be contracted in a structured order, for example site by site, rather than via a generic path search.
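A sketch, assuming the default site tags ('I0', 'I1', ...) that quimb's 1D builders attach, and slice-based structured contraction:

```python
import quimb.tensor as qtn

mps = qtn.MPS_rand_state(100, bond_dim=16)
tn = mps.H & mps

# contract site by site, left to right, rather than searching for a path
result = tn ^ slice(0, 100)
```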
2.5. Approximate boundary contraction¶
Here the network is contracted approximately, coarse graining it row by row into a boundary that is compressed as it grows. Hyper edges are not supported.
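A sketch for the norm of a random PEPS, using contract_boundary with a maximum boundary bond dimension:

```python
import quimb.tensor as qtn

peps = qtn.PEPS.rand(6, 6, bond_dim=3)
norm = peps.make_norm()  # <psi|psi> as a 2D network

# absorb the lattice row by row into a boundary MPS,
# compressing it back to max_bond after each step
print(norm.contract_boundary(max_bond=32))
```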
2.6. Automatic approximate / compressed contraction¶
- TensorNetwork.contract_compressed — contract the whole network, compressing the bonds of intermediates as it goes.
- TensorNetwork.contract_around — compressed contraction of the network around a tagged region of interest.
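A sketch of the compressed entry point; max_bond caps the bond dimension of the intermediates formed along the way:

```python
import quimb.tensor as qtn

peps = qtn.PEPS.rand(6, 6, bond_dim=3)
norm = peps.make_norm()

# contract the whole network, compressing bonds that exceed max_bond
print(norm.contract_compressed(max_bond=32))
```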