15. Generic Tensor Fitting
quimb has support for fitting arbitrary tensor networks to other tensors or tensor networks.
Here we show decomposing a 4-tensor into a ring.
%config InlineBackend.figure_formats = ['svg']
import numpy as np
import quimb.tensor as qtn
Create a target 10x10x10x10 tensor with uniform positive entries:
t_target = qtn.Tensor(
    data=np.random.uniform(size=(10, 10, 10, 10)),
    inds=('a', 'b', 'c', 'd'),
)
t_target
Tensor(shape=(10, 10, 10, 10), inds=[a, b, c, d], tags={}), backend=numpy, dtype=float64, data=...

# normalize, for a better sense of how good the fit is
t_target /= t_target.norm()
The target could also be an arbitrary tensor network.
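For example, a target built from two tensors sharing a bond would work just as well (a minimal sketch; the bond name 'x' and its dimension 7 are arbitrary choices here):

tn_target = qtn.TensorNetwork([
    qtn.Tensor(np.random.uniform(size=(10, 10, 7)), inds=('a', 'b', 'x')),
    qtn.Tensor(np.random.uniform(size=(7, 10, 10)), inds=('x', 'c', 'd')),
])

Everything below would proceed identically with tn_target in place of t_target; here we continue with the dense tensor.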
Now we manually create the decomposed geometry, i.e. a ring of 4 tensors.
rank = 5
tn_guess = qtn.TensorNetwork([
    qtn.Tensor(np.random.normal(size=(10, rank, rank)), inds=('a', 'left', 'up')),
    qtn.Tensor(np.random.normal(size=(10, rank, rank)), inds=('b', 'up', 'right')),
    qtn.Tensor(np.random.normal(size=(10, rank, rank)), inds=('c', 'right', 'bottom')),
    qtn.Tensor(np.random.normal(size=(10, rank, rank)), inds=('d', 'bottom', 'left')),
])
tn_guess
TensorNetwork(tensors=4, indices=8)
Tensor(shape=(10, 5, 5), inds=[a, left, up], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[b, up, right], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[c, right, bottom], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[d, bottom, left], tags={}), backend=numpy, dtype=float64, data=...

We could have any internal structure, as long as the outer indices match (and the contraction is possible) - see the sketch below.
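For instance, an open chain (MPS-like) decomposition of the same four outer indices would be an equally valid guess (a sketch; the bond names 'k0', 'k1', 'k2' are arbitrary):

tn_chain = qtn.TensorNetwork([
    qtn.Tensor(np.random.normal(size=(10, rank)), inds=('a', 'k0')),
    qtn.Tensor(np.random.normal(size=(rank, 10, rank)), inds=('k0', 'b', 'k1')),
    qtn.Tensor(np.random.normal(size=(rank, 10, rank)), inds=('k1', 'c', 'k2')),
    qtn.Tensor(np.random.normal(size=(rank, 10)), inds=('k2', 'd')),
])

# the outer indices match the ring's, so fitting would proceed identically
assert set(tn_chain.outer_inds()) == set(tn_guess.outer_inds())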
tn_guess.draw(show_inds='all', highlight_inds=['a', 'b', 'c', 'd'])
Compute the initial distance (in terms of the Frobenius norm):
tn_guess.distance(t_target)
np.float64(2450.808525296577)
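Since the target is small, we can cross-check this value by contracting the guess down to a dense tensor and taking the Frobenius norm of the difference directly (a sanity-check sketch; only feasible because the target is small):

# contract the ring to a single dense tensor with matching index order
t_dense = tn_guess.contract(output_inds=('a', 'b', 'c', 'd'))
np.linalg.norm(t_dense.data - t_target.data)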
Perform the initial fitting using ALS (alternating least squares); see the function TensorNetwork.fit for more details:
tn_fitted = tn_guess.fit(t_target, method='als', steps=1000, progbar=True)
0.4697: 100%|██████████| 1000/1000 [00:01<00:00, 565.85it/s]
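Note that fit returns a new, fitted network by default, leaving the guess untouched; the trailing-underscore variant fit_, used below, works in place:

# the original guess is unchanged, only the returned copy has improved
tn_guess.distance(t_target), tn_fitted.distance(t_target)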
Sometimes, autodiff-based optimization can do better than ALS; see TNOptimizer for more details:
tn_fitted.fit_(t_target, method='autodiff', steps=1000, progbar=True)
+0.457260587804 [best: +0.457260587804] : 21%|██▏ | 214/1000 [00:00<00:01, 546.10it/s]
TensorNetwork(tensors=4, indices=8)
Tensor(shape=(10, 5, 5), inds=[a, left, up], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[b, up, right], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[c, right, bottom], tags={}), backend=numpy, dtype=float64, data=...
Tensor(shape=(10, 5, 5), inds=[d, bottom, left], tags={}), backend=numpy, dtype=float64, data=...

Double check the new fitted tensor network is close to the target:
tn_fitted.distance(t_target)
np.float64(0.45725182227607436)
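The autodiff fit above is roughly what you get by driving TNOptimizer by hand with the distance as the loss, which also exposes choices such as the optimizer and autodiff backend (a sketch under that assumption; the exact internals of method='autodiff' may differ):

# hypothetical manual setup: minimize the distance to the (constant) target
tnopt = qtn.TNOptimizer(
    tn_fitted,
    loss_fn=lambda tn, target: tn.distance(target),
    loss_constants={'target': t_target},
    autodiff_backend='autograd',
)
tn_refined = tnopt.optimize(100)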
Considering the target as a wavefunction, our fitted network has an overlap of:
tn_fitted @ t_target.H
0.895422046569203
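Since the target is normalized, the overlap and the distance are linked by distance**2 = ||fit||**2 + ||target||**2 - 2 * overlap, which we can verify with a dense contraction (a quick consistency check):

t_fit = tn_fitted.contract(output_inds=('a', 'b', 'c', 'd'))
np.sqrt(t_fit.norm()**2 + 1.0 - 2 * (tn_fitted @ t_target.H))  # ~= the distance above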
Note that random tensors are generally not that easy to fit, so the resulting fidelity is not especially high.