quimb.experimental.tnvmc

Tools for generic VMC optimization of tensor networks.

Attributes

Classes

NoContext

A convenience context manager that does nothing.

MovingStatistics

Keep track of the windowed mean and estimated variance of a stream of values on the fly.

DenseSampler

Sampler that explicitly constructs the full probability distribution.

DirectTNSampler

Sampler that draws configurations directly from a tensor network by computing and sampling marginals.

ClusterSampler

ExchangeSampler

HamiltonianSampler

MetropolisHastingsSampler

ComposePartial

AmplitudeFactory

GradientAccumulator

SGD

SignDescent

RandomSign

Adam

Vectorizer

Object for mapping back and forth between any pytree of mixed real/complex n-dimensional arrays and a single, real, double precision numpy vector.

StochasticReconfigureGradients

SR

SRADAM

TNVMC

Functions

default_to_neutral_style(fn)

Wrap a function or method to use the neutral style by default.

format_number_with_error(x, err)

Given x with error err, format a string showing the relevant digits of x with two significant digits of the error bracketed.

sample_bitstring_from_prob_ndarray(p, rng)

shuffled(it)

Return a copy of it in random order.

compute_amplitude(tn, config, chi, optimize)

compute_amplitudes(tn, configs, chi, optimize)

compute_local_energy(ham, tn, config, chi, optimize)

draw_config(edges, config)

auto_share_multicall(func, arrays, configs)

Call the function func, which should be an array function making use of autoray dispatched calls, multiple times, automatically reusing shared intermediates.

get_compose_partial(f, f_args, f_kwargs, g)

fuse_unary_ops_(Z)

Module Contents

quimb.experimental.tnvmc.default_to_neutral_style(fn)

Wrap a function or method to use the neutral style by default.

quimb.experimental.tnvmc.format_number_with_error(x, err)

Given x with error err, format a string showing the relevant digits of x with two significant digits of the error bracketed, and overall exponent if necessary.

Parameters:
  • x (float) – The value to print.

  • err (float) – The error on x.

Return type:

str

Examples

>>> format_number_with_error(0.1542412, 0.0626653)
'0.154(63)'
>>> format_number_with_error(-128124123097, 6424)
'-1.281241231(64)e+11'
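The bracketed-error convention above can be sketched in a few lines. This is a simplified, hypothetical re-implementation (it ignores the overall-exponent case shown in the second example), not the library's actual function:

```python
import math

def fmt_with_err(x, err):
    # Simplified sketch of the bracketed-error convention: show x to the
    # precision of the error, with the error's two leading significant
    # digits in brackets. The real function also switches to an overall
    # exponent for very large/small values.
    e = math.floor(math.log10(err))          # order of magnitude of the error
    ndec = max(1 - e, 0)                     # decimals needed for 2 error digits
    err_digits = round(err / 10 ** (e - 1))  # error's two leading digits
    return f"{x:.{ndec}f}({err_digits})"

print(fmt_with_err(0.1542412, 0.0626653))  # -> 0.154(63)
```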
quimb.experimental.tnvmc.sample_bitstring_from_prob_ndarray(p, rng)
quimb.experimental.tnvmc.shuffled(it)

Return a copy of it in random order.

class quimb.experimental.tnvmc.NoContext

A convenience context manager that does nothing.

__enter__()
__exit__(*_, **__)
class quimb.experimental.tnvmc.MovingStatistics(window_size)

Keep track of the windowed mean and estimated variance of a stream of values on the fly.

update(x)
property mean
property var
property std
property err
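The windowed-statistics idea can be illustrated with a small pure-Python stand-in (`WindowedStats` is a hypothetical sketch, not quimb's `MovingStatistics` itself):

```python
from collections import deque
import math

class WindowedStats:
    # Sketch of windowed streaming statistics: keep only the last
    # `window_size` values and expose mean, variance, std, and the
    # standard error of the mean over that window.
    def __init__(self, window_size):
        self.xs = deque(maxlen=window_size)

    def update(self, x):
        self.xs.append(x)

    @property
    def mean(self):
        return sum(self.xs) / len(self.xs)

    @property
    def var(self):
        m = self.mean
        return sum((x - m) ** 2 for x in self.xs) / len(self.xs)

    @property
    def std(self):
        return math.sqrt(self.var)

    @property
    def err(self):
        # standard error of the (windowed) mean
        return self.std / math.sqrt(len(self.xs))

ws = WindowedStats(window_size=3)
for x in [1.0, 2.0, 3.0, 4.0]:
    ws.update(x)
# only the last three values (2, 3, 4) remain in the window
```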
class quimb.experimental.tnvmc.DenseSampler(psi=None, seed=None, **contract_opts)

Sampler that explicitly constructs the full probability distribution. Useful for debugging small problems.

_set_psi(psi)
sample()
update(**kwargs)
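The dense approach is simple to picture: enumerate every configuration, weight each by its Born probability, and sample from that exact distribution. A toy four-configuration sketch (hypothetical amplitudes, pure Python; the real sampler obtains amplitudes by contracting the network):

```python
import random

# toy "wavefunction": amplitudes for every 2-site bitstring
amps = {
    (0, 0): 0.5,
    (0, 1): 0.5,
    (1, 0): 0.5,
    (1, 1): -0.5,
}

# Born rule: probabilities are the squared (modulus of the) amplitudes
probs = {cfg: a * a for cfg, a in amps.items()}
norm = sum(probs.values())

rng = random.Random(42)
configs = list(probs)
weights = [probs[c] / norm for c in configs]
samples = rng.choices(configs, weights=weights, k=1000)

# each configuration should appear with frequency ~ 0.25 here
freq = samples.count((0, 0)) / len(samples)
```

This is only feasible for small systems, since the distribution has one entry per configuration, hence the "useful for debugging small problems" note.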
class quimb.experimental.tnvmc.DirectTNSampler(tn, sweeps=1, max_group_size=8, chi=None, optimize=None, optimize_share_path=False, seed=None, track=False)
Parameters:
  • tn (TensorNetwork) – The tensor network to sample from.

  • sweeps (int, optional) – The number of sweeps to perform.

  • max_group_size (int, optional) – The maximum number of sites to include in a single marginal.

  • chi (int, optional) – The maximum bond dimension to use for compressed contraction.

  • optimize (PathOptimizer, optional) – The path optimizer to use.

  • optimize_share_path (bool, optional) – If True, a single path will be used for all contractions regardless of which marginal (i.e. which indices are open) is being computed.

plot()
calc_groups(**kwargs)

Calculate how to group the sites into marginals.

get_groups()
calc_path()
get_path()
get_optimize()
contract(tn, output_inds)
sample()
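The grouping step can be sketched as partitioning sites into chunks no larger than max_group_size, each of which is then sampled as one joint marginal. `calc_groups` below is a hypothetical helper illustrating the simplest (contiguous) strategy; the actual method may group sites by graph locality instead:

```python
def calc_groups(sites, max_group_size):
    # Sketch: partition sites into contiguous groups of at most
    # `max_group_size`, each treated as a single joint marginal.
    return [
        tuple(sites[i:i + max_group_size])
        for i in range(0, len(sites), max_group_size)
    ]

groups = calc_groups(list(range(10)), max_group_size=4)
# -> [(0, 1, 2, 3), (4, 5, 6, 7), (8, 9)]
```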
quimb.experimental.tnvmc.compute_amplitude(tn, config, chi, optimize)
quimb.experimental.tnvmc.compute_amplitudes(tn, configs, chi, optimize)
quimb.experimental.tnvmc.compute_local_energy(ham, tn, config, chi, optimize)
quimb.experimental.tnvmc.draw_config(edges, config)
class quimb.experimental.tnvmc.ClusterSampler(psi=None, max_distance=1, use_gauges=True, seed=None, contract_opts=None)
_set_psi(psi)
sample()
candidate
accept(config)
update(**kwargs)
class quimb.experimental.tnvmc.ExchangeSampler(edges, seed=None)
candidate()
accept(config)
sample()
update(**_)
class quimb.experimental.tnvmc.HamiltonianSampler(ham, seed=None)
candidate()
accept(config)
sample()
update(**_)
class quimb.experimental.tnvmc.MetropolisHastingsSampler(sub_sampler, amplitude_factory=None, initial=None, burn_in=0, seed=None, track=False)
property acceptance_ratio
sample()
update(**kwargs)
plot()
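The core accept/reject logic of a Metropolis-Hastings wavefunction sampler can be sketched as follows, assuming a symmetric proposal (all names here are illustrative, not the class's internals):

```python
import random

def mh_step(config, amplitude, propose, rng):
    # One Metropolis-Hastings step (sketch, symmetric proposal assumed):
    # accept with probability min(1, |psi(new)|^2 / |psi(old)|^2).
    new = propose(config, rng)
    ratio = (amplitude(new) / amplitude(config)) ** 2
    if rng.random() < min(1.0, ratio):
        return new, True
    return config, False

# toy wavefunction on 2 sites: (1, 1) has 4x the probability of the rest
def amp(cfg):
    return 2.0 if cfg == (1, 1) else 1.0

def flip_one(cfg, rng):
    # symmetric proposal: flip one randomly chosen bit
    i = rng.randrange(len(cfg))
    new = list(cfg)
    new[i] ^= 1
    return tuple(new)

rng = random.Random(0)
cfg, accepted, hits = (0, 0), 0, 0
for _ in range(2000):
    cfg, ok = mh_step(cfg, amp, flip_one, rng)
    accepted += ok
    hits += cfg == (1, 1)
# the chain should visit (1, 1) with frequency near 4/7
```

The burn_in and track options then simply discard the first few samples and record the acceptance_ratio diagnostics around this loop.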
quimb.experimental.tnvmc.auto_share_multicall(func, arrays, configs)

Call the function func, which should be an array function making use of autoray dispatched calls, multiple times, automatically reusing shared intermediates.
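The payoff of reusing shared intermediates can be shown with a toy memoisation sketch: configurations sharing a prefix reuse one cached "partial contraction". This is an illustration of the idea only, not how auto_share_multicall is implemented (it intercepts autoray dispatched calls):

```python
from functools import cache

calls = {"expensive": 0}

@cache
def shared_intermediate(prefix):
    # stands in for a partially contracted network common to many configs
    calls["expensive"] += 1
    return sum(prefix)

def amplitude(config):
    # configs sharing the same 2-site prefix reuse the cached intermediate
    return shared_intermediate(config[:2]) + config[-1]

results = [amplitude(c) for c in [(0, 1, 0), (0, 1, 1), (1, 0, 0)]]
# three amplitudes, but only two expensive prefix evaluations
```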

class quimb.experimental.tnvmc.ComposePartial(f, f_args, f_kwargs, g)
__slots__ = ('f', 'f_args', 'f_kwargs', 'g')
__call__(*args, **kwargs)
quimb.experimental.tnvmc._partial_compose_cache
quimb.experimental.tnvmc.get_compose_partial(f, f_args, f_kwargs, g)
quimb.experimental.tnvmc.fuse_unary_ops_(Z)
class quimb.experimental.tnvmc.AmplitudeFactory(psi=None, contract_fn=None, maxsize=2**20, autojit_opts=(), **contract_opts)
_set_psi(psi)
compute_single(config)

Compute the amplitude of config, making use of autojit.

compute_multi(configs)

Compute the amplitudes corresponding to the sequence configs, making use of shared intermediates.

amplitude(config)

Get the amplitude of config, either from the cache or by computing it.

amplitudes(configs)
prob(config)

Calculate the probability of a configuration.

clear()
__contains__(config)
__setitem__(config, c)
__getitem__(config)
__repr__()

Return repr(self).
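The caching pattern behind this class can be sketched in plain Python. `CachedAmplitudes` and `compute` below are hypothetical stand-ins (the real class also autojit-compiles the contraction and shares intermediates across batched configs):

```python
class CachedAmplitudes:
    # Sketch: memoise amplitudes per configuration so repeated Monte
    # Carlo visits to the same configuration never recontract the
    # network. `compute` stands in for the tensor network contraction.
    def __init__(self, compute, maxsize=2**20):
        self.compute = compute
        self.cache = {}
        self.maxsize = maxsize

    def amplitude(self, config):
        if config not in self.cache:
            if len(self.cache) >= self.maxsize:
                self.cache.clear()  # crude eviction, for the sketch only
            self.cache[config] = self.compute(config)
        return self.cache[config]

    def prob(self, config):
        # unnormalised Born probability |psi(config)|^2
        return abs(self.amplitude(config)) ** 2

calls = []
fac = CachedAmplitudes(lambda cfg: calls.append(cfg) or sum(cfg) - 1.0)
a1 = fac.amplitude((0, 1, 0))
a2 = fac.amplitude((0, 1, 0))  # cache hit: no recomputation
```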

class quimb.experimental.tnvmc.GradientAccumulator
_init_storage(grads)
update(grads_logpsi_sample, local_energy)
extract_grads_energy()
class quimb.experimental.tnvmc.SGD(learning_rate=0.01)

Bases: GradientAccumulator

transform_gradients()
class quimb.experimental.tnvmc.SignDescent(learning_rate=0.01)

Bases: GradientAccumulator

transform_gradients()
class quimb.experimental.tnvmc.RandomSign(learning_rate=0.01)

Bases: GradientAccumulator

transform_gradients()
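The simple gradient transforms can be sketched directly (illustrative functions, not the classes' internals; RandomSign is a stochastic variant of the sign rule and is omitted here):

```python
def sgd(grads, lr=0.01):
    # SGD: step proportional to each gradient component
    return [lr * g for g in grads]

def sign_descent(grads, lr=0.01):
    # SignDescent: keep only the sign of each gradient component,
    # so every parameter moves by the same fixed magnitude lr
    return [lr * ((g > 0) - (g < 0)) for g in grads]
```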
class quimb.experimental.tnvmc.Adam(learning_rate=0.01, beta1=0.9, beta2=0.999, eps=1e-08)

Bases: GradientAccumulator

transform_gradients()
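The Adam rule maintains exponential moving averages of the gradient and its square, giving a per-parameter adaptive step. A minimal sketch of one update, using the default hyperparameters listed above (state handling here is illustrative, not the class's):

```python
import math

def adam_transform(grads, m, v, t, learning_rate=0.01,
                   beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam step (sketch): bias-corrected moving averages of the
    # gradient (m) and squared gradient (v) set an adaptive step size.
    deltas = []
    for i, g in enumerate(grads):
        m[i] = beta1 * m[i] + (1 - beta1) * g
        v[i] = beta2 * v[i] + (1 - beta2) * g * g
        mhat = m[i] / (1 - beta1 ** t)   # bias correction
        vhat = v[i] / (1 - beta2 ** t)
        deltas.append(learning_rate * mhat / (math.sqrt(vhat) + eps))
    return deltas

m, v = [0.0], [0.0]
step = adam_transform([0.5], m, v, t=1)
# the first step is approximately learning_rate * sign(gradient)
```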
class quimb.experimental.tnvmc.Vectorizer(tree)

Object for mapping back and forth between any pytree of mixed real/complex n-dimensional arrays and a single, real, double precision numpy vector, as required by scipy.optimize routines.

Parameters:
  • tree (pytree of array) – Any nested container of arrays, which will be flattened and packed into a single float64 vector.

  • is_leaf (callable, optional) – A function which takes a single argument and returns True if it is a leaf node in the tree and should be extracted, False otherwise. Defaults to everything that is not a tuple, list or dict.

pack(tree, name='vector')

Take arrays and pack their values into attribute .{name}, by default .vector.

unpack(vector=None)

Turn the single, flat vector into a sequence of arrays.
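The pack/unpack round trip can be sketched in pure Python (a stand-in for the numpy-based class, with lists of numbers as leaves): complex leaves are split into (real, imag) pairs so the flat vector is entirely real.

```python
def pack(tree):
    # flatten a list of leaves (each a list of numbers) into one real
    # vector, recording each leaf's length and real/complex flag
    flat, spec = [], []
    for leaf in tree:
        iscomplex = any(isinstance(x, complex) for x in leaf)
        spec.append((len(leaf), iscomplex))
        for x in leaf:
            if iscomplex:
                x = complex(x)
                flat.extend((x.real, x.imag))
            else:
                flat.append(float(x))
    return flat, spec

def unpack(flat, spec):
    # invert pack: rebuild each leaf from the flat real vector
    tree, i = [], 0
    for n, iscomplex in spec:
        if iscomplex:
            leaf = [complex(flat[i + 2 * k], flat[i + 2 * k + 1])
                    for k in range(n)]
            i += 2 * n
        else:
            leaf = flat[i:i + n]
            i += n
        tree.append(leaf)
    return tree

tree = [[1.0, 2.0], [1 + 2j, 3 - 1j]]
flat, spec = pack(tree)  # flat == [1.0, 2.0, 1.0, 2.0, 3.0, -1.0]
```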

class quimb.experimental.tnvmc.StochasticReconfigureGradients(delta=1e-05)
update(grads_logpsi_sample, local_energy)
extract_grads_energy()
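Stochastic reconfiguration preconditions the raw energy gradient with the covariance of the log-derivatives O of the amplitude, solving the regularised system (S + delta·I)·dθ = g where S = ⟨O*O⟩ − ⟨O*⟩⟨O⟩ and g = ⟨O*E_loc⟩ − ⟨O*⟩⟨E_loc⟩. For a single real parameter this is a scalar equation, which makes for a compact sketch (illustrative function, not the class's implementation):

```python
def sr_update(O_samples, E_samples, delta=1e-5):
    # Single-parameter stochastic reconfiguration (sketch):
    #   S = <O O> - <O><O>      (metric / covariance of log-derivatives)
    #   g = <O E> - <O><E>      (energy gradient estimate)
    #   dtheta = g / (S + delta)  (regularised solve, scalar case)
    n = len(O_samples)
    O_mean = sum(O_samples) / n
    E_mean = sum(E_samples) / n
    S = sum(o * o for o in O_samples) / n - O_mean * O_mean
    g = sum(o * e for o, e in zip(O_samples, E_samples)) / n - O_mean * E_mean
    return g / (S + delta)
```

In the many-parameter case S is a matrix and the division becomes a regularised linear solve; delta plays the same stabilising role.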
class quimb.experimental.tnvmc.SR(learning_rate=0.05, delta=1e-05)

Bases: SGD, StochasticReconfigureGradients

class quimb.experimental.tnvmc.SRADAM(learning_rate=0.01, beta1=0.9, beta2=0.999, eps=1e-08, delta=1e-05)

Bases: Adam, StochasticReconfigureGradients

class quimb.experimental.tnvmc.TNVMC(psi, ham, sampler, conditioner='auto', learning_rate=0.01, optimizer='adam', optimizer_opts=None, track_window_size=1000, **contract_opts)
_compute_log_gradients_torch(config)
_compute_local_energy(config)
_run(steps, batchsize)
run(total=10000, batchsize=100, progbar=True)
measure(max_samples=10000, rtol=0.0001, progbar=True)
plot(figsize=(12, 6), yrange_quantile=(0.01, 0.99), zoom='auto', hlines=())
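The measure loop's stopping logic (sample until the standard error of the mean energy is small relative to the mean, or the budget runs out) can be sketched as follows. `draw_energy` is a hypothetical stand-in for evaluating the local energy of one sampled configuration:

```python
import math
import random

def measure(draw_energy, max_samples=10000, rtol=1e-4):
    # Sketch of sampling-until-converged: accumulate local energies
    # until the standard error of the mean drops below rtol * |mean|,
    # or max_samples is reached.
    xs = []
    while len(xs) < max_samples:
        xs.append(draw_energy())
        n = len(xs)
        if n < 2:
            continue
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / (n - 1)
        err = math.sqrt(var / n)
        if abs(mean) > 0 and err < rtol * abs(mean):
            break
    return mean, err

rng = random.Random(1)
mean, err = measure(lambda: rng.gauss(-1.0, 0.05),
                    max_samples=5000, rtol=0.01)
```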