quimb.experimental.tnvmc
========================

.. py:module:: quimb.experimental.tnvmc

.. autoapi-nested-parse::

   Tools for generic VMC optimization of tensor networks.


Attributes
----------

.. autoapisummary::

   quimb.experimental.tnvmc._partial_compose_cache


Classes
-------

.. autoapisummary::

   quimb.experimental.tnvmc.NoContext
   quimb.experimental.tnvmc.MovingStatistics
   quimb.experimental.tnvmc.DenseSampler
   quimb.experimental.tnvmc.DirectTNSampler
   quimb.experimental.tnvmc.ClusterSampler
   quimb.experimental.tnvmc.ExchangeSampler
   quimb.experimental.tnvmc.HamiltonianSampler
   quimb.experimental.tnvmc.MetropolisHastingsSampler
   quimb.experimental.tnvmc.ComposePartial
   quimb.experimental.tnvmc.AmplitudeFactory
   quimb.experimental.tnvmc.GradientAccumulator
   quimb.experimental.tnvmc.SGD
   quimb.experimental.tnvmc.SignDescent
   quimb.experimental.tnvmc.RandomSign
   quimb.experimental.tnvmc.Adam
   quimb.experimental.tnvmc.StochasticReconfigureGradients
   quimb.experimental.tnvmc.SR
   quimb.experimental.tnvmc.SRADAM
   quimb.experimental.tnvmc.TNVMC


Functions
---------

.. autoapisummary::

   quimb.experimental.tnvmc.sample_bitstring_from_prob_ndarray
   quimb.experimental.tnvmc.shuffled
   quimb.experimental.tnvmc.compute_amplitude
   quimb.experimental.tnvmc.compute_amplitudes
   quimb.experimental.tnvmc.compute_local_energy
   quimb.experimental.tnvmc.draw_config
   quimb.experimental.tnvmc.auto_share_multicall
   quimb.experimental.tnvmc.get_compose_partial
   quimb.experimental.tnvmc.fuse_unary_ops_


Module Contents
---------------

.. py:function:: sample_bitstring_from_prob_ndarray(p, rng)

.. py:function:: shuffled(it)

   Return a copy of ``it`` in random order.

.. py:class:: NoContext

   A convenience context manager that does nothing.

   .. py:method:: __enter__()

   .. py:method:: __exit__(*_, **__)

.. py:class:: MovingStatistics(window_size)

   Keep track of the windowed mean and estimated variance of a stream of
   values on the fly.

   .. py:attribute:: window_size

   .. py:attribute:: xs
      :value: []

   .. py:attribute:: vs
      :value: []

   .. py:attribute:: _xsum
      :value: 0.0

   .. py:attribute:: _vsum
      :value: 0.0

   .. py:method:: update(x)

   .. py:property:: mean

   .. py:property:: var

   .. py:property:: std

   .. py:property:: err
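
   A minimal usage sketch, assuming the tracked values are plain floats (the
   noisy stream below is generated with NumPy purely for illustration):

   .. code-block:: python

      import numpy as np

      from quimb.experimental.tnvmc import MovingStatistics

      # track statistics over a sliding window of the last 100 values
      stats = MovingStatistics(window_size=100)

      rng = np.random.default_rng(seed=42)
      for _ in range(1000):
          # feed a stream of noisy scalar estimates into the window
          stats.update(-1.0 + 0.1 * rng.standard_normal())

      # windowed mean, variance, standard deviation and error estimate
      print(stats.mean, stats.var, stats.std, stats.err)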

.. py:class:: DenseSampler(psi=None, seed=None, **contract_opts)

   Sampler that explicitly constructs the full probability distribution.
   Useful for debugging small problems.

   .. py:attribute:: contract_opts

   .. py:attribute:: rng

   .. py:method:: _set_psi(psi)

   .. py:method:: sample()

   .. py:method:: update(**kwargs)

.. py:class:: DirectTNSampler(tn, sweeps=1, max_group_size=8, chi=None, optimize=None, optimize_share_path=False, seed=None, track=False)

   :param tn: The tensor network to sample from.
   :type tn: TensorNetwork
   :param sweeps: The number of sweeps to perform.
   :type sweeps: int, optional
   :param max_group_size: The maximum number of sites to include in a single marginal.
   :type max_group_size: int, optional
   :param chi: The maximum bond dimension to use for compressed contraction.
   :type chi: int, optional
   :param optimize: The path optimizer to use.
   :type optimize: PathOptimizer, optional
   :param optimize_share_path: If ``True``, a single path will be used for all contractions, regardless of which marginal (i.e. which indices are open) is being computed.
   :type optimize_share_path: bool, optional

   .. py:attribute:: tn

   .. py:attribute:: ind2site

   .. py:attribute:: tid2ind

   .. py:attribute:: chi

   .. py:attribute:: sweeps

   .. py:attribute:: max_group_size

   .. py:attribute:: optimize

   .. py:attribute:: optimize_share_path

   .. py:attribute:: groups
      :value: None

   .. py:attribute:: tree
      :value: None

   .. py:attribute:: path
      :value: None

   .. py:attribute:: rng

   .. py:attribute:: track

   .. py:method:: plot()

   .. py:method:: calc_groups(**kwargs)

      Calculate how to group the sites into marginals.

   .. py:method:: get_groups()

   .. py:method:: calc_path()

   .. py:method:: get_path()

   .. py:method:: get_optimize()

   .. py:method:: contract(tn, output_inds)

   .. py:method:: sample()

.. py:function:: compute_amplitude(tn, config, chi, optimize)

.. py:function:: compute_amplitudes(tn, configs, chi, optimize)

.. py:function:: compute_local_energy(ham, tn, config, chi, optimize)

.. py:function:: draw_config(edges, config)

.. py:class:: ClusterSampler(psi=None, max_distance=1, use_gauges=True, seed=None, contract_opts=None)

   .. py:attribute:: rng

   .. py:attribute:: use_gauges

   .. py:attribute:: max_distance

   .. py:attribute:: contract_opts

   .. py:method:: _set_psi(psi)

   .. py:method:: sample()

   .. py:attribute:: candidate

   .. py:method:: accept(config)

   .. py:method:: update(**kwargs)

.. py:class:: ExchangeSampler(edges, seed=None)

   .. py:attribute:: edges

   .. py:attribute:: Ne

   .. py:attribute:: sites

   .. py:attribute:: N

   .. py:attribute:: rng

   .. py:attribute:: config

   .. py:method:: candidate()

   .. py:method:: accept(config)

   .. py:method:: sample()

   .. py:method:: update(**_)

.. py:class:: HamiltonianSampler(ham, seed=None)

   .. py:attribute:: ham

   .. py:attribute:: rng

   .. py:attribute:: N

   .. py:attribute:: config

   .. py:method:: candidate()

   .. py:method:: accept(config)

   .. py:method:: sample()

   .. py:method:: update(**_)

.. py:class:: MetropolisHastingsSampler(sub_sampler, amplitude_factory=None, initial=None, burn_in=0, seed=None, track=False)

   .. py:attribute:: sub_sampler

   .. py:attribute:: seed

   .. py:attribute:: rng

   .. py:attribute:: accepted
      :value: 0

   .. py:attribute:: total
      :value: 0

   .. py:attribute:: burn_in

   .. py:attribute:: track

   .. py:property:: acceptance_ratio

   .. py:method:: sample()

   .. py:method:: update(**kwargs)

   .. py:method:: plot()

.. py:function:: auto_share_multicall(func, arrays, configs)

   Call the function ``func``, which should be an array function making use
   of autoray-dispatched calls, multiple times, automatically reusing shared
   intermediates.

.. py:class:: ComposePartial(f, f_args, f_kwargs, g)

   .. py:attribute:: __slots__
      :value: ('f', 'f_args', 'f_kwargs', 'g')

   .. py:attribute:: f

   .. py:attribute:: f_args

   .. py:attribute:: f_kwargs

   .. py:attribute:: g

   .. py:method:: __call__(*args, **kwargs)

.. py:data:: _partial_compose_cache

.. py:function:: get_compose_partial(f, f_args, f_kwargs, g)

.. py:function:: fuse_unary_ops_(Z)

.. py:class:: AmplitudeFactory(psi=None, contract_fn=None, maxsize=2**20, autojit_opts=(), **contract_opts)

   .. py:attribute:: contract_fn

   .. py:attribute:: contract_opts

   .. py:attribute:: autojit_opts

   .. py:attribute:: store

   .. py:attribute:: hits
      :value: 0

   .. py:attribute:: queries
      :value: 0

   .. py:method:: _set_psi(psi)

   .. py:method:: compute_single(config)

      Compute the amplitude of ``config``, making use of autojit.

   .. py:method:: compute_multi(configs)

      Compute the amplitudes corresponding to the sequence ``configs``,
      making use of shared intermediates.

   .. py:method:: amplitude(config)

      Get the amplitude of ``config``, either from the cache or by
      computing it.

   .. py:method:: amplitudes(configs)

   .. py:method:: prob(config)

      Calculate the probability of a configuration.

   .. py:method:: clear()

   .. py:method:: __contains__(config)

   .. py:method:: __setitem__(config, c)

   .. py:method:: __getitem__(config)

   .. py:method:: __repr__()
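
   A hedged sketch of amplitude caching, assuming ``psi`` is a tensor network
   wavefunction such as a small random PEPS (built here with
   ``quimb.tensor.PEPS.rand``, which lives outside this module) and that a
   configuration is a flat tuple of local occupations, one per site; both
   points are illustrative assumptions rather than documented behaviour:

   .. code-block:: python

      import quimb.tensor as qtn

      from quimb.experimental.tnvmc import AmplitudeFactory

      # assumption: a small 3x3 random PEPS as the trial wavefunction
      psi = qtn.PEPS.rand(3, 3, bond_dim=2, phys_dim=2, seed=42)

      # lazily computes and caches amplitudes of configurations of ``psi``
      amp_fac = AmplitudeFactory(psi)

      # assumption: one local occupation per site, in site order
      config = (0, 1, 0, 1, 0, 1, 0, 1, 0)

      c = amp_fac.amplitude(config)  # computed, then stored in the cache
      p = amp_fac.prob(config)       # probability of the configuration
      print(c, p, amp_fac.hits, amp_fac.queries)

   Repeated lookups of the same configuration are served from ``store``,
   which is presumably what the ``hits`` and ``queries`` counters track.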

.. py:class:: GradientAccumulator

   .. py:attribute:: _grads_logpsi
      :value: None

   .. py:attribute:: _grads_energy
      :value: None

   .. py:attribute:: _batch_energy
      :value: None

   .. py:attribute:: _num_samples
      :value: 0

   .. py:method:: _init_storage(grads)

   .. py:method:: update(grads_logpsi_sample, local_energy)

   .. py:method:: extract_grads_energy()

.. py:class:: SGD(learning_rate=0.01)

   Bases: :py:obj:`GradientAccumulator`

   .. py:attribute:: learning_rate

   .. py:method:: transform_gradients()

.. py:class:: SignDescent(learning_rate=0.01)

   Bases: :py:obj:`GradientAccumulator`

   .. py:attribute:: learning_rate

   .. py:method:: transform_gradients()

.. py:class:: RandomSign(learning_rate=0.01)

   Bases: :py:obj:`GradientAccumulator`

   .. py:attribute:: learning_rate

   .. py:method:: transform_gradients()

.. py:class:: Adam(learning_rate=0.01, beta1=0.9, beta2=0.999, eps=1e-08)

   Bases: :py:obj:`GradientAccumulator`

   .. py:attribute:: learning_rate

   .. py:attribute:: beta1

   .. py:attribute:: beta2

   .. py:attribute:: eps

   .. py:attribute:: _num_its
      :value: 0

   .. py:attribute:: _ms
      :value: None

   .. py:attribute:: _vs
      :value: None

   .. py:method:: transform_gradients()

.. py:class:: StochasticReconfigureGradients(delta=1e-05)

   .. py:attribute:: delta

   .. py:attribute:: vectorizer
      :value: None

   .. py:attribute:: gs
      :value: []

   .. py:method:: update(grads_logpsi_sample, local_energy)

   .. py:method:: extract_grads_energy()

.. py:class:: SR(learning_rate=0.05, delta=1e-05)

   Bases: :py:obj:`SGD`, :py:obj:`StochasticReconfigureGradients`

.. py:class:: SRADAM(learning_rate=0.01, beta1=0.9, beta2=0.999, eps=1e-08, delta=1e-05)

   Bases: :py:obj:`Adam`, :py:obj:`StochasticReconfigureGradients`

.. py:class:: TNVMC(psi, ham, sampler, conditioner='auto', learning_rate=0.01, optimizer='adam', optimizer_opts=None, track_window_size=1000, **contract_opts)

   .. py:attribute:: psi

   .. py:attribute:: ham

   .. py:attribute:: sampler

   .. py:attribute:: optimizer

   .. py:attribute:: contract_opts

   .. py:attribute:: amplitude_factory

   .. py:attribute:: moving_stats

   .. py:attribute:: local_energies

   .. py:attribute:: energies

   .. py:attribute:: energy_errors

   .. py:attribute:: num_tensors

   .. py:attribute:: nsites

   .. py:attribute:: _progbar
      :value: None

   .. py:method:: _compute_log_gradients_torch(config)

   .. py:method:: _compute_local_energy(config)

   .. py:method:: _run(steps, batchsize)

   .. py:method:: run(total=10000, batchsize=100, progbar=True)

   .. py:method:: measure(max_samples=10000, rtol=0.0001, progbar=True)

   .. py:method:: plot(figsize=(12, 6), yrange_quantile=(0.01, 0.99), zoom='auto', hlines=())
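
   An end-to-end sketch of driving the optimizer. Everything outside this
   module is an assumption chosen for illustration: ``psi`` is a random PEPS
   from ``quimb.tensor.PEPS.rand``, ``ham`` is built with
   ``quimb.tensor.ham_2d_heis`` as a stand-in for whatever Hamiltonian object
   the driver expects, the samplers are assumed to accept coordinate-labelled
   sites, and the ``max_bond`` keyword is assumed to be forwarded via
   ``**contract_opts``:

   .. code-block:: python

      import quimb.tensor as qtn

      from quimb.experimental.tnvmc import (
          ExchangeSampler,
          MetropolisHastingsSampler,
          TNVMC,
      )

      Lx = Ly = 3

      # assumption: PEPS ansatz and 2D Heisenberg Hamiltonian placeholders
      psi = qtn.PEPS.rand(Lx, Ly, bond_dim=2, seed=42)
      ham = qtn.ham_2d_heis(Lx, Ly)

      # nearest-neighbour edges of the square lattice, written out by hand
      edges = [
          ((i, j), (i, j + 1)) for i in range(Lx) for j in range(Ly - 1)
      ] + [
          ((i, j), (i + 1, j)) for i in range(Lx - 1) for j in range(Ly)
      ]

      # exchange moves filtered through a Metropolis-Hastings chain
      sampler = MetropolisHastingsSampler(
          ExchangeSampler(edges, seed=42),
          burn_in=100,
      )

      tnvmc = TNVMC(
          psi,
          ham,
          sampler,
          learning_rate=0.01,
          optimizer="adam",
          max_bond=16,  # assumption: a contraction option
      )

      tnvmc.run(total=1000, batchsize=100)  # stochastic optimization steps
      tnvmc.measure(max_samples=2000)       # estimate the optimized energy
      tnvmc.plot()                          # energy trace over the run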