Stochastic Loop Corrections to Belief Propagation for Tensor Network Contraction
This paper introduces a hybrid stochastic method for tensor network contraction that samples loop corrections to Belief Propagation via Markov chain Monte Carlo. The approach eliminates the systematic error that Belief Propagation incurs on loopy graphs, yielding unbiased estimates of the exact contraction value with statistical error that is controllable across all parameter regimes.
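To make the notion of an unbiased stochastic estimate of a loopy contraction concrete, the sketch below contracts a minimal loopy tensor network (four matrices on a cycle) both exactly and by uniform Monte Carlo sampling over index configurations. This is an illustrative toy, not the paper's BP-plus-loop-correction algorithm; all names and parameters here are hypothetical.

```python
import numpy as np

# Toy example: unbiased Monte Carlo estimation of a small loopy
# tensor network contraction (a 4-tensor cycle). Illustrative only;
# it does not implement the paper's method.
rng = np.random.default_rng(0)
d = 2
A, B, C, D = (rng.random((d, d)) for _ in range(4))

# Exact contraction: Z = sum_{ijkl} A[i,j] B[j,k] C[k,l] D[l,i] = Tr(ABCD)
exact = np.trace(A @ B @ C @ D)

# Unbiased estimator: draw index tuples uniformly; each sampled term,
# scaled by the number of configurations d**4, has expectation Z.
n = 200_000
idx = rng.integers(0, d, size=(n, 4))
i, j, k, l = idx.T
samples = d**4 * A[i, j] * B[j, k] * C[k, l] * D[l, i]
estimate = samples.mean()
stderr = samples.std(ddof=1) / np.sqrt(n)  # statistical error shrinks as 1/sqrt(n)

print(exact, estimate, stderr)
```

Because the estimator is unbiased, its only error is statistical and can be driven down by drawing more samples; the paper's contribution is to apply this kind of sampling to the loop corrections on top of a Belief Propagation baseline rather than to the raw contraction.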