Expected Kullback-Leibler-based characterizations of score-driven updates
This paper establishes that score-driven updates are uniquely characterized by their ability to reduce the expected Kullback-Leibler (KL) divergence relative to the true data-generating density. This information-theoretic foundation remains valid even in non-concave, multivariate, and misspecified settings where alternative performance measures fail.
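The core claim can be illustrated with a minimal Monte Carlo sketch, not drawn from the paper itself: for a Gaussian location model with unit variance, a score-driven update `f1 = f0 + a * (y - f0)` (the score of the log-likelihood in the mean parameter, with a hypothetical learning rate `a`) reduces the expected KL divergence from the true density `N(mu_true, 1)` to the filtered density `N(f, 1)`, which here equals `(f - mu_true)^2 / 2`.

```python
import numpy as np

rng = np.random.default_rng(0)

mu_true = 0.0  # mean of the true data-generating density N(mu_true, 1)
f0 = 1.0       # initial filtered parameter (model mean), deliberately off
a = 0.1        # score-update step size (hypothetical choice)

def kl_gauss(f, mu=mu_true):
    # KL( N(mu, 1) || N(f, 1) ) = (f - mu)^2 / 2
    return 0.5 * (f - mu) ** 2

# Draws from the true density, then one score-driven update per draw.
y = rng.normal(mu_true, 1.0, size=200_000)
score = y - f0          # d/df log N(y | f, 1) evaluated at f = f0
f1 = f0 + a * score     # score-driven update of the filtered parameter

kl_before = kl_gauss(f0)          # exactly 0.5 here
kl_after = kl_gauss(f1).mean()    # Monte Carlo estimate of the expected KL
print(kl_before, kl_after)
```

In closed form, the expected post-update divergence is `((1 - a)^2 (f0 - mu_true)^2 + a^2) / 2 = 0.41 < 0.5`, so the update reduces the expected KL divergence in this toy setting, consistent with the characterization the abstract describes.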