On the relationship between concentration inequalities and maximum bias for depth estimators

This paper establishes a unified framework using concentration inequalities to analyze the statistical convergence and robustness of depth-based estimators, explicitly deriving their maximum bias curves and breakdown points while demonstrating how slight variations in these inequalities reveal distinct robustness behaviors across different depth formulations.

Jorge G. Adrover, Marcelo Ruiz · 2026-03-05 · 🔢 math

Expected Kullback-Leibler-based characterizations of score-driven updates

This paper establishes that score-driven updates are uniquely characterized by their ability to reduce the expected Kullback-Leibler divergence relative to the true data-generating density, providing a rigorous information-theoretic foundation that holds even in non-concave, multivariate, and misspecified settings where alternative performance measures fail.

Ramon de Punder, Timo Dimitriadis, Rutger-Jan Lange · 2026-03-05 · 🔢 math

Uniform error bounds of the ensemble transform Kalman filter for infinite-dimensional dynamics with multiplicative covariance inflation

This paper establishes theoretical uniform-in-time error bounds for the deterministic ensemble transform Kalman filter applied to infinite-dimensional nonlinear dynamical systems, demonstrating that appropriate multiplicative covariance inflation ensures bounded estimation errors and justifying its practical effectiveness.
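
The inflation step the paper analyzes can be illustrated with a minimal sketch (names and values here are hypothetical, not from the paper): multiplicative covariance inflation rescales each member's deviation from the ensemble mean by a factor `alpha > 1`, which multiplies the sample covariance by `alpha**2` and counteracts the covariance underestimation that otherwise destabilizes ensemble Kalman filters.

```python
import numpy as np

def inflate_ensemble(ensemble, alpha):
    """Multiplicative covariance inflation (illustrative sketch).

    ensemble: (m, d) array of m members in d dimensions.
    Scaling deviations from the ensemble mean by alpha inflates
    the sample covariance by a factor alpha**2.
    """
    mean = ensemble.mean(axis=0)
    return mean + alpha * (ensemble - mean)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))          # toy 50-member, 3-dimensional ensemble
X_infl = inflate_ensemble(X, alpha=1.1)

cov = np.cov(X, rowvar=False)
cov_infl = np.cov(X_infl, rowvar=False)
# cov_infl equals alpha**2 * cov, and the ensemble mean is unchanged
```

The ensemble mean is preserved exactly; only the spread grows, which is why inflation can be applied before the analysis step without biasing the state estimate.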

Kota Takeda, Takashi Sakajo · 2026-03-05 · 🔢 math

Beyond Mixtures and Products for Ensemble Aggregation: A Likelihood Perspective on Generalized Means

This paper establishes a principled theoretical framework for density aggregation by demonstrating that normalized generalized means with order r ∈ [0, 1] are the only rules guaranteeing systematic improvements in log-likelihood over individual distributions, thereby providing a unified justification for the widespread use of linear and geometric pooling in Deep Ensembles.
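
The family of rules in question can be sketched concretely (a hypothetical illustration on a discretized grid, not the paper's implementation): the normalized generalized mean of order r averages the member densities raised to the power r, takes the 1/r root, and renormalizes. Setting r = 1 recovers linear pooling (a mixture), while the limit r → 0 recovers geometric pooling (a normalized product of roots).

```python
import numpy as np

def generalized_mean_pool(densities, r, dx=1.0):
    """Normalized generalized (power) mean of member densities.

    densities: (K, N) array of K member densities on a common grid
               with spacing dx.
    r = 1 gives linear pooling (mixture); r = 0 is handled as the
    geometric-mean limit.
    """
    densities = np.asarray(densities, dtype=float)
    if r == 0.0:
        pooled = np.exp(np.mean(np.log(densities), axis=0))  # geometric mean
    else:
        pooled = np.mean(densities ** r, axis=0) ** (1.0 / r)
    return pooled / (pooled.sum() * dx)  # renormalize to integrate to 1

# Two toy Gaussian member densities on a shared grid
grid = np.linspace(-5, 5, 1001)
dx = grid[1] - grid[0]
p1 = np.exp(-0.5 * (grid - 1) ** 2) / np.sqrt(2 * np.pi)
p2 = np.exp(-0.5 * (grid + 1) ** 2) / np.sqrt(2 * np.pi)

linear = generalized_mean_pool([p1, p2], r=1.0, dx=dx)     # mixture pooling
geometric = generalized_mean_pool([p1, p2], r=0.0, dx=dx)  # geometric pooling
```

The linear pool keeps both modes of the two members, whereas the geometric pool concentrates mass where the members agree; intermediate r interpolates between these behaviors.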

Raphaël Razafindralambo, Rémy Sun, Frédéric Precioso + 2 more · 2026-03-05 · 🤖 cs.LG

A computational transition for detecting correlated stochastic block models by low-degree polynomials

This paper establishes that low-degree polynomial tests can distinguish between correlated sparse stochastic block models and independent Erdős-Rényi graphs if and only if the subsampling probability exceeds the minimum of Otter's constant and the Kesten-Stigum threshold, thereby identifying a sharp computational transition for detection and partial recovery.

Guanyi Chen, Jian Ding, Shuyang Gong + 1 more · 2026-03-05 · 🤖 cs.LG

Generalization Properties of Score-matching Diffusion Models for Intrinsically Low-dimensional Data

This paper establishes finite-sample convergence guarantees for score-based diffusion models learning intrinsically low-dimensional distributions, demonstrating that their generalization error scales with the data's intrinsic (p, q)-Wasserstein dimension rather than the ambient dimension, thereby mitigating the curse of dimensionality without requiring restrictive assumptions like compact support or smooth densities.

Saptarshi Chakraborty, Quentin Berthet, Peter L. Bartlett · 2026-03-05 · 🤖 cs.AI