Fast Equivariant Imaging: Acceleration for Unsupervised Learning via Augmented Lagrangian and Auxiliary PnP Denoisers

This paper introduces Fast Equivariant Imaging (FEI), a novel unsupervised learning framework that leverages the Augmented Lagrangian method and auxiliary Plug-and-Play denoisers to achieve a 10x training acceleration and improved generalization for deep imaging tasks like X-ray CT reconstruction and inpainting without requiring ground-truth data.

Guixian Xu, Jinglai Li, Junqi Tang · 2026-03-05 · cs.LG
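The equivariant-imaging objective underlying this line of work can be sketched in a few lines: a reconstructor is trained without ground truth by combining a measurement-consistency term with an equivariance term under a transform group. Everything below is an illustrative toy, not the paper's FEI algorithm: the "reconstructor" is naive mean-fill inpainting, the transform group is circular shifts, and no augmented Lagrangian or PnP denoiser appears.

```python
import numpy as np

def reconstruct(y, mask):
    """Toy inpainting 'network': fill masked entries with the mean of observed ones."""
    x = y.copy()
    x[mask == 0] = y[mask == 1].mean()
    return x

def ei_losses(y, mask, shift=3):
    """The two unsupervised equivariant-imaging loss terms for y = A x,
    with A a binary inpainting mask and T a circular shift (np.roll)."""
    x1 = reconstruct(y, mask)
    # measurement consistency: A x1 should reproduce the observed data y
    mc = np.mean((mask * x1 - y) ** 2)
    # equivariance: reconstructing measurements of T x1 should return T x1
    tx1 = np.roll(x1, shift)
    x2 = reconstruct(mask * tx1, mask)
    eq = np.mean((x2 - tx1) ** 2)
    return mc, eq

rng = np.random.default_rng(0)
x_true = np.sin(np.linspace(0, 4 * np.pi, 64))   # signal never seen by training
mask = (rng.random(64) < 0.7).astype(float)      # observe ~70% of samples
y = mask * x_true
mc, eq = ei_losses(y, mask)
```

In a real training loop both terms would be minimized jointly over the reconstructor's parameters; here the toy mean-fill reconstructor makes the consistency term vanish by construction while the equivariance term stays positive.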

Implicit Bias of Per-sample Adam on Separable Data: Departure from the Full-batch Regime

This paper demonstrates that the implicit bias of per-sample Adam on separable data can deviate from the full-batch \ell_\infty-max-margin behavior, potentially converging to the \ell_2-max-margin classifier or a data-adaptive Mahalanobis-norm margin depending on the dataset, whereas Signum consistently converges to the \ell_\infty-max-margin regardless of batch size.

Beomhan Baek, Minhak Song, Chulhee Yun · 2026-03-05 · cs.AI

A stochastic optimization algorithm for revenue maximization in a service system with balking customers

This paper proposes a stochastic gradient descent algorithm that dynamically maximizes revenue in a single-server queue with balking customers by using a novel Infinitesimal Perturbation Analysis procedure to estimate effective arrival rates based solely on observable joining behavior, thereby converging to the optimal price under mild regularity conditions.

Shreehari Anand Bodas, Harsha Honnappa, Michel Mandjes + 1 more · 2026-03-05 · math
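The flavor of the approach can be shown on a toy pricing problem (this is not the paper's IPA estimator): if customer valuations are Uniform(0,1) and a customer joins only when their valuation exceeds the posted price p, expected revenue per arrival is p(1-p), maximized at p = 0.5. A finite-difference stochastic gradient ascent that sees only observed join/balk decisions recovers this price; all constants below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

def observed_revenue(p, n=400):
    """Average revenue per arrival at price p, computed from observable joining
    behavior only: a customer with a Uniform(0,1) valuation joins iff valuation > p."""
    joins = rng.random(n) > p
    return p * joins.mean()

p = 0.9                      # deliberately bad initial price
delta = 0.05                 # finite-difference half-width
for t in range(1, 501):
    # two-point stochastic gradient estimate of d/dp [p * (1 - p)]
    g = (observed_revenue(p + delta) - observed_revenue(p - delta)) / (2 * delta)
    p = float(np.clip(p + 0.5 / np.sqrt(t) * g, 0.01, 0.99))
```

The diminishing step size 0.5/sqrt(t) is the standard Robbins-Monro choice; the paper's contribution is an IPA gradient estimator for the queueing dynamics, which this two-point scheme merely stands in for.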

Stochastic Optimization for Resource Adequacy in Capacity Markets with Storage and Renewables

This paper proposes and validates a computationally tractable two-stage stochastic optimization framework for capacity procurement that integrates storage and renewables by modeling temporally correlated uncertainties, demonstrating that stochastic decomposition can efficiently solve large-scale problems while achieving faster convergence for optimization than for precise reliability metric estimation.

Baptiste Rabecq, Andy Sun, Feng Zhao + 3 more · 2026-03-05 · math
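The core modeling pattern, a two-stage stochastic program with a first-stage capacity decision and per-scenario recourse, can be written out in miniature. All numbers below (costs, scenario demands, probabilities) are invented for illustration; real capacity-market models are vastly larger and are solved by decomposition rather than by enumerating candidate solutions as done here.

```python
import numpy as np

cap_cost = 1.0                       # first stage: cost per MW of procured capacity
short_cost = 3.0                     # second stage: penalty per MW of shortfall
demand = np.array([2.0, 5.0, 8.0])   # scenario peak demands (illustrative)
prob = np.array([0.3, 0.5, 0.2])     # scenario probabilities

def total_cost(x):
    """First-stage cost plus expected optimal recourse cost at capacity x."""
    shortfall = np.maximum(demand - x, 0.0)   # optimal per-scenario recourse y_s
    return cap_cost * x + short_cost * (prob @ shortfall)

# the objective is piecewise linear in x, so an optimum lies at a scenario demand
x_star = min(demand, key=total_cost)
```

Here procuring 5 MW is optimal: covering the 8 MW scenario costs 1.0/MW for capacity that pays off only with probability 0.2 at 3.0/MW, which does not break even.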

Negative Curvature Methods with High-Probability Complexity Guarantees for Stochastic Nonconvex Optimization

This paper proposes a two-step stochastic optimization framework that combines gradient and negative curvature steps with adaptive step sizes and early stopping to achieve high-probability convergence to second-order stationary points, offering complexity guarantees that match deterministic rates up to noise-dependent terms.

Albert S. Berahas, Raghu Bollapragada, Wanping Dong · 2026-03-05 · math
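The two-step pattern (gradient step when the gradient is informative, negative curvature step otherwise) can be sketched deterministically on a toy saddle. This is a minimal illustration, not the paper's stochastic method: it uses exact gradients and Hessians, a fixed step size, and a full eigendecomposition where a practical method would use Hessian-vector products.

```python
import numpy as np

def f(x):
    """Toy nonconvex objective: saddle point at the origin, minima at x2 = +/-1."""
    return x[0] ** 2 - x[1] ** 2 + 0.5 * x[1] ** 4

def grad(x):
    return np.array([2 * x[0], -2 * x[1] + 2 * x[1] ** 3])

def hess(x):
    return np.array([[2.0, 0.0], [0.0, -2.0 + 6 * x[1] ** 2]])

def two_step(x, iters=300, alpha=0.1, eps=1e-4):
    """Gradient step while the gradient is large; otherwise a step along the
    Hessian's most negative eigenvector, stopping at an approximate
    second-order stationary point."""
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) > eps:
            x = x - alpha * g
        else:
            lam, V = np.linalg.eigh(hess(x))      # eigenvalues in ascending order
            if lam[0] < -eps:
                d = V[:, 0]
                # choose the sign of the curvature direction that lowers f
                x = min(x + alpha * d, x - alpha * d, key=f)
            else:
                break
    return x

# start on the saddle's stable manifold, where pure gradient descent stalls at (0, 0)
x = two_step(np.array([0.5, 0.0]))
```

Gradient descent from this start converges to the saddle (0, 0); the negative curvature step is what pushes the iterate off it toward a genuine minimum at x2 = +/-1.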