Polynomial, trigonometric, and tropical activations

This paper introduces a family of polynomial, trigonometric, and tropical activation functions derived from orthonormal bases. Combined with variance-preserving initialization, these activations enable stable training of deep models such as GPT-2 and ConvNeXt, mitigate gradient instability, offer polynomial interpretability, and support fine-tuning via Hermite interpolation.

Ismail Khalfaoui-Hassani, Stefan Kesselheim · 2026-03-03 · cs.CL
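The orthonormal-basis idea behind the first paper can be illustrated with a small sketch (hypothetical code, not the authors' implementation): an activation expressed as a series in probabilists' Hermite polynomials, normalized by √(n!) so each basis term has unit variance under a standard-normal input, which is what makes variance-preserving initialization tractable.

```python
import numpy as np
from math import factorial

def hermite_activation(x, coeffs):
    """Hypothetical sketch: activation as a truncated series in
    probabilists' Hermite polynomials He_n, each divided by sqrt(n!)
    so the basis is orthonormal under a standard Gaussian input."""
    # Recurrence: He_0 = 1, He_1 = x, He_{n+1} = x*He_n - n*He_{n-1}
    he_prev = np.ones_like(x)   # He_0
    he = x.copy()               # He_1
    out = coeffs[0] * he_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * he
    for n in range(1, len(coeffs) - 1):
        he_prev, he = he, x * he - n * he_prev   # he is now He_{n+1}
        out = out + coeffs[n + 1] * he / np.sqrt(factorial(n + 1))
    return out
```

With `coeffs = [0, 1]` this reduces to the identity map; higher-order coefficients add curvature one orthonormal degree at a time, which is also what makes the learned activation readable as a polynomial.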

SPEED: Scalable, Precise, and Efficient Concept Erasure for Diffusion Models

The paper introduces SPEED, an efficient concept-erasure framework for diffusion models that directly edits parameters within a null space, enhanced by influence-based filtering, directed prior augmentation, and invariant equality constraints, to achieve scalable, precise removal of multiple concepts in seconds while preserving the quality of non-target generations.

Ouxiang Li, Yuan Wang, Xinting Hu + 3 more · 2026-03-03 · cs
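The null-space mechanism the SPEED summary mentions can be sketched in a few lines (a hypothetical illustration, not the paper's method): build a projector onto the orthogonal complement of the preserved-concept keys, so any weight edit applied through that projector leaves the model's outputs on preserved concepts exactly unchanged.

```python
import numpy as np

def null_space_projector(K_preserve):
    """Hypothetical sketch: projector P onto the null space of the
    preserved keys (columns of K_preserve). An edit W += dW @ P then
    satisfies (W + dW @ P) @ k = W @ k for every preserved key k."""
    # Orthonormal basis of the preserved keys' column space via SVD
    U, S, _ = np.linalg.svd(K_preserve, full_matrices=True)
    rank = int(np.sum(S > 1e-10))
    U_r = U[:, :rank]
    # Project onto the orthogonal complement of that column space
    return np.eye(K_preserve.shape[0]) - U_r @ U_r.T
```

Because `P @ k = 0` for every preserved key, an arbitrary erasure update can be made "safe" for non-target generations simply by right-multiplying it with `P`, which is what makes closed-form, seconds-scale editing plausible.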

A Multi-Objective Evaluation Framework for Analyzing Utility-Fairness Trade-Offs in Machine Learning Systems

This paper introduces a model-agnostic, multi-objective evaluation framework that uses radar charts and a comprehensive set of metrics to systematically analyze and visualize utility-fairness trade-offs in machine learning systems, with a specific focus on mitigating disparities in medical imaging applications.

Gökhan Özbulak, Oscar Jimenez-del-Toro, Maíra Fatoretto + 2 more · 2026-03-03 · cs.LG
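The utility-fairness axes such a radar chart would plot can be sketched with two standard metrics (an illustrative choice, not necessarily the paper's metric set): accuracy for utility, and one minus the demographic-parity gap for fairness, both scaled to [0, 1] so that larger is better on every axis.

```python
import numpy as np

def tradeoff_axes(y_true, y_pred, group):
    """Hypothetical sketch: per-axis scores for a utility-fairness
    radar chart. Utility = accuracy; fairness = 1 minus the
    demographic-parity gap (max difference in positive rates
    across groups). Both lie in [0, 1], larger is better."""
    y_true, y_pred, group = map(np.asarray, (y_true, y_pred, group))
    accuracy = float(np.mean(y_true == y_pred))
    # Positive-prediction rate per demographic group
    rates = [float(np.mean(y_pred[group == g])) for g in np.unique(group)]
    dp_gap = max(rates) - min(rates)
    return {"accuracy": accuracy, "fairness": 1.0 - dp_gap}
```

Putting each metric on its own normalized axis, rather than collapsing them into one scalar, is what lets a radar chart show where one model trades utility for fairness relative to another.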