The Curse and Blessing of Mean Bias in FP4-Quantized LLM Training
This paper identifies a coherent rank-one mean bias as the primary cause of numerical instability in low-bit LLM training. It demonstrates that simply subtracting this mean before quantization restores stability and performance in FP4 training, offering a hardware-efficient alternative to more complex spectral methods.
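The core idea can be illustrated with a small numerical sketch. The code below is an assumption-laden toy, not the paper's implementation: it simulates per-tensor FP4 (assuming the E2M1 format with absmax scaling) and compares plain quantization against quantizing after subtracting the per-row mean, which removes a rank-one component and re-adds it in full precision afterwards. On data with a large coherent offset, mean centering leaves a better-conditioned residual for the coarse FP4 grid.

```python
import numpy as np

# Representable magnitudes of the FP4 E2M1 format (an assumption about
# the target format; the paper's exact quantizer may differ).
FP4_POS = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])
FP4_GRID = np.concatenate([-FP4_POS[::-1], FP4_POS])

def fp4_quantize(x):
    """Simulated per-tensor FP4 quantization with absmax scaling."""
    scale = np.abs(x).max() / 6.0  # map the largest value to the grid edge
    if scale == 0:
        return x.copy()
    # Round each scaled entry to the nearest FP4 grid point.
    idx = np.abs(x[..., None] / scale - FP4_GRID).argmin(axis=-1)
    return FP4_GRID[idx] * scale

def fp4_quantize_mean_centered(x):
    """Subtract the per-row mean (a rank-one component) before
    quantizing, then add it back in full precision."""
    mu = x.mean(axis=-1, keepdims=True)
    return fp4_quantize(x - mu) + mu

rng = np.random.default_rng(0)
# Toy matrix with a large coherent mean offset (illustrative data,
# not taken from the paper).
x = rng.normal(loc=5.0, scale=1.0, size=(64, 256))

err_plain = np.abs(fp4_quantize(x) - x).mean()
err_centered = np.abs(fp4_quantize_mean_centered(x) - x).mean()
print(f"plain FP4 error:         {err_plain:.4f}")
print(f"mean-centered FP4 error: {err_centered:.4f}")
```

In this sketch the mean is carried separately as a rank-one correction, which is cheap relative to spectral decompositions: one vector per matrix rather than a full SVD.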