FedEMA-Distill: Exponential Moving Average Guided Knowledge Distillation for Robust Federated Learning
FedEMA-Distill is a robust, communication-efficient federated learning framework. It combines server-side exponential moving average (EMA) smoothing of the global model with ensemble knowledge distillation over compressed client logits, yielding higher accuracy, faster convergence, and Byzantine resilience under non-IID data, all without requiring client-side software modifications.
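The two server-side ingredients named above, EMA smoothing of the aggregated model and distillation targets built from an ensemble of client logits, can be sketched as follows. This is a minimal illustration, not the paper's implementation; the function names (`ema_update`, `ensemble_soft_labels`), the decay and temperature values, and the plain mean over client logits are all assumptions for the sake of the example.

```python
import numpy as np

def ema_update(global_params, round_params, decay=0.9):
    """Server-side EMA: blend the previous global parameters with the
    freshly aggregated round parameters (hypothetical update rule)."""
    return {k: decay * global_params[k] + (1.0 - decay) * round_params[k]
            for k in global_params}

def ensemble_soft_labels(client_logits, temperature=2.0):
    """Build soft distillation targets by averaging (decompressed)
    client logits and applying a temperature-scaled softmax."""
    avg = np.mean(np.stack(client_logits), axis=0)
    z = avg / temperature
    z -= z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Example round: two clients report logits for one sample.
targets = ensemble_soft_labels([np.array([[1.0, 2.0]]),
                                np.array([[3.0, 0.0]])])
smoothed = ema_update({"w": 0.0}, {"w": 1.0}, decay=0.9)
```

A temperature above 1 softens the ensemble distribution so the distillation loss carries more information about relative class similarity than hard labels would; the EMA decay controls how aggressively noisy or adversarial per-round updates are damped.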