Geodesic Gradient Descent: A Generic, Learning-Rate-Free Optimizer on Objective-Function-Induced Manifolds
This paper introduces Geodesic Gradient Descent (GGD), a generic, learning-rate-free optimization algorithm. GGD approximates local neighborhoods of the hypersurface induced by the objective function with n-dimensional spheres, so that update trajectories remain on the manifold, and it achieves significant improvements over Adam on both regression and classification tasks.
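To give an intuition for a sphere-based, learning-rate-free update, here is a toy one-dimensional sketch. It is not the paper's GGD algorithm: it simply fits the osculating circle to the graph y = f(x) at the current point (using finite-difference derivatives) and, in a convex region, jumps to the x-coordinate of the circle's center, which sits directly above the circle's lowest point. All names (`osculating_circle_step`, `minimize`) and the fallback behavior are hypothetical choices for illustration.

```python
def osculating_circle_step(f, x, h=1e-5):
    """One learning-rate-free step for a scalar objective f.

    Approximates the graph y = f(x) near x by its osculating circle
    and moves to the x-coordinate of the circle's center. For f'' > 0
    the circle's lowest point lies directly below that center, so this
    is a parameter-free descent step. Toy illustration only; not the
    paper's GGD update.
    """
    # Central finite differences for f'(x) and f''(x).
    fp = (f(x + h) - f(x - h)) / (2.0 * h)
    fpp = (f(x + h) - 2.0 * f(x) + f(x - h)) / (h * h)
    if fpp <= 1e-12:
        # Non-convex or flat curvature: fall back to a plain gradient step.
        return x - fp
    # Center of the osculating circle of y = f(x):
    #   x_c = x - f'(x) * (1 + f'(x)^2) / f''(x)
    return x - fp * (1.0 + fp * fp) / fpp

def minimize(f, x0, steps=30):
    """Iterate the circle-based step from x0; returns the final iterate."""
    x = x0
    for _ in range(steps):
        x = osculating_circle_step(f, x)
    return x
```

Note that when the slope is large, the (1 + f'^2) factor makes the jump aggressive and the iteration can overshoot; the sketch behaves well on shallow, locally convex objectives, e.g. `minimize(lambda x: 0.01 * (x - 3) ** 2, 0.0)` converges to the minimizer at x = 3.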