Some Super-approximation Rates of ReLU Neural Networks for Korobov Functions
This paper establishes nearly optimal super-approximation error bounds for ReLU neural networks approximating Korobov functions, measured in two respective norms, by leveraging sparse grid finite elements and the bit-extraction technique, thereby demonstrating that neural network expressivity effectively overcomes the curse of dimensionality.
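As a small illustration of the sparse-grid connection (a sketch, not taken from the paper): the 1D nodal hat function, the building block of sparse grid finite element bases, is exactly representable by a one-hidden-layer ReLU network with three neurons. The names `relu`, `hat`, and the grid parameters below are illustrative, not the paper's notation.

```python
def relu(t: float) -> float:
    """Rectified linear unit, the network's activation."""
    return max(t, 0.0)

def hat(x: float, c: float = 0.5, h: float = 0.5) -> float:
    """1D nodal hat function centered at c with support width 2h,
    written exactly as a three-neuron ReLU combination:
    hat(x) = (ReLU(x - (c - h)) - 2*ReLU(x - c) + ReLU(x - (c + h))) / h
    """
    return (relu(x - (c - h)) - 2.0 * relu(x - c) + relu(x - (c + h))) / h

# The hat peaks at 1 at its center and vanishes at the support endpoints.
print(hat(0.5))   # 1.0
print(hat(0.25))  # 0.5
print(hat(0.0))   # 0.0
```

Because tensor products of such hats span sparse grid spaces, exact ReLU representations of these basis functions are one ingredient behind finite-element-based approximation constructions like the one summarized above.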