Analysis Seminar: Efficient approximations of high-dimensional functions on smooth manifolds using deep ReLU neural networks

Speaker: Demetrio Labate, University of Houston

Abstract: The remarkable ability of deep neural networks to approximate multivariate functions, apparently overcoming the curse of dimensionality, is exemplified by their effectiveness on high-dimensional problems where traditional numerical solvers fail. To provide a theoretical framework for this phenomenon, we analyze the approximation of Hölder functions defined on a d-dimensional smooth manifold M embedded in R^D, with d typically much smaller than D, by deep neural networks. Our approach treats neural networks as a class of structured parametric functions. We establish new uniform convergence estimates for the approximation and generalization errors of deep neural networks with ReLU activation functions, showing that, in a precise sense, these estimates depend only on the intrinsic manifold dimension d and not on the ambient dimension D. This is a significant improvement over existing estimates in the literature for a similar setting, where the approximation and generalization errors were found to depend, albeit weakly, on the ambient dimension D.
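
As a point of reference for the flavor of such results (an illustrative benchmark from the approximation-theory literature, not the speaker's precise theorem), for an α-Hölder function f on a compact d-dimensional manifold M embedded in R^D, bounds of the following shape are typical, with N denoting the number of nonzero network weights:

\[
  \inf_{\Phi \in \mathcal{N}_N} \; \sup_{x \in M} \, |f(x) - \Phi(x)|
  \;\le\; C(d, \alpha, M)\, N^{-\alpha/d},
\]

up to possible logarithmic factors in N, where \mathcal{N}_N is shorthand (introduced here only for illustration) for a class of deep ReLU networks with at most N nonzero weights. The point emphasized in the abstract is that neither the rate exponent nor the constant depends on the ambient dimension D.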

Host: Brett Wick