Session III.2 - Approximation Theory
Monday, June 19, 15:30 ~ 16:00
Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces
Jonathan Siegel
Texas A&M University, United States
Deep ReLU neural networks are among the most widely used classes of neural networks in practical applications. We consider the problem of determining optimal $L_p$-approximation rates for deep ReLU neural networks on the Sobolev class $W^s(L_q)$ for all $1\leq p,q\leq \infty$ and $s > 0$. Existing sharp results are only available when $q=\infty$, i.e., when the derivatives are measured in $L_\infty$. In our work, we extend these results and determine the best possible rates for all $p$, $q$, and $s$ for which a compact Sobolev embedding holds, i.e., when $s/d > 1/q - 1/p$. In particular, this settles the classical nonlinear regime where $p > q$. Our techniques can also be used to obtain optimal rates for Besov spaces. We will discuss some of the technical details of the proof and conclude by giving a few open research directions.