
Session III.5 - Information-Based Complexity

Tuesday, June 20, 16:30 ~ 17:00

The curse of dimensionality for the $L_p$-discrepancy with finite $p$

Erich Novak

FSU Jena, Germany

The $L_p$-discrepancy is closely related to the worst-case error of algorithms for numerical integration. Its inverse for dimension $d$ and error threshold $\varepsilon \in (0,1)$ is the minimal number of points in $[0,1]^d$ required so that the normalized $L_p$-discrepancy is less than or equal to $\varepsilon$.
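For concreteness, the quantities above can be written out as follows (a sketch of the usual convention; the notation $N_p(\varepsilon,d)$ for the inverse and the normalization by the initial discrepancy $(p+1)^{-d/p}$ are standard but not spelled out in the abstract). For a point set $\mathcal{P}=\{x_1,\dots,x_N\}\subset[0,1)^d$,

$$ L_p(\mathcal{P}) = \left( \int_{[0,1]^d} \left| \frac{1}{N}\sum_{i=1}^{N} \mathbf{1}_{[0,t)}(x_i) - \prod_{j=1}^{d} t_j \right|^p \,\mathrm{d}t \right)^{1/p}, $$

and the inverse is

$$ N_p(\varepsilon,d) = \min\left\{ N : \exists\, \mathcal{P}\subset[0,1]^d,\ |\mathcal{P}|=N,\ \frac{L_p(\mathcal{P})}{(p+1)^{-d/p}} \le \varepsilon \right\}, $$

where $(p+1)^{-d/p}$ is the initial discrepancy (the value for $N=0$) used for the normalization. The curse of dimensionality means that $N_p(\varepsilon,d)$ grows exponentially in $d$.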

The inverse of the $L_2$-discrepancy grows exponentially fast with the dimension $d$, i.e., we have the curse of dimensionality, whereas the inverse of the $L_{\infty}$-discrepancy depends only linearly on $d$; both results are from 2001. The behavior of the inverse of the $L_p$-discrepancy for $p \not\in \{2,\infty\}$ was open. We show that the $L_p$-discrepancy suffers from the curse of dimensionality for all $p$ of the form $p=2 \ell/(2 \ell -1)$ with $\ell \in \mathbb{N}$. We prove lower bounds for the worst-case error of numerical integration in an anchored Sobolev space of functions that are once differentiable in each variable and whose first derivative has finite $L_q$-norm, where $q$ is an even positive integer with $1/p+1/q=1$.
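To make the admissible exponents concrete (the explicit list below is only an illustration derived from the stated formula, not part of the original abstract): for $\ell \in \mathbb{N}$,

$$ p = \frac{2\ell}{2\ell-1}, \qquad q = 2\ell, \qquad \frac{1}{p} + \frac{1}{q} = 1, $$

so $\ell = 1, 2, 3, \dots$ gives $p = 2,\ \tfrac{4}{3},\ \tfrac{6}{5},\ \dots$ with conjugate exponents $q = 2,\ 4,\ 6,\ \dots$; in particular, $\ell = 1$ recovers the known $L_2$ case, and $p \to 1$ as $\ell \to \infty$.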

Conjecture: The curse holds for all $p$ with $1 < p < \infty$.

Joint work with Friedrich Pillichshammer (JKU Linz, Austria).
