Report
The Parametric Complexity of Operator Learning
| Field | Value |
|---|---|
| Title | The Parametric Complexity of Operator Learning |
| Authors | Lanthaler, Samuel; Stuart, Andrew M. |
| Publication Year | 2023 |
| Collection | Computer Science; Mathematics |
| Subject Terms | Computer Science - Machine Learning; Mathematics - Numerical Analysis |
| Description | Neural operator architectures employ neural networks to approximate operators mapping between Banach spaces of functions; they may be used to accelerate model evaluations via emulation, or to discover models from data. Consequently, the methodology has received increasing attention over recent years, giving rise to the rapidly growing field of operator learning. The first contribution of this paper is to prove that for general classes of operators characterized only by their $C^r$- or Lipschitz-regularity, operator learning suffers from a "curse of parametric complexity", an infinite-dimensional analogue of the well-known curse of dimensionality encountered in high-dimensional approximation problems. The result applies to a wide variety of existing neural operators, including PCA-Net, DeepONet and the FNO. The second contribution of the paper is to prove that this general curse can be overcome for solution operators defined by the Hamilton-Jacobi equation; this is achieved by leveraging additional structure in the underlying solution operator, going beyond regularity. To this end, a novel neural operator architecture is introduced, termed HJ-Net, which explicitly takes into account characteristic information of the underlying Hamiltonian system. Error and complexity estimates are derived for HJ-Net which show that this architecture can provably beat the curse of parametric complexity related to the infinite-dimensional input and output function spaces. |
| Document Type | Working Paper |
| Access URL | http://arxiv.org/abs/2306.15924 |
| Accession Number | edsarx.2306.15924 |
| Database | arXiv |