Comptes Rendus
Nonlinear compressive reduced basis approximation for PDE’s
Comptes Rendus. Mécanique, Volume 351 (2023) no. S1, pp. 357-374.

Linear model reduction techniques design offline low-dimensional subspaces that are tailored to the approximation of solutions to a parameterized partial differential equation, for the purpose of fast online numerical simulations. These methods, such as the Proper Orthogonal Decomposition (POD) or Reduced Basis (RB) methods, are very effective when the family of solutions has fast-decaying Karhunen–Loève eigenvalues or Kolmogorov widths, reflecting the approximability by finite-dimensional linear spaces. On the other hand, they become ineffective when these quantities have a slow decay, in particular for families of solutions to hyperbolic transport equations with parameter-dependent shock positions. The objective of this work is to explore the ability of nonlinear model reduction to circumvent this particular situation. To this end, we first describe particular notions of nonlinear widths that have a substantially faster decay for the aforementioned families. Then, we discuss a systematic approach for achieving better performance via a nonlinear reconstruction from the first coordinates of a linear reduced model approximation, thus allowing us to stay in the same “classical” framework of projection-based model reduction. We analyze the approach and report on its performance for a simple and yet instructive univariate test case.
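To make the approach concrete, the sketch below mirrors the workflow described above on a toy version of the univariate test case: a family of step functions whose jump location depends on the parameter. A POD basis is computed offline, and a regressor then learns the nonlinear map from the first m reduced coordinates to a richer M-term expansion, so that the online stage still manipulates only m linear coordinates. The step-function snapshots, the choice of scikit-learn's ExtraTreesRegressor, and the dimensions m and M are illustrative assumptions, not the authors' actual setup.

# Minimal sketch (not the authors' implementation): nonlinear reconstruction
# from the first m coordinates of a linear (POD) reduced approximation,
# illustrated on a family of step functions with a parameter-dependent jump.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 500)                 # spatial grid
mu_train = rng.uniform(0.1, 0.9, 400)          # training jump positions
mu_test = rng.uniform(0.1, 0.9, 100)           # held-out jump positions

def snapshot(mu):
    # u(x; mu) = 1 for x > mu: a transported discontinuity, the canonical
    # example of slowly decaying Kolmogorov widths
    return (x > mu).astype(float)

U_train = np.stack([snapshot(p) for p in mu_train], axis=1)    # (n_x, 400)
U_test = np.stack([snapshot(p) for p in mu_test], axis=1)      # (n_x, 100)

# Offline: POD basis from the training snapshots (singular values decay slowly)
V, sing_vals, _ = np.linalg.svd(U_train, full_matrices=False)
m, M = 3, 60                                   # m online coordinates, M-term target
Vm, VM = V[:, :m], V[:, :M]

a_m_train, a_m_test = Vm.T @ U_train, Vm.T @ U_test   # first m coordinates
a_M_train = VM.T @ U_train                            # richer M-term coordinates

# Offline: learn the nonlinear map (first m coordinates) -> (M coordinates)
reg = ExtraTreesRegressor(n_estimators=200, random_state=0)
reg.fit(a_m_train.T, a_M_train.T)

# Online: linear m-term truncation vs. nonlinear compressive reconstruction
U_lin = Vm @ a_m_test
U_nl = VM @ reg.predict(a_m_test.T).T

rel_err = lambda U_hat: np.mean(np.linalg.norm(U_hat - U_test, axis=0)
                                / np.linalg.norm(U_test, axis=0))
print(f"linear POD truncation, m = {m}:          {rel_err(U_lin):.2e}")
print(f"nonlinear reconstruction, m = {m}, M = {M}: {rel_err(U_nl):.2e}")

In this toy setting the singular values of the snapshot matrix decay only algebraically, so the purely linear m-term truncation smears the moving front, while the learned reconstruction can recover much of the accuracy of the M-term expansion from the same m online coordinates.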

DOI: 10.5802/crmeca.191
Keywords: nonlinear reduced basis, compressed sensing, solution manifold, machine learning, $m$-width

Albert Cohen 1; Charbel Farhat 2, 3, 4; Yvon Maday 1; Agustin Somacal 1

1 Sorbonne Université, CNRS, Université Paris Cité, Laboratoire Jacques-Louis Lions (LJLL), F-75005 Paris, France
2 Department of Mechanical Engineering, Stanford University, Stanford, CA 94305, USA
3 Department of Aeronautics and Astronautics, Stanford University, Stanford, CA 94305, USA
4 Institute for Computational and Mathematical Engineering, Stanford University, Stanford, CA 94305, USA
License: CC-BY 4.0
Copyright: The authors retain unrestricted copyright and publishing rights
Albert Cohen; Charbel Farhat; Yvon Maday; Agustin Somacal. Nonlinear compressive reduced basis approximation for PDE’s. Comptes Rendus. Mécanique, Volume 351 (2023) no. S1, pp. 357-374. doi: 10.5802/crmeca.191. https://comptes-rendus.academie-sciences.fr/mecanique/articles/10.5802/crmeca.191/

