Linear model reduction techniques construct, in an offline stage, low-dimensional subspaces tailored to the approximation of solutions to a parameterized partial differential equation, for the purpose of fast online numerical simulations. These methods, such as Proper Orthogonal Decomposition (POD) or Reduced Basis (RB) methods, are very effective when the family of solutions has fast-decaying Karhunen–Loève eigenvalues or Kolmogorov widths, reflecting its approximability by finite-dimensional linear spaces. Conversely, they become ineffective when these quantities decay slowly, in particular for families of solutions to hyperbolic transport equations with parameter-dependent shock positions. The objective of this work is to explore the ability of nonlinear model reduction to circumvent this situation. To this end, we first describe particular notions of nonlinear widths that decay substantially faster for the aforementioned families. We then discuss a systematic approach for achieving better performance via a nonlinear reconstruction from the first coordinates of a linear reduced model approximation, which allows us to stay within the same “classical” framework of projection-based model reduction. We analyze the approach and report on its performance for a simple yet instructive univariate test case.
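As a purely illustrative complement to the abstract, the short Python sketch below mimics the idea of a nonlinear reconstruction from the first coordinates of a linear reduced model on a toy family of parameter-dependent step functions: a POD basis is extracted from snapshots, and a regressor learns to predict higher-order POD coefficients from the first few. This is not the authors' algorithm; the snapshot family, the dimensions n and N, and the choice of a random-forest regressor are assumptions made only for the sake of the example.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Snapshots: step functions u(x; mu) = 1_{x >= mu} on a uniform grid,
# a toy family with slowly decaying Kolmogorov widths.
x = np.linspace(0.0, 1.0, 200)
mus_train = np.random.default_rng(0).uniform(0.2, 0.8, 300)
snapshots = np.array([(x >= mu).astype(float) for mu in mus_train])   # shape (n_snapshots, n_grid)
mean = snapshots.mean(axis=0)

# POD basis via SVD of the centered snapshot matrix.
_, _, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
n, N = 3, 30                       # n observed coordinates, N retained modes (illustrative sizes)
modes = Vt[:N]                     # rows are POD modes

# Nonlinear step: learn a map from the first n POD coefficients to the next N - n.
coeffs = (snapshots - mean) @ modes.T                     # exact POD coefficients of the snapshots
reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(coeffs[:, :n], coeffs[:, n:])

# Reconstruction at an unseen parameter from n linear measurements only.
u_test = (x >= 0.53).astype(float)
c_first = (u_test - mean) @ modes[:n].T                   # first n coordinates (the "compressive" data)
c_rest = reg.predict(c_first.reshape(1, -1)).ravel()      # predicted tail coefficients
u_lin = mean + c_first @ modes[:n]                        # classical n-term linear approximation
u_rec = mean + np.concatenate([c_first, c_rest]) @ modes  # nonlinear N-mode reconstruction
print("n-term linear error     :", np.linalg.norm(u_test - u_lin))
print("nonlinear reconstruction:", np.linalg.norm(u_test - u_rec))

The intent is only to show the data flow (n linear measurements in, an N-mode reconstruction out), not to reproduce the performance reported in the paper.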
Keywords: nonlinear reduced basis, compressed sensing, solution manifold, machine learning, $m$-width
Albert Cohen; Charbel Farhat; Yvon Maday; Agustin Somacal
Albert Cohen; Charbel Farhat; Yvon Maday; Agustin Somacal. Nonlinear compressive reduced basis approximation for PDE’s. Comptes Rendus. Mécanique, Volume 351 (2023) no. S1, pp. 357-374. doi : 10.5802/crmeca.191. https://comptes-rendus.academie-sciences.fr/mecanique/articles/10.5802/crmeca.191/