Comptes Rendus. Mathématique

Numerical analysis
A functional equation with polynomial solutions and application to Neural Networks
Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072.

We construct and discuss a functional equation with a contraction property. Its solutions are real univariate polynomials. The series solving the natural fixed-point iterations have an immediate interpretation in terms of Neural Networks with recursive properties and controlled accuracy.
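The controlled accuracy mentioned in the abstract is the standard consequence of the Banach fixed-point theorem: for a contraction with constant L < 1, the iterates converge geometrically and the a posteriori bound L/(1-L) · ||p_{k+1} - p_k|| controls the distance to the fixed point. The following minimal Python/NumPy sketch illustrates such an iteration on a vector of polynomial coefficients; the operator T, the constant L, and the toy contraction are hypothetical placeholders chosen for illustration, not the specific functional equation studied in the paper.

import numpy as np

def fixed_point(T, p0, L, tol=1e-12, max_iter=1000):
    """Iterate p_{k+1} = T(p_k); stop when the contraction bound
    L/(1-L) * ||p_{k+1} - p_k|| guarantees the error is below tol."""
    p = np.asarray(p0, dtype=float)
    for k in range(max_iter):
        p_next = T(p)
        gap = np.linalg.norm(p_next - p)
        p = p_next
        if L / (1.0 - L) * gap < tol:  # a posteriori error bound for contractions
            return p, k + 1
    return p, max_iter

# Toy contraction on a vector of polynomial coefficients (hypothetical example):
# T(p) = 0.5*p + c is a contraction with L = 0.5 and unique fixed point 2*c.
c = np.array([1.0, -2.0, 0.5])
p_star, iters = fixed_point(lambda p: 0.5 * p + c, np.zeros_like(c), L=0.5)
print(p_star, iters)  # approximately [2.0, -4.0, 1.0]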

DOI: https://doi.org/10.5802/crmath.124
Classification: 65Q20, 65Y99, 78M32
@article{CRMATH_2020__358_9-10_1059_0,
     author = {Bruno Despr\'es and Matthieu Ancellin},
     title = {A functional equation with polynomial solutions and application to Neural Networks},
     journal = {Comptes Rendus. Math\'ematique},
     pages = {1059--1072},
     publisher = {Acad\'emie des sciences, Paris},
     volume = {358},
     number = {9-10},
     year = {2020},
     doi = {10.5802/crmath.124},
     language = {en},
}
Bruno Després; Matthieu Ancellin. A functional equation with polynomial solutions and application to Neural Networks. Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072. doi: 10.5802/crmath.124. https://comptes-rendus.academie-sciences.fr/mathematique/item/CRMATH_2020__358_9-10_1059_0/

[1] Alain Bensoussan; Yiqun Li; Dinh Phan Cao Nguyen; Minh-Binh Tran; Sheung Chi Phillip Yam; Xiang Zhou Machine Learning and Control Theory (https://arxiv.org/abs/2006.05604)

[2] Mikael Bodén A Guide to Recurrent Neural Networks and Backpropagation, 2001 (published in Dallas project, SICS technical report)

[3] Philippe G. Ciarlet Linear and nonlinear functional analysis with applications, Other Titles in Applied Mathematics, Volume 130, Society for Industrial and Applied Mathematics (SIAM), 2013 | Zbl 1293.46001

[4] George Cybenko Approximation by superpositions of a sigmoidal function, Math. Control Signals Systems, Volume 2 (1989) no. 4, pp. 303-314 | Article | MR 1015670 | Zbl 0679.94019

[5] Ingrid C. Daubechies; Ronald A. DeVore; Simon Foucart; Boris L. Hanin; Guergana Petrova Nonlinear Approximation and (Deep) ReLU Networks (https://arxiv.org/abs/1905.02199v1)

[6] Bruno Després Machine Learning, adaptive numerical approximation and VOF methods, 2020 (colloquium LJLL/Sorbonne University, https://www.youtube.com/watch?v=OPKFYe01hH4)

[7] Bruno Després; Hervé Jourdren Machine learning design of volume of fluid schemes for compressible flows, J. Comput. Phys., Volume 408 (2020), 109275 | Article | MR 4062318

[8] Ian Goodfellow; Yoshua Bengio; Aaron Courville Deep Learning, Adaptive Computation and Machine Learning, MIT Press, 2016 | Zbl 1373.68009

[9] Masayoshi Hata; Masaya Yamaguti Weierstrass’s function and chaos, Hokkaido Math. J., Volume 12 (1983) no. 3, pp. 333-342 | MR 719972 | Zbl 0522.26006

[10] Masayoshi Hata; Masaya Yamaguti The Takagi Function and Its Generalization, Japan J. Appl. Math., Volume 1 (1984) no. 1, pp. 183-199 | Article | MR 839313 | Zbl 0604.26004

[11] Juncai He; Lin Li; Jinchao Xu; Chunyue Zheng ReLU Deep Neural Networks and Linear Finite Elements, J. Comput. Math., Volume 38 (2020) no. 3, pp. 502-527

[12] Bo Li; Shanshan Tang; Haijun Yu Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units, Commun. Comput. Phys., Volume 27 (2019) no. 2, pp. 379-411 | MR 4040947

[13] Jianfeng Lu; Zuowei Shen; Haizhao Yang; Shijun Zhang Deep Network Approximation for Smooth Functions, 2020 (https://blog.nus.edu.sg/matzuows/publications/)

[14] Patrick K. Mogensen; Asbjørn N. Riseth Optim: A mathematical optimization package for Julia, J. Open Source Softw., Volume 3 (2018) no. 24, 615 | Article

[15] Joost A. A. Opschoor; Philipp C. Petersen; Christoph Schwab Deep ReLU networks and high-order finite element methods, Anal. Appl. (Singap.), Volume 18 (2020) no. 5, pp. 715-770 | Article | MR 4131037 | Zbl 07272155

[16] Jarrett Revels; Miles Lubin; Theodore Papamarkou Forward-Mode Automatic Differentiation in Julia (2016) (https://arxiv.org/abs/1607.07892)

[17] Dmitry Yarotsky Error bounds for approximations with deep ReLU networks, Neural Netw., Volume 97 (2017), pp. 103-114 | Article | Zbl 1429.68260