We construct and discuss a functional equation with a contraction property. Its solutions are real univariate polynomials. The series solving the natural fixed-point iterations have an immediate interpretation in terms of Neural Networks with recursive properties and controlled accuracy.
Bruno Després; Matthieu Ancellin
Bruno Després; Matthieu Ancellin. A functional equation with polynomial solutions and application to Neural Networks. Comptes Rendus. Mathématique, Volume 358 (2020) no. 9-10, pp. 1059-1072. doi: 10.5802/crmath.124. https://comptes-rendus.academie-sciences.fr/mathematique/articles/10.5802/crmath.124/
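The functional equation itself is not reproduced on this page. As an illustration of the kind of construction the abstract describes, the sketch below implements the classical Yarotsky-type fixed-point iteration: the hat function g is expressible as a three-neuron ReLU layer, the map (Tu)(x) = (g(x) + u(g(x)))/4 contracts the sup norm by a factor 1/4, and its fixed point is the polynomial x - x². The functions and constants below come from that standard construction, not from the article itself.

import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def g(x):
    # Hat function on [0, 1] written as a one-hidden-layer ReLU network
    # with 3 neurons: g(x) = 2x on [0, 1/2] and 2(1 - x) on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def T(u):
    # Contraction map (Tu)(x) = (g(x) + u(g(x))) / 4. Since g maps [0, 1]
    # into itself and 0 <= g <= 1, T contracts the sup norm by factor 1/4.
    return lambda x: (g(x) + u(g(x))) / 4.0

x = np.linspace(0.0, 1.0, 1001)
u = lambda t: np.zeros_like(t)    # u_0 = 0 starts the fixed-point iteration
for m in range(1, 7):
    u = T(u)                      # u_m(x) = sum_{k=1}^{m} g_k(x) / 4^k
    err = np.abs(u(x) - (x - x**2)).max()
    print(f"depth {m}: sup-error {err:.3e}  (theory: 4^-(m+1) = {0.25**(m+1):.3e})")

Each iterate u_m is exactly representable by a ReLU network whose depth grows linearly in m (m nested copies of g), and the 4^-(m+1) error decay illustrates the "controlled accuracy" mentioned in the abstract; whether the article's equation coincides with this particular one is an assumption of this sketch.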