Publication Detailed Description
Journal Title
SIAM Journal on Mathematics of Data Science
Year (definitive publication)
2025
Language
English
Country
United States of America
Abstract
The neural tangent kernel (NTK) has emerged as a fundamental concept in the study of wide neural networks. In particular, the positivity of the NTK is known to be directly related to the memorization capacity of sufficiently wide networks, i.e., to the possibility of reaching zero training loss via gradient descent. Here we improve on previous work and obtain a sharp result concerning the positivity of the NTK of feedforward networks of any depth. More precisely, we show that, for any nonpolynomial activation function, the NTK is strictly positive definite. Our results are based on a novel characterization of polynomial functions, which is of independent interest.
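The strict positive definiteness stated in the abstract can be probed numerically. The sketch below (an illustration only, not the paper's construction) computes the empirical NTK Gram matrix K_ij = ⟨∇_θ f(x_i), ∇_θ f(x_j)⟩ for a one-hidden-layer tanh network, a nonpolynomial activation, and checks that its smallest eigenvalue is positive for distinct inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def ntk_gram(X, m=512):
    """Empirical NTK Gram matrix for f(x) = v . tanh(W x) / sqrt(m),
    with gradients taken with respect to all parameters (W, v)."""
    n, d = X.shape
    W = rng.standard_normal((m, d))
    v = rng.standard_normal(m)
    J = np.zeros((n, m * d + m))          # one Jacobian row per input
    for i, x in enumerate(X):
        pre = W @ x                       # pre-activations, shape (m,)
        h = np.tanh(pre)                  # hidden activations
        dact = 1.0 - h ** 2               # tanh'(pre)
        # df/dW_{jk} = v_j * tanh'(w_j . x) * x_k / sqrt(m)
        J[i, : m * d] = ((v * dact)[:, None] * x[None, :]).ravel() / np.sqrt(m)
        # df/dv_j = tanh(w_j . x) / sqrt(m)
        J[i, m * d :] = h / np.sqrt(m)
    return J @ J.T                        # Gram matrix of parameter gradients

X = rng.standard_normal((8, 3))           # 8 distinct inputs in R^3
K = ntk_gram(X)
print(np.linalg.eigvalsh(K).min())        # positive => K strictly positive definite
```

At finite width this is only the empirical kernel; the paper's result concerns the limiting NTK, where strict positive definiteness is proved for any nonpolynomial activation and any depth.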
Keywords
Wide neural networks, Neural tangent kernel, Memorization, Global minima
Fields of Science and Technology Classification
- Mathematics (Natural Sciences)
Funding Records
| Funding Reference | Funding Entity |
|---|---|
| UIDB/04459/2020 | Fundação para a Ciência e a Tecnologia |
| UIDP/04459/2020 | Fundação para a Ciência e a Tecnologia |