Please use this identifier to cite or link to this item:
http://hdl.handle.net/10553/114745
Title: Piecewise polynomial activation functions for feedforward neural networks
Authors: López-Rubio, Ezequiel; Ortega Zamorano, Francisco; Domínguez, Enrique; Muñoz-Pérez, José
UNESCO Classification: 1203 Computer science
Keywords: Activation functions; Feedforward neural networks; Supervised learning; Regression; Classification
Issue Date: 2019
Journal: Neural Processing Letters
Abstract: Since the origins of artificial neural network research, many models of feedforward networks have been proposed. This paper presents an algorithm that adapts the shape of the activation function to the training data, so that it is learned along with the connection weights. The activation function is interpreted as a piecewise polynomial approximation to the distribution function of the argument of the activation function. An online learning procedure is given, and it is formally proved that it makes the training error decrease or stay the same except in extreme cases. Moreover, the model is computationally simpler than standard feedforward networks, so it is suitable for implementation on FPGAs and microcontrollers. However, the present proposal is limited to two-layer, one-output-neuron architectures due to the lack of differentiability of the learned activation functions with respect to the node locations. Experimental results are provided, which show the performance of the proposed algorithm in classification and regression applications.
URI: http://hdl.handle.net/10553/114745
ISSN: 1370-4621
DOI: 10.1007/s11063-018-09974-4
Source: Neural Processing Letters [ISSN 1370-4621], vol. 50, pp. 121-147
Appears in Collections: Artículos
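The abstract describes the activation function as a piecewise polynomial approximation to the distribution function of its own argument, learned from the training data. The sketch below is only a minimal illustration of that general idea, not the authors' algorithm: it uses piecewise linear (degree-1) pieces, evenly spaced nodes, and a one-shot empirical distribution estimate instead of the paper's online learning procedure. All function names, the node count, and the synthetic data are assumptions introduced for this example.

```python
import numpy as np

def fit_piecewise_activation(preactivations, num_nodes=11):
    """Place evenly spaced nodes over the observed pre-activation range and set
    the activation value at each node to the empirical distribution function
    (the fraction of samples at or below that node)."""
    lo, hi = float(preactivations.min()), float(preactivations.max())
    nodes = np.linspace(lo, hi, num_nodes)
    values = np.array([(preactivations <= x).mean() for x in nodes])
    return nodes, values

def piecewise_activation(x, nodes, values):
    """Evaluate the fitted activation by linear interpolation between nodes;
    np.interp clamps to the end values outside the node range."""
    return np.interp(x, nodes, values)

# Toy usage: synthetic pre-activations of a single hidden unit.
rng = np.random.default_rng(0)
z = rng.normal(size=1000)
nodes, values = fit_piecewise_activation(z)
print(piecewise_activation(np.array([-2.0, 0.0, 2.0]), nodes, values))
```

Because the node values follow the empirical distribution function, the resulting activation rises monotonically from 0 to 1 over the observed pre-activation range, giving a data-adapted, sigmoid-like shape; the paper itself learns this shape jointly with the connection weights.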
Scopus™ citations: 7 (checked on Nov 17, 2024)
Web of Science™ citations: 6 (checked on Nov 17, 2024)
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.