Abstract
Artificial Neural Networks (ANNs) were first proposed as multivariate function approximators, based on Kolmogorov's theorem, by Hecht-Nielsen [1987]. Since then, several researchers have proven that multilayer ANNs with an arbitrary squashing function in the hidden layer can approximate any multivariate function to any degree of accuracy. Based on these results, researchers have attempted to train backpropagation networks to realize arbitrary functions. Although the results are encouraging, this technique has many shortcomings and may lead to an inappropriate response from the network. In this paper, we present an alternative neural network architecture, based on cascaded univariate function approximators, which can be trained to multiply two real numbers and may be used to realize arbitrary multivariate function mappings.
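The abstract does not spell out the paper's construction, but one standard way to realize a bivariate product with only univariate nonlinearities is the polarization identity x·y = ((x + y)² − (x − y)²)/4, in which the only nonlinear step is the univariate map t ↦ t². The sketch below (names and structure are illustrative assumptions, not the authors' architecture) shows how a cascade of univariate function approximators could implement multiplication:

```python
# Illustrative sketch (not the paper's actual architecture): multiplying
# two reals using linear combinations plus a single univariate nonlinearity,
# via the identity  x * y = ((x + y)**2 - (x - y)**2) / 4.
# In a trained network, `square` would be a univariate function
# approximator fitted to t -> t**2 on the input range of interest.

def square(t: float) -> float:
    # Stand-in for a trained univariate approximator of t -> t**2.
    return t ** 2

def multiply(x: float, y: float) -> float:
    # Linear combinations (x + y, x - y) feed two copies of the
    # univariate squaring unit; a final linear layer scales by 1/4.
    return (square(x + y) - square(x - y)) / 4.0

print(multiply(3.0, 4.0))   # 12.0
print(multiply(-2.5, 0.4))  # -1.0
```

Because products can be cascaded in this way, a network built from univariate approximators and linear layers can, in principle, realize arbitrary multivariate polynomial mappings, which is consistent with the claim in the abstract.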
Original language | English |
---|---|
Pages | 2217-2222 |
Number of pages | 6 |
State | Published - 1994 |
Event | Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA |
Duration | 27 Jun 1994 → 29 Jun 1994 |
Conference
Conference | Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) |
---|---|
City | Orlando, FL, USA |
Period | 27/06/94 → 29/06/94 |