Neural networks as function approximators: teaching a neural network to multiply

David A. Vaccari, Edward Wojciechowski

Research output: Contribution to conference › Paper › peer-review

6 Scopus citations

Abstract

Artificial Neural Networks (ANNs) were first proposed by Hecht-Nielsen [1987] as multivariate function approximators based on Kolmogorov's theorem. Since then, several researchers have proven that multilayer ANNs with an arbitrary squashing function in the hidden layer can approximate any multivariate function to any degree of accuracy. Based on these results, researchers have attempted to train backpropagation networks to realize arbitrary functions. Although their results are encouraging, this technique has many shortcomings and may lead to an inappropriate response by the network. In this paper, we present an alternative neural network architecture, based on cascaded univariate function approximators, which can be trained to multiply two real numbers and may be used to realize arbitrary multivariate function mappings.
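The abstract does not detail the authors' cascaded architecture, so the sketch below is illustrative only: it uses the classical quarter-square identity x·y = ((x+y)² − (x−y)²)/4 to show how multiplication reduces to a single learned univariate map t → t², here approximated by a small tanh network trained by backpropagation. All hyperparameters and helper names are hypothetical, not taken from the paper.

```python
# Illustrative sketch (NOT the authors' architecture): multiplication via
# cascaded univariate approximators, using the quarter-square identity
#   x * y = ((x + y)^2 - (x - y)^2) / 4.
import numpy as np

rng = np.random.default_rng(0)

# Tiny one-hidden-layer network trained to approximate t -> t^2 on [-2, 2].
H = 32                                    # hidden units (arbitrary choice)
W1 = rng.normal(0, 1.0, (H, 1))
b1 = rng.normal(0, 1.0, H)
W2 = rng.normal(0, 0.1, H)
b2 = 0.0

def forward(t):
    """Return network output and hidden activations for inputs t, shape (n,)."""
    z = np.tanh(W1 @ t[None, :] + b1[:, None])   # (H, n) squashing layer
    return W2 @ z + b2, z

lr = 0.05
for step in range(20000):
    t = rng.uniform(-2.0, 2.0, 64)               # minibatch of scalar inputs
    y_hat, z = forward(t)
    err = y_hat - t**2                           # dL/dy_hat for squared error
    # Backpropagation through the single hidden layer.
    gW2 = (z * err).mean(axis=1)
    gb2 = err.mean()
    gpre = (W2[:, None] * err) * (1.0 - z**2)    # tanh'(pre-activation)
    gW1 = (gpre * t[None, :]).mean(axis=1, keepdims=True)
    gb1 = gpre.mean(axis=1)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

def square(t):
    return forward(np.atleast_1d(t))[0]

def multiply(x, y):
    """Multiply via two cascaded calls to the learned univariate square."""
    return (square(x + y) - square(x - y)) / 4.0

print(multiply(0.7, -1.1), 0.7 * -1.1)           # approximately -0.77
```

Note that this only works where the univariate approximator was trained, i.e. for x+y and x−y in [-2, 2]; scaling the operands into that range (and rescaling the product afterward) extends it, which mirrors the general point that accurate multiplication, combined with addition, suffices to realize arbitrary polynomial and hence arbitrary continuous mappings.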

Original language: English
Pages: 2217-2222
Number of pages: 6
State: Published - 1994
Event: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7) - Orlando, FL, USA
Duration: 27 Jun 1994 - 29 Jun 1994

Conference

Conference: Proceedings of the 1994 IEEE International Conference on Neural Networks. Part 1 (of 7)
City: Orlando, FL, USA
Period: 27/06/94 - 29/06/94
