Meade's On-line Preprints
Approximation Properties of Local Bases Assembled From Neural Network Transfer Functions
Andrew J. Meade, Jr. and Boris A. Zeldin
Submitted to Mathematical and Computer Modelling, 1997.
Keywords: (artificial) neural networks, function approximation, local bases, transfer functions.
Abstract: The adaptive data-driven emulation and control of mechanical systems are popular applications of artificial neural networks in engineering. However, multi-layer perceptron training is an ill-posed nonlinear optimization problem. This paper explores a method of constraining the network parameters so that conventional computational techniques for function approximation can be used during training. This is accomplished by forming local basis functions that provide accurate approximation and stable evaluation of the network parameters. The approach is quite general and does not violate the principles of network architecture. By employing the concept of shift-invariant subspaces, it yields a new and more robust error condition for feedforward artificial neural networks and allows one to both characterize and control the accuracy of the local bases formed. Two methods are used: 1) adding bases while altering their shape and keeping their spacing constant, and 2) adding bases while altering their shape and decreasing their spacing in a coupled fashion. Numerical examples demonstrate the usefulness of the proposed approximation of functions and their derivatives.
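As a rough illustration of the idea (not the authors' exact construction), a localized "bump" basis can be assembled from differences of shifted sigmoid transfer functions placed on a uniform grid; because adjacent differences telescope, the bumps form an approximate partition of unity, and a smooth target can then be fit by linear least squares. All names, grid parameters, and the target function below are illustrative assumptions:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic transfer function."""
    return 1.0 / (1.0 + np.exp(-x))

def local_basis(x, c, w, s):
    """Localized bump assembled from two shifted sigmoids.

    The difference of sigmoids centered at c - w and c + w is
    approximately 1 on (c - w, c + w) and decays to 0 outside,
    with transition sharpness controlled by s.
    """
    return sigmoid((x - c + w) / s) - sigmoid((x - c - w) / s)

# Uniform grid of centers, slightly extended past [0, 1] so the
# boundary is covered (spacing h, half-width w = h/2 makes the
# bumps telescope into an approximate partition of unity).
h = 0.0625
centers = np.arange(-2 * h, 1.0 + 2 * h + 1e-12, h)
w, s = h / 2.0, h / 4.0

# Evaluate the assembled basis and fit a smooth target by
# ordinary linear least squares (a well-posed linear problem,
# in contrast to nonlinear perceptron training).
x = np.linspace(0.0, 1.0, 200)
Phi = np.stack([local_basis(x, c, w, s) for c in centers], axis=1)
target = np.sin(2.0 * np.pi * x)
coef, *_ = np.linalg.lstsq(Phi, target, rcond=None)
err = np.max(np.abs(Phi @ coef - target))
```

Refining this sketch by shrinking the spacing `h` while scaling `w` and `s` with it corresponds loosely to the paper's second method, in which basis shape and spacing are altered in a coupled fashion.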
This work was supported under NASA Johnson Space Center grant NAG 9-719 and Office of Naval Research grant N00014-95-1-0741.