Abstract
The artificial neural network (ANN) is a mathematical model that offers a global and integrated modelling approach without providing any physical explanation for the relationships it captures, which must therefore be validated from a physical point of view. Designing the network structure is an important problem in ANN applications and is difficult to solve theoretically; defining the optimal architecture for a particular problem remains an open problem. This work describes a pruning method for optimizing the architecture: optimal brain surgeon (OBS). The method balances accuracy against time complexity to achieve better network performance. To validate the approach, the results must be consistent with experimental data, and the rather large tolerance permits the applicability of the methodology. Root mean square error (RMSE) values were small (i.e., < 0.05), so the network output for each test pattern was relatively close to its target. The artificial neural network is a powerful statistical procedure that relates the parameters of a given problem to the desired result through a complex network of neurons. The ANN architecture (number of neurons in the hidden layer, connection weights, etc.) was optimized by OBS pruning. This method deletes and adjusts weights within a reasonable time, while the unit-OBS variant accelerates convergence by deleting one neuron at a time.
Keywords: Artificial neural networks, architecture design, brain surgeon, pruning
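The OBS step summarized above (delete the least-salient weight and adjust the remaining weights to compensate) can be sketched as follows. This is an illustrative sketch of the standard OBS update using the inverse Hessian of the training error, not the paper's implementation; the function name `obs_prune_step` and its arguments are assumptions.

```python
import numpy as np

def obs_prune_step(w, H_inv):
    """One optimal brain surgeon (OBS) pruning step (illustrative sketch).

    w     : 1-D array of flattened network weights.
    H_inv : inverse Hessian of the training error w.r.t. the weights.
    Returns the index of the pruned weight and the adjusted weight vector.
    """
    # Saliency of weight q: L_q = w_q^2 / (2 * [H^-1]_qq);
    # pruning the weight with the smallest saliency increases the error least.
    diag = np.diag(H_inv)
    saliency = w ** 2 / (2.0 * diag)
    q = int(np.argmin(saliency))

    # Adjust all remaining weights to compensate for the deletion:
    # delta_w = -(w_q / [H^-1]_qq) * H^-1 e_q
    delta = -(w[q] / H_inv[q, q]) * H_inv[:, q]
    w_new = w + delta
    w_new[q] = 0.0  # the pruned weight is set exactly to zero
    return q, w_new
```

Repeating this step (or deleting a whole hidden unit's weights at once, as in unit-OBS) shrinks the architecture while the compensation term keeps the training error nearly unchanged at each deletion.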