Academic Journals Database
Disseminating quality controlled scientific knowledge

Efficient selection of neural architectures using destructive and regularization techniques

Author(s): Andrés Eduardo Gaona Barrera | Dora María Ballesteros Larrotta

Journal: Tecnura
ISSN: 0123-921X

Volume: 16
Issue: 33
Start page: 158
Date: 2012

Keywords: back-propagation algorithm | neural networks | regularization | pruning techniques

ABSTRACT
This article presents a detailed theoretical and practical comparison of ontogenetic neural networks obtained through pruning and regularization algorithms. We first address the concept of a regularized error function and the different ways to modify such a function (weight decay (WD), soft weight sharing, and the Chauvin penalty). We then consider some of the most representative pruning algorithms, particularly Optimal Brain Damage (OBD). We apply OBD and WD to the XOR problem in order to analyze pruning techniques and regularization algorithms: WD relies on the basic back-propagation algorithm, while OBD relies on the inverse Hessian matrix. According to the results, WD is faster than OBD but deletes fewer weights. Conversely, OBD reduces the complexity of the neural-network architecture, but its computational cost remains high.
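For context, a minimal sketch (not the authors' implementation) of the two quantities the abstract contrasts: the weight-decay regularized error, E = E0 + (lambda/2) * sum(w^2), and the standard OBD saliency from LeCun et al., s_i = h_ii * w_i^2 / 2, computed from a diagonal Hessian approximation. All function names, the lambda value, and the toy data below are illustrative assumptions.

import numpy as np

def regularized_error(base_error, w, lam=1e-3):
    # Weight-decay (WD) regularized error: E = E0 + (lambda/2) * sum(w^2).
    # lam is an assumed regularization strength, not a value from the paper.
    return base_error + 0.5 * lam * np.sum(w ** 2)

def obd_saliencies(w, hessian_diag):
    # OBD saliency of each weight, s_i = h_ii * w_i^2 / 2, using the
    # diagonal Hessian approximation of Optimal Brain Damage.
    return 0.5 * hessian_diag * w ** 2

# Toy usage: prune the weight with the smallest saliency.
w = np.array([0.8, -0.1, 0.4])
h = np.array([2.0, 1.5, 0.9])   # assumed diagonal Hessian entries
s = obd_saliencies(w, h)
w[np.argmin(s)] = 0.0           # delete the least-salient weight
print(regularized_error(0.25, w), s)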
