Academic Journals Database
Disseminating quality controlled scientific knowledge

Effect of Weight Updates Functions in QSAR/QSPR Modeling using Artificial Neural Network

Author(s): Mohammad Asadollahi-Baboli

Journal: Journal of Artificial Intelligence
ISSN 1994-5450

Volume: 4;
Issue: 4;
Start page: 257;
Date: 2011;

Keywords: intrinsic viscosity | Levenberg-Marquardt algorithm | quantitative structure-activity/property relationship | partition coefficient | inhibition

ABSTRACT
An artificial neural network has many parameters that must be optimized to develop an acceptable model. One of these is the Weight Update Function (WUF), which determines how the weights are recalculated during training. To show the importance of weight update functions in modeling, four common WUFs were selected: (1) Basic Back Propagation (BBP), (2) Conjugate Gradient (CG), (3) Quasi-Newton (Q-N) and (4) Levenberg-Marquardt (L-M) algorithms. The effects of these WUFs were then studied on four different data sets (two QSPR and two QSAR sets). It is shown that selecting the most favorable weight update function plays a key role in developing ANN models, and that the performance of a developed ANN model is directly related to the selected weight update function. Moreover, it is shown in this article that the Levenberg-Marquardt algorithm is the best WUF for developing ANN models. The accuracy and predictive ability of the ANN models were assessed using leave-one-out and leave-multiple-out cross-validation techniques. High R2 and low Root Mean Square Error (RMSE) values for the leave-one-out (Q2LOO > 0.75 in all four sets) and leave-multiple-out (R2L25%O > 0.70 in all four sets) procedures revealed that the Levenberg-Marquardt algorithm is a good algorithm for developing robust ANN models.
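The comparison the abstract describes can be sketched in code. The following is a hypothetical illustration, not the paper's implementation: a tiny single-hidden-layer network is fitted to toy data with three of the weight-update strategies, using SciPy optimizers as stand-ins (BFGS for quasi-Newton, `least_squares(method="lm")` for Levenberg-Marquardt; basic back-propagation is omitted). The data, network size, and optimizer settings are all illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize, least_squares

rng = np.random.default_rng(0)

# Toy regression data standing in for a QSPR descriptor set (hypothetical).
X = rng.normal(size=(40, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] - 0.2 * X[:, 2]

n_in, n_hidden = X.shape[1], 5
n_params = n_hidden * n_in + n_hidden + n_hidden + 1  # W1, b1, W2, b2

def unpack(w):
    """Split the flat parameter vector into layer weights and biases."""
    i = n_hidden * n_in
    W1 = w[:i].reshape(n_hidden, n_in)
    b1 = w[i:i + n_hidden]
    W2 = w[i + n_hidden:i + 2 * n_hidden]
    b2 = w[i + 2 * n_hidden]
    return W1, b1, W2, b2

def predict(w, X):
    """Single hidden layer with tanh activation, linear output."""
    W1, b1, W2, b2 = unpack(w)
    return np.tanh(X @ W1.T + b1) @ W2 + b2

def residuals(w):
    return predict(w, X) - y

def loss(w):
    r = residuals(w)
    return 0.5 * np.dot(r, r)

w0 = rng.normal(scale=0.5, size=n_params)
results = {}

# Conjugate-gradient and quasi-Newton (BFGS) weight updates minimize the
# scalar sum-of-squares loss.
for method in ("CG", "BFGS"):
    res = minimize(loss, w0, method=method)
    results[method] = np.sqrt(np.mean(residuals(res.x) ** 2))

# Levenberg-Marquardt works on the residual vector directly and needs
# at least as many residuals as parameters (40 >= 26 here).
res = least_squares(residuals, w0, method="lm")
results["L-M"] = np.sqrt(np.mean(res.fun ** 2))

for name, rmse in results.items():
    print(f"{name}: training RMSE = {rmse:.4f}")
```

In a study like the one described, the same comparison would be wrapped in leave-one-out and leave-25%-out loops so that Q2 and RMSE reflect predictions on held-out compounds rather than training fit.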