...
Several parameters can be specified by the user.
Parameters overview
Training method
This keyword indicates the method that will be used to optimize the neural network weights. All of these methods, with the exception of the stiff and Levenberg-Marquardt algorithms, are heuristic algorithms with a number of adjustable parameters. These parameters were selected by the respective authors to provide fast convergence of the algorithms. The implementation of each algorithm is described below.
...
The default value is the SuperSAB algorithm. It is not recommended to use the Differential equations and Levenberg-Marquardt methods for tasks with a large number of descriptors or molecules (typically more than 100).
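To give a concrete feel for the kind of heuristic, per-weight adaptive update such methods rely on, here is a minimal Python sketch of a SuperSAB-style step. It is not the program's implementation; the step-size factors and bounds are illustrative assumptions.

```python
import numpy as np

def supersab_step(weights, grad, prev_grad, rates,
                  eta_plus=1.05, eta_minus=0.5,
                  rate_min=1e-6, rate_max=10.0):
    """One SuperSAB-style update: every weight keeps its own learning rate,
    which grows while the gradient keeps its sign and shrinks when the sign
    flips. The factors eta_plus/eta_minus and the rate bounds are illustrative,
    not the program's actual settings."""
    sign = grad * prev_grad
    rates = np.where(sign > 0, np.minimum(rates * eta_plus, rate_max), rates)
    rates = np.where(sign < 0, np.maximum(rates * eta_minus, rate_min), rates)
    weights = weights - rates * grad   # gradient step with per-weight learning rates
    return weights, rates
```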
Number of neurons
Only neural networks with a single hidden layer are available. The number of neurons in the input and output layers corresponds to the number of descriptors and the number of properties, respectively.
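The topology described above is a standard single-hidden-layer feed-forward network. A minimal sketch of its forward pass follows; the logistic hidden-layer activation is an assumption for illustration, not necessarily the activation used by the program.

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    """Single-hidden-layer network: x holds one value per descriptor,
    the output layer has one neuron per predicted property.
    W1: (n_hidden, n_descriptors), W2: (n_properties, n_hidden)."""
    hidden = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))   # hidden-layer activations (assumed logistic)
    return W2 @ hidden + b2                          # one output per property
```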
Learning iterations
The number of iterations that will be used for neural network training. Training is stopped if there is no improvement of the RMSE for the validation set after ITERATIONS iterations (this corresponds to detection of the early-stopping point). Training is also stopped when the total number of iterations reaches 5*ITERATIONS.
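A sketch of this stopping logic is shown below; train_one_iteration and validation_rmse are hypothetical caller-supplied callables standing in for one optimization pass and the validation-set error, respectively.

```python
def train_with_early_stopping(train_one_iteration, validation_rmse, ITERATIONS):
    """Stop when the validation RMSE has not improved for ITERATIONS consecutive
    iterations (the early-stopping point), or when the total number of
    iterations reaches 5 * ITERATIONS."""
    best_rmse = float("inf")
    since_improvement = 0
    total = 0
    while since_improvement < ITERATIONS and total < 5 * ITERATIONS:
        train_one_iteration()          # one pass of weight optimization
        rmse = validation_rmse()       # current RMSE on the validation set
        if rmse < best_rmse:
            best_rmse, since_improvement = rmse, 0
        else:
            since_improvement += 1
        total += 1
    return best_rmse
```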
Ensemble
Indicates the number of networks in the ensemble that will be analyzed. The ASNN algorithm typically uses 64-100 neural networks. For fast preliminary calculations, 10 networks can also be used to explore the data.
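A minimal sketch of how an ensemble prediction is formed from the individual members (plain averaging); each network is assumed to be a callable mapping a descriptor vector to predicted properties, e.g. the forward() sketch above.

```python
import numpy as np

def ensemble_average(networks, x):
    """networks: list of callables, each returning the property predictions of
    one member network for descriptor vector x. The ensemble prediction is the
    plain mean over all members."""
    preds = np.array([net(x) for net in networks])
    return preds.mean(axis=0)
```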
Disable ASNN
Disables the ASNN correction and uses the standard ensemble average instead.
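For orientation, the ASNN correction described in the cited Tetko (2002) references adjusts the plain ensemble average for a query case by the mean residual of its nearest neighbors, where neighborhood is measured in the space of the individual networks' predictions. The rough sketch below follows that idea; the value of k and the use of Pearson correlation as the similarity measure are illustrative assumptions, not the program's exact settings.

```python
import numpy as np

def asnn_predict(query_preds, train_preds, train_residuals, k=5):
    """ASNN-style correction (rough sketch): shift the ensemble average for a
    query by the mean residual of its k most similar training cases, with
    similarity measured by correlation between per-network prediction vectors.

    query_preds:     (n_networks,)           per-network predictions for the query
    train_preds:     (n_train, n_networks)   per-network predictions for training cases
    train_residuals: (n_train,)              target minus ensemble average, per training case
    """
    query_preds = np.asarray(query_preds)
    similarity = np.array([np.corrcoef(query_preds, tp)[0, 1] for tp in train_preds])
    nearest = np.argsort(-similarity)[:k]    # indices of the k most similar training cases
    return query_preds.mean() + train_residuals[nearest].mean()
```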
References
- Tetko, I.V. Associative Neural Network. Neural Processing Letters, 2002, 16, 187-199.
- Tetko, I.V. Neural Network Studies. 4. Introduction to Associative Neural Networks. J. Chem. Inf. Comput. Sci., 2002, 42, 717-728.
- Tetko, I.V. Associative Neural Network. In: Neural Networks: Methods and Applications; Livingstone, D.J., Ed.; The Humana Press Inc., vol. 458, 2008, pp. 185-202.