A Tabu Search Algorithm for the Training of Neural Networks
Abstract
The most widely used training algorithm for neural networks (NNs) is backpropagation (BP), a gradient-based technique that requires significant computational effort. Metaheuristic search techniques such as genetic algorithms, tabu search (TS) and simulated annealing have recently been used to cope with major shortcomings of BP, such as its tendency to converge to a local optimum and its slow convergence rate. In this paper, an efficient TS algorithm employing different strategies to provide a balance between intensification and diversification is proposed for the training of NNs. The proposed algorithm is compared with other metaheuristic techniques from the literature on published test problems, and it is found to outperform them in the majority of the test cases.
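To make the general idea concrete, the sketch below shows how tabu search can be applied to NN weight optimization: the weight vector is the solution, neighbors are generated by perturbing individual weights, recently modified weights are held in a fixed-length tabu list, and an aspiration criterion overrides tabu status for moves that improve on the best solution found. This is a generic illustration only, not the algorithm proposed in the paper (the abstract gives no implementation details); the network size, neighborhood scheme, and all parameter values are assumptions.

```python
# Generic tabu search for NN weight optimization (illustrative sketch;
# not the paper's algorithm). Tiny 2-4-1 sigmoid network on XOR, NumPy only.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

N_HIDDEN = 4
N_WEIGHTS = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # W1, b1, W2, b2 flattened


def forward(w, X):
    """Unpack a flat weight vector and run the 2-4-1 sigmoid network."""
    W1 = w[:8].reshape(2, N_HIDDEN)
    b1 = w[8:12]
    W2 = w[12:16]
    b2 = w[16]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))


def mse(w):
    """Training error used as the objective to minimize."""
    return float(np.mean((forward(w, X) - y) ** 2))


def tabu_search(n_iters=2000, n_neighbors=20, tabu_len=15, step=0.5):
    current = rng.normal(0.0, 1.0, N_WEIGHTS)
    best, best_err = current.copy(), mse(current)
    tabu = []  # indices of recently perturbed weights are tabu

    for _ in range(n_iters):
        # Neighborhood: each candidate perturbs one randomly chosen weight.
        candidates = []
        for _ in range(n_neighbors):
            i = int(rng.integers(N_WEIGHTS))
            neighbor = current.copy()
            neighbor[i] += rng.normal(0.0, step)
            candidates.append((mse(neighbor), i, neighbor))
        candidates.sort(key=lambda c: c[0])

        # Accept the best non-tabu move; the aspiration criterion admits a
        # tabu move if it improves on the best solution found so far.
        for err, i, neighbor in candidates:
            if i not in tabu or err < best_err:
                current = neighbor
                tabu.append(i)
                if len(tabu) > tabu_len:
                    tabu.pop(0)  # fixed-length tabu list
                if err < best_err:
                    best, best_err = neighbor.copy(), err
                break
    return best, best_err


if __name__ == "__main__":
    w, err = tabu_search()
    print(f"final MSE: {err:.4f}")
    print("predictions:", np.round(forward(w, X), 2))
```

Note that, unlike BP, this search uses no gradient information: it only evaluates the error of candidate weight vectors, which is what allows it to escape local optima at the cost of many more function evaluations. The intensification/diversification balance mentioned in the abstract would be implemented on top of such a loop (e.g., by adapting the step size or restarting from elite solutions), but those strategies are specific to the paper and are not reproduced here.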