Title: A benchmark of recent population-based metaheuristic algorithms for multi-layer neural network training
Abstract: Multi-layer neural networks (MLNNs) are extensively used in many industrial applications. Training is a crucial task for MLNNs. While gradient descent-based approaches are most commonly employed here, they suffer from drawbacks such as getting stuck in local optima. One approach to address this is to employ population-based metaheuristic algorithms. Since a variety of such algorithms have been proposed in the literature, in this paper we benchmark the performance of 15 metaheuristic algorithms, including state-of-the-art as well as some of the most recent algorithms, for neural network training. In particular, we evaluate particle swarm optimisation (PSO), differential evolution (DE), artificial bee colony (ABC), imperialist competitive algorithm (ICA), cuckoo search (CS), gravitational search algorithm (GSA), bat algorithm (BA), firefly algorithm (FA), grey wolf optimiser (GWO), ant lion optimiser (ALO), dragonfly algorithm (DA), sine cosine algorithm (SCA), whale optimisation algorithm (WOA), grasshopper optimisation algorithm (GOA), and salp swarm algorithm (SSA), and assess their performance on different classification problems.
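To make the abstract's core idea concrete, the following is a minimal, self-contained sketch (not taken from the paper) of using one of the listed metaheuristics, particle swarm optimisation, to train the weights of a small one-hidden-layer network on a toy XOR task. The network size, swarm hyperparameters, and fitness function are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR classification task (illustrative; not one of the paper's datasets).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID = 2, 4
# Flattened parameter vector: W1, b1, W2, b2 for a 2-4-1 network.
DIM = N_IN * N_HID + N_HID + N_HID + 1

def forward(w, X):
    """Evaluate the 2-4-1 network with parameters unpacked from vector w."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID]; i += N_HID
    b2 = w[i]
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output

def fitness(w):
    """Mean squared error of the network; the quantity PSO minimises."""
    return float(np.mean((forward(w, X) - y) ** 2))

# Standard PSO with inertia weight and cognitive/social coefficients
# (these hyperparameter values are common defaults, assumed here).
N_PART, ITERS = 30, 300
W_INERTIA, C1, C2 = 0.72, 1.49, 1.49

pos = rng.uniform(-1, 1, (N_PART, DIM))   # particle positions = weight vectors
vel = np.zeros((N_PART, DIM))
pbest = pos.copy()                        # per-particle best positions
pbest_f = np.array([fitness(p) for p in pos])
g = pbest[pbest_f.argmin()].copy()        # global best position
g_f = pbest_f.min()

for _ in range(ITERS):
    r1 = rng.random((N_PART, DIM))
    r2 = rng.random((N_PART, DIM))
    vel = W_INERTIA * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (g - pos)
    pos = pos + vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved] = pos[improved]
    pbest_f[improved] = f[improved]
    if f.min() < g_f:
        g_f = f.min()
        g = pos[f.argmin()].copy()

pred = (forward(g, X) > 0.5).astype(float)
print("best MSE:", round(g_f, 4))
```

Because the objective is treated as a black box, the same loop works with any of the fifteen benchmarked algorithms by swapping the position/velocity update rule; no gradients of the network are required.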
Publication Year: 2020
Publication Date: 2020-07-08
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 42