The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms

We present Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-consuming tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performance of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) the implementation of an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) the implementation of a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing for the controlled activation of rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply a reformulation of the same methods and that metaheuristics for optimisation should simply be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them.
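As a rough illustration of the multiple-comparison step mentioned in the abstract, the following Java sketch applies the Holm–Bonferroni step-down correction to a set of raw p-values, such as those produced by pairwise Wilcoxon rank-sum tests between a reference optimiser and its competitors. This is a minimal, self-contained example for clarity only: it does not use the SOS API, and the class name, method name and p-values are hypothetical placeholders.

```java
import java.util.Arrays;
import java.util.Comparator;

/**
 * Minimal sketch of the Holm-Bonferroni step-down procedure, for illustration
 * only; it is NOT the SOS implementation and does not use the SOS API.
 * The input p-values would typically come from pairwise Wilcoxon rank-sum
 * tests between a reference optimiser and each competitor.
 */
public class HolmBonferroniSketch {

    /** Returns Holm-adjusted p-values, in the same order as the input array. */
    static double[] holmAdjust(double[] pValues) {
        int m = pValues.length;
        Integer[] order = new Integer[m];
        for (int i = 0; i < m; i++) order[i] = i;
        // Sort hypothesis indices by ascending raw p-value.
        Arrays.sort(order, Comparator.comparingDouble(i -> pValues[i]));

        double[] adjusted = new double[m];
        double running = 0.0; // enforces monotonicity of the adjusted p-values
        for (int k = 0; k < m; k++) {
            int idx = order[k];
            // Rank k (0-based) gets multiplier (m - k), i.e. m, m-1, ..., 1.
            double candidate = Math.min(1.0, (m - k) * pValues[idx]);
            running = Math.max(running, candidate);
            adjusted[idx] = running;
        }
        return adjusted;
    }

    public static void main(String[] args) {
        // Hypothetical raw p-values (placeholders, not results from the paper),
        // e.g. one per "reference vs. competitor" comparison.
        double[] rawP = {0.012, 0.210, 0.003, 0.047};
        double alpha = 0.05;

        double[] adjP = holmAdjust(rawP);
        for (int i = 0; i < rawP.length; i++) {
            System.out.printf("comparison %d: raw p = %.3f, adjusted p = %.3f -> %s%n",
                    i + 1, rawP[i], adjP[i],
                    adjP[i] <= alpha ? "significant" : "not significant");
        }
    }
}
```

With the placeholder inputs above, the two smallest raw p-values (0.003 and 0.012) remain significant at alpha = 0.05 after adjustment, while the other two do not.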

Bibliographic Details
Main Authors: Fabio Caraffini, Giovanni Iacca
Format: Article
Language: English
Published: MDPI AG 2020-05-01
Series: Mathematics
Subjects: algorithmic design; metaheuristic optimisation; evolutionary computation; swarm intelligence; memetic computing; parameter tuning
Online Access: https://www.mdpi.com/2227-7390/8/5/785
id doaj-ed8c8e0c791d448992fe756823e0ade6
record_format Article
doi 10.3390/math8050785
author_affiliation Fabio Caraffini: Institute of Artificial Intelligence, School of Computer Science and Informatics, De Montfort University, Leicester LE1 9BH, UK
author_affiliation Giovanni Iacca: Department of Information Engineering and Computer Science, University of Trento, 38123 Trento, Italy
collection DOAJ
language English
format Article
sources DOAJ
author Fabio Caraffini
Giovanni Iacca
title The SOS Platform: Designing, Tuning and Statistically Benchmarking Optimisation Algorithms
publisher MDPI AG
series Mathematics
issn 2227-7390
publishDate 2020-05-01
description We present Stochastic Optimisation Software (SOS), a Java platform facilitating the algorithmic design process and the evaluation of metaheuristic optimisation algorithms. SOS reduces the burden of coding miscellaneous methods for dealing with several bothersome and time-consuming tasks such as parameter tuning, implementation of comparison algorithms and testbed problems, collecting and processing data to display results, measuring algorithmic overhead, etc. SOS provides numerous off-the-shelf methods including: (1) customised implementations of statistical tests, such as the Wilcoxon rank-sum test and the Holm–Bonferroni procedure, for comparing the performance of optimisation algorithms and automatically generating result tables in PDF and LaTeX formats; (2) the implementation of an original advanced statistical routine for accurately comparing pairs of stochastic optimisation algorithms; (3) the implementation of a novel testbed suite for continuous optimisation, derived from the IEEE CEC 2014 benchmark, allowing for the controlled activation of rotation on each testbed function. Moreover, we briefly comment on the current state of the literature in stochastic optimisation and highlight similarities shared by modern metaheuristics inspired by nature. We argue that the vast majority of these algorithms are simply a reformulation of the same methods and that metaheuristics for optimisation should simply be treated as stochastic processes, with less emphasis on the inspiring metaphor behind them.
topic algorithmic design
metaheuristic optimisation
evolutionary computation
swarm intelligence
memetic computing
parameter tuning
url https://www.mdpi.com/2227-7390/8/5/785