RHOASo: An Early Stop Hyper-Parameter Optimization Algorithm
Main Authors: Ángel Luis Muñoz Castañeda (Department of Mathematics, Universidad de León, 24007 León, Spain); Noemí DeCastro-García (Department of Mathematics, Universidad de León, 24007 León, Spain); David Escudero García (Research Institute of Applied Sciences in Cybersecurity (RIASC), Universidad de León, 24007 León, Spain)
Format: Article
Language: English
Published: MDPI AG, 2021-09-01
Series: Mathematics
ISSN: 2227-7390
DOI: 10.3390/math9182334
Subjects: hyperparameters; machine learning; optimization; inference
Online Access: https://www.mdpi.com/2227-7390/9/18/2334
Description:
This work proposes RHOASo, a new algorithm for optimizing the hyper-parameters of a machine learning algorithm, based on conditional optimization of concave asymptotic functions. A comparative analysis of the algorithm is presented, with particular emphasis on two important properties: its ability to work efficiently with a small part of a dataset, and its ability to finish the tuning process automatically, that is, without the user having to specify the number of iterations the algorithm must perform. Statistical analyses comparing the performance of RHOASo with seven other hyper-parameter optimization algorithms were carried out over 16 public benchmark datasets. The efficiency of RHOASo shows positive, statistically significant differences with respect to the other hyper-parameter optimization algorithms considered in the experiments. Furthermore, it is shown that, on average, the algorithm needs around 70% of the iterations required by other algorithms to achieve competitive performance. The results also show that the algorithm's performance is notably stable with respect to the size of the dataset partition used.
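The record contains no code, and the paper defines RHOASo through conditional optimization of concave asymptotic functions, which is not reproduced here. Purely as an illustrative sketch of the two properties the abstract highlights, tuning on a small dataset partition and stopping automatically without a user-fixed iteration budget, the following Python snippet shows a generic early-stopping hyper-parameter search. It is not the authors' algorithm; the helper `early_stop_search` and the `patience` rule are assumptions introduced for illustration.

```python
# Illustrative sketch only: a generic early-stopping hyper-parameter search.
# This is NOT the RHOASo algorithm; it merely demonstrates the early-stop idea
# (no explicit iteration budget) and tuning on a small dataset partition.
import itertools

from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score, train_test_split


def early_stop_search(estimator_cls, grid, X, y, patience=3):
    """Evaluate configurations until `patience` rounds pass with no improvement."""
    best_score, best_params, since_improvement = -float("inf"), None, 0
    for values in itertools.product(*grid.values()):
        config = dict(zip(grid.keys(), values))
        score = cross_val_score(estimator_cls(**config), X, y, cv=3).mean()
        if score > best_score:
            best_score, best_params, since_improvement = score, config, 0
        else:
            since_improvement += 1
        if since_improvement >= patience:  # stop automatically, no fixed budget
            break
    return best_params, best_score


X, y = load_digits(return_X_y=True)
# Tune on a small (20%) partition of the data, echoing the abstract's claim
# that effective tuning does not require the full dataset.
X_small, _, y_small, _ = train_test_split(X, y, train_size=0.2, random_state=0)

grid = {"n_estimators": [10, 50, 100, 200], "max_depth": [2, 4, 8, 16]}
params, score = early_stop_search(RandomForestClassifier, grid, X_small, y_small)
print(f"Selected {params} with CV accuracy {score:.3f}")
```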