NEO: NEuro-Inspired Optimization—A Fractional Time Series Approach

Solving optimization problems is a recurrent theme across different fields, including large-scale machine learning systems and deep learning. Often in practical applications, we encounter objective functions whose Hessian is ill-conditioned, which precludes us from using optimization algorithms that rely on second-order information. In this paper, we propose using fractional time series analysis methods, which have been used successfully to model neurophysiological processes, to circumvent this issue. In particular, the long memory property of fractional time series, whose trajectories exhibit non-exponential, power-law decay, appears to capture behavior associated with the local curvature of the objective function at a given point. Specifically, we propose a NEuro-inspired Optimization (NEO) method that leverages this behavior, in contrast with the short memory characteristics of currently used methods (e.g., gradient descent and heavy-ball). We provide evidence of the efficacy of the proposed method on a wide variety of settings implicitly found in practice.
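The abstract contrasts the power-law (long-memory) weighting of past information in fractional time series with the exponentially decaying (short-memory) weighting implicit in methods such as heavy-ball. The sketch below illustrates only that contrast on a toy ill-conditioned quadratic: the objective, step size, and weight choices are illustrative assumptions, and the update shown is not the NEO rule defined in the article.

import numpy as np

def grad(x):
    # Illustrative ill-conditioned quadratic: f(x) = 0.5 * (x1^2 + 100 * x2^2).
    return np.array([1.0, 100.0]) * x

def heavy_ball_weights(t, beta=0.9):
    # Short memory: geometric (exponential) decay of past-gradient weights,
    # as in the unrolled heavy-ball recursion.
    return beta ** np.arange(t)

def power_law_weights(t, alpha=0.6):
    # Long memory: power-law decay of past-gradient weights, the kind of
    # non-exponential decay exhibited by fractional (long-memory) time series.
    return (np.arange(t) + 1.0) ** (-(1.0 + alpha))

def run(weight_fn, x0=np.array([1.0, 1.0]), eta=2e-3, iters=400):
    x, history = x0.copy(), []
    for _ in range(iters):
        history.append(grad(x))
        w = weight_fn(len(history))
        # Weighted sum over the gradient history, most recent gradient first.
        step = sum(wk * gk for wk, gk in zip(w, reversed(history)))
        x = x - eta * step
    return x

print("exponential (short) memory:", run(heavy_ball_weights))
print("power-law (long) memory:", run(power_law_weights))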

Bibliographic Details
Main Authors: Sarthak Chatterjee (Department of Electrical, Computer, and Systems Engineering, Rensselaer Polytechnic Institute, Troy, NY, United States), Subhro Das (MIT-IBM Watson AI Lab, IBM Research, Cambridge, MA, United States), Sérgio Pequito (Delft Center for Systems and Control, Delft University of Technology, Delft, Netherlands)
Format: Article
Language: English
Published: Frontiers Media S.A., 2021-09-01
Series: Frontiers in Physiology (ISSN 1664-042X)
DOI: 10.3389/fphys.2021.724044
Subjects: optimization; time series processes; iterative optimization algorithms; long memory time series; fractional calculus
Online Access: https://www.frontiersin.org/articles/10.3389/fphys.2021.724044/full