A universally optimal multistage accelerated stochastic gradient method

© 2019 Neural Information Processing Systems Foundation. All rights reserved.

We study the problem of minimizing a strongly convex, smooth function when we have noisy estimates of its gradient. We propose a novel multistage accelerated algorithm that is universally optimal in the sense that it achieves the optimal rate both in the deterministic and stochastic case and operates without knowledge of the noise characteristics.
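To make the abstract's idea concrete, below is a minimal sketch of a multistage accelerated stochastic gradient scheme: each stage runs Nesterov-style accelerated iterations with a fixed step size, then restarts from the last iterate with a smaller step size and a longer stage so the noise-induced variance term shrinks geometrically. The stage schedule (step sizes shrinking by 4x, stage lengths doubling) and all function names here are illustrative assumptions, not the paper's exact parameter choices.

```python
import numpy as np

def noisy_grad(grad, sigma, rng):
    """Build a stochastic first-order oracle: true gradient plus Gaussian noise."""
    def oracle(x):
        return grad(x) + sigma * rng.standard_normal(x.shape)
    return oracle

def multistage_asg(oracle, x0, mu, L, n_stages=5, stage0_iters=50):
    """Illustrative multistage accelerated stochastic gradient method.

    Assumed schedule (not the paper's exact one): at stage k the step size
    is 1/(L * 4^k) and the stage length is stage0_iters * 2^k, with momentum
    restarted between stages.
    """
    x = x0.copy()
    y = x0.copy()
    for k in range(n_stages):
        alpha = 1.0 / (L * 4**k)                    # stage-wise shrinking step size
        q = mu * alpha                              # effective inverse condition number
        beta = (1 - np.sqrt(q)) / (1 + np.sqrt(q))  # Nesterov momentum for constant step
        for _ in range(stage0_iters * 2**k):        # doubling stage lengths
            x_next = y - alpha * oracle(y)          # gradient step at extrapolated point
            y = x_next + beta * (x_next - x)        # momentum extrapolation
            x = x_next
        y = x.copy()                                # restart momentum between stages
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 * x^T A x with noisy gradients.
rng = np.random.default_rng(0)
A = np.diag(np.linspace(1.0, 10.0, 20))             # mu = 1, L = 10
oracle = noisy_grad(lambda x: A @ x, sigma=0.1, rng=rng)
x_final = multistage_asg(oracle, x0=np.ones(20), mu=1.0, L=10.0)
print(f"final distance to optimum: {np.linalg.norm(x_final):.4f}")
```

The restart-with-smaller-steps structure is what lets such a method behave like deterministic accelerated gradient early on (fast bias decay) while still averaging out gradient noise in later stages.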


Bibliographic Details
Main Authors: Aybat, NS (Author), Fallah, A (Author), Gürbüzbalaban, M (Author), Ozdaglar, A (Author)
Format: Article
Language:English
Published: 2019
Online Access: Get fulltext