A universally optimal multistage accelerated stochastic gradient method
© 2019 Neural Information Processing Systems Foundation. All rights reserved.

We study the problem of minimizing a strongly convex, smooth function when we have noisy estimates of its gradient. We propose a novel multistage accelerated algorithm that is universally optimal in the sense that it achieves the optimal convergence rate in both the deterministic and the stochastic setting.
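The abstract describes a multistage accelerated scheme: run a momentum (Nesterov-type) method in stages, shrinking the step size and lengthening the stages so that gradient noise is progressively averaged out. Below is a minimal Python sketch of that general idea on a toy quadratic. The schedule (step size divided by 4 and stage length doubled between stages), the restart rule, and the names `noisy_grad` and `multistage_asg` are illustrative assumptions for this sketch, not the authors' exact M-ASG parameters.

```python
import numpy as np

def noisy_grad(x, A, b, rng, sigma):
    """Noisy gradient of the quadratic f(x) = 0.5 * x @ A @ x - b @ x."""
    return A @ x - b + sigma * rng.standard_normal(x.shape)

def multistage_asg(x0, A, b, L, mu, n_stages=4, stage_len=50, sigma=0.1, seed=0):
    """Sketch of a multistage accelerated stochastic gradient method.

    Each stage runs Nesterov's accelerated gradient with a fixed step size;
    between stages the step size shrinks and the stage length grows so the
    gradient noise is averaged out (illustrative schedule).
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    y = x0.copy()
    alpha, n = 1.0 / L, stage_len
    for _ in range(n_stages):
        # Momentum parameter for strongly convex Nesterov with step size alpha.
        beta = (1 - np.sqrt(mu * alpha)) / (1 + np.sqrt(mu * alpha))
        for _ in range(n):
            g = noisy_grad(y, A, b, rng, sigma)
            x_new = y - alpha * g            # gradient step at the extrapolated point
            y = x_new + beta * (x_new - x)   # momentum extrapolation
            x = x_new
        alpha /= 4.0                         # smaller steps to suppress noise
        n *= 2                               # longer stages compensate for smaller steps
        y = x.copy()                         # restart momentum at each stage boundary
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = np.diag(np.linspace(1.0, 10.0, 20))  # strongly convex: mu = 1, L = 10
    b = rng.standard_normal(20)
    x_star = np.linalg.solve(A, b)
    x_hat = multistage_asg(np.zeros(20), A, b, L=10.0, mu=1.0)
    print("distance to optimum:", np.linalg.norm(x_hat - x_star))
```

The momentum restart at each stage boundary is what distinguishes a multistage scheme from a single run with a decaying step size: each stage behaves like a fresh deterministic accelerated run until its noise floor is reached.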
Main Authors: Aybat, Necdet Serhat; Fallah, Alireza; Gürbüzbalaban, Mert; Ozdaglar, Asuman
Format: Article
Language: English
Published: 2021-11-04T16:40:38Z