Conditional gradient methods via stochastic path-integrated differential estimator
Format: Article
Language: English
Published: International Machine Learning Society, 2021-04-27T17:11:23Z
Online Access: Get fulltext
Summary: We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016), and propose SPIDER-CGS. National Science Foundation (U.S.) (Grant 200021178865/1); National Science Foundation (U.S.), CAREER (Grant 1846088)
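For intuition, here is a minimal sketch of the combination the summary describes: a SPIDER-style variance-reduced gradient estimator driving a Frank-Wolfe step. The least-squares objective, simplex constraint, epoch length, batch size, and step-size rule below are illustrative assumptions, not the paper's prescribed algorithm or parameter choices; see the article for the exact method and its complexity guarantees.

```python
# Sketch: SPIDER-style variance reduction inside a Frank-Wolfe loop, on a
# hypothetical finite-sum least-squares problem over the probability simplex.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 50
A = rng.standard_normal((n, d))                  # hypothetical data matrix
b = A @ rng.dirichlet(np.ones(d)) + 0.01 * rng.standard_normal(n)

def grad_batch(x, idx):
    """Average gradient of f_i(x) = 0.5 * (a_i^T x - b_i)^2 over indices idx."""
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx)

def lmo_simplex(g):
    """Linear minimization oracle over the simplex: the vertex e_j, j = argmin_j g_j."""
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x = np.ones(d) / d                               # feasible starting point
q, batch, T = 50, 64, 500                        # illustrative epoch length / batch size
for t in range(T):
    if t % q == 0:
        v = grad_batch(x, np.arange(n))          # full gradient at the start of each epoch
    else:
        idx = rng.choice(n, batch, replace=False)
        # SPIDER recursion: correct the running estimator with the gradient
        # difference between the current and previous iterates on a small batch.
        v = v + grad_batch(x, idx) - grad_batch(x_prev, idx)
    gamma = 2.0 / (t + 2)                        # classical FW step size (one standard choice)
    x_prev = x
    x = x + gamma * (lmo_simplex(v) - x)         # Frank-Wolfe step along the estimator v
```

The key point the sketch illustrates is that the FW linear minimization oracle is called on the variance-reduced estimator v rather than a raw stochastic gradient, so only occasional full (or large-batch) gradient evaluations are needed at epoch boundaries.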