Stochastic Block-Coordinate Gradient Projection Algorithms for Submodular Maximization

Bibliographic Details
Main Authors: Zhigang Li, Mingchuan Zhang, Junlong Zhu, Ruijuan Zheng, Qikun Zhang, Qingtao Wu
Format: Article
Language: English
Published: Hindawi-Wiley, 2018-01-01
Series: Complexity
ISSN: 1076-2787, 1099-0526
Online Access: http://dx.doi.org/10.1155/2018/2609471

Author Affiliations
Zhigang Li: School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou 450002, China
Mingchuan Zhang: Information Engineering College, Henan University of Science and Technology, Luoyang 471023, China
Junlong Zhu: Information Engineering College, Henan University of Science and Technology, Luoyang 471023, China
Ruijuan Zheng: Information Engineering College, Henan University of Science and Technology, Luoyang 471023, China
Qikun Zhang: School of Computer and Communication Engineering, Zhengzhou University of Light Industry, Zhengzhou 450002, China
Qingtao Wu: Information Engineering College, Henan University of Science and Technology, Luoyang 471023, China
Description
We consider a huge-scale stochastic continuous submodular optimization problem, which arises naturally in many applications such as machine learning. Because the data are high-dimensional, computing the full gradient vector can become prohibitively expensive. To reduce the complexity and memory requirements, we propose a stochastic block-coordinate gradient projection algorithm for maximizing continuous submodular functions, which chooses a random block of coordinates of the gradient vector and updates the estimates along the positive gradient direction. We prove that the estimates generated by the algorithm converge to stationary points with probability 1. Moreover, we show that the algorithm achieves the tight ((p_min/2)·F* − ϵ) approximation guarantee after O(1/ϵ²) iterations for DR-submodular functions when appropriate step sizes are chosen. Furthermore, we show that it achieves the tight ((γ²/(1 + γ²))·p_min·F* − ϵ) approximation guarantee after O(1/ϵ²) iterations for weakly DR-submodular functions with parameter γ when diminishing step sizes are chosen.
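
The abstract describes the update in enough detail to sketch it: at each iteration, sample a random block of coordinates, compute a stochastic gradient, move along the positive gradient direction on that block only, and project back onto the constraint set. The following minimal Python sketch is an illustration under assumed ingredients (a box constraint, a synthetic DR-submodular quadratic objective, and a 1/√t step-size schedule); none of it is the authors' reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10                                 # problem dimension
H = np.abs(rng.normal(size=(d, d)))    # entrywise nonnegative matrix
H = (H + H.T) / 2
a = H @ np.ones(d)

def stochastic_grad(x, noise=0.1):
    """Noisy gradient of F(x) = a@x - 0.5*x@H@x, which is DR-submodular
    because every mixed second derivative (-H[i, j]) is nonpositive."""
    return a - H @ x + noise * rng.normal(size=x.shape)

def project_box(x, lo=0.0, hi=1.0):
    """Euclidean projection onto the box [lo, hi]^d (assumed feasible set)."""
    return np.clip(x, lo, hi)

def block_coordinate_ascent(T=2000, block_size=2):
    x = np.zeros(d)
    for t in range(1, T + 1):
        block = rng.choice(d, size=block_size, replace=False)  # random coordinate block
        g = stochastic_grad(x)
        step = 1.0 / np.sqrt(t)            # diminishing step size, as in the abstract
        x_next = x.copy()
        x_next[block] += step * g[block]   # ascend along the positive gradient on the block
        x = project_box(x_next)            # project back onto the constraint set
    return x

x_hat = block_coordinate_ascent()
print("F(x_hat) =", a @ x_hat - 0.5 * x_hat @ H @ x_hat)
```

In the guarantees above, p_min would correspond to the smallest probability with which any coordinate block can be selected; with the uniform sampling in this sketch, every coordinate is drawn with equal probability.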