Efficient Asynchronous Semi-Stochastic Block Coordinate Descent Methods for Large-Scale SVD
Eigenvector computation, such as Singular Value Decomposition (SVD), is one of the most fundamental problems in machine learning, optimization, and numerical linear algebra. In recent years, many stochastic variance reduction algorithms and randomized coordinate descent algorithms have been developed to efficiently solve the leading eigenvalue problem. By taking full advantage of both variance reduction and randomized coordinate descent, this paper proposes a novel Semi-stochastic Block Coordinate Descent algorithm (SBCD-SVD), which is better suited than existing algorithms to large-scale leading eigenvalue problems in SVD and attains linear convergence. Unlike existing methods that rely on only one of these two techniques, our algorithm inherits the advantages of both. Moreover, we propose a new Asynchronous parallel Semi-stochastic Block Coordinate Descent algorithm (ASBCD-SVD) and a new Asynchronous parallel Sparse approximated Variance Reduction algorithm (ASVR-SVD) for large-scale dense and sparse datasets, respectively, and we prove that both asynchronous parallel variants converge linearly. Extensive experimental results show that our algorithms attain high parallel speedup and achieve nearly the same accuracy in significantly less time, so they can be widely used in practical applications.
Main Authors: | Fanhua Shang, Zhihui Zhang, Yuanyuan Liu, Hongying Liu, Jing Xu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2021-01-01 |
Series: | IEEE Access |
Subjects: | Singular value decomposition; semi-stochastic gradient; randomized coordinate descent; asynchronous parallelism; image compression |
Online Access: | https://ieeexplore.ieee.org/document/9471835/ |
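The abstract describes a semi-stochastic (variance-reduced) block coordinate scheme for the leading singular vector, i.e., the top eigenvector of A^T A. The sketch below is a minimal illustration of that general idea, assuming a generic SVRG-style block coordinate power update; it is not the authors' SBCD-SVD algorithm, and the function name `vr_block_power`, its parameters, and all defaults are assumptions made for illustration.

```python
import numpy as np


def vr_block_power(A, n_epochs=20, inner_steps=None, block_size=None, eta=0.05, seed=0):
    """Estimate the leading right singular vector of A with a variance-reduced
    block coordinate power-type update (illustrative sketch, not SBCD-SVD)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner_steps = inner_steps or n
    block_size = block_size or max(1, d // 10)

    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)

    for _ in range(n_epochs):
        w_snap = w.copy()                    # snapshot for the variance-reduced correction
        full_grad = A.T @ (A @ w_snap) / n   # full batch term: (A^T A / n) w_snap

        for _ in range(inner_steps):
            i = rng.integers(n)                                 # random sample (semi-stochastic part)
            J = rng.choice(d, size=block_size, replace=False)   # random coordinate block
            a_i = A[i]

            # SVRG-style direction restricted to block J:
            # a_i (a_i^T w - a_i^T w_snap) + (A^T A / n) w_snap, on coordinates J
            corr = a_i @ w - a_i @ w_snap
            w[J] += eta * (a_i[J] * corr + full_grad[J])
            w /= np.linalg.norm(w)           # keep the iterate on the unit sphere

    return w


if __name__ == "__main__":
    A = np.random.default_rng(1).standard_normal((500, 50))
    v = vr_block_power(A)
    print("estimated top singular value:", np.linalg.norm(A @ v))
    print("numpy reference:             ", np.linalg.svd(A, compute_uv=False)[0])
```

The asynchronous variants named in the abstract (ASBCD-SVD and ASVR-SVD) apply block updates of this kind from several workers on a shared iterate without locking; the sketch above covers only the sequential case.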
id |
doaj-3c9dfa5897aa40cbb29340fea5e13efd |
spelling
IEEE Access, vol. 9, pp. 126159-126171, published 2021-01-01. DOI: 10.1109/ACCESS.2021.3094282; IEEE article no. 9471835.
Fanhua Shang (ORCID: 0000-0002-1040-352X), Zhihui Zhang (ORCID: 0000-0003-1394-6586), Yuanyuan Liu, and Hongying Liu (ORCID: 0000-0002-8475-2749): Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi’an, China. Jing Xu (ORCID: 0000-0001-8532-2241): College of Artificial Intelligence, Nankai University, Tianjin, China.
collection |
DOAJ |
issn |
2169-3536 |