Block Coordinate Descent for Regularized Multi-convex Optimization
This thesis considers regularized block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables. I review several of its applications and propose a generalized block coordinate descent (BCD) method that offers three block-update schemes; based on the properties of each block subproblem, one can freely choose among them, and an appropriate choice can often accelerate the algorithm and substantially reduce computing time. Under certain conditions, any limit point satisfies the Nash equilibrium conditions; global convergence and an asymptotic convergence rate are established under a property based on the Kurdyka-Łojasiewicz inequality, which yields a global linear convergence result for cyclic BCD on strongly convex problems. The algorithms are adapted to nonnegative matrix and tensor factorization and completion, and were tested on synthetic data, hyperspectral data, and image sets from the CBCL, ORL, and Swimmer databases, showing superior speed and solution quality over existing state-of-the-art algorithms.
Main Author: | Xu, Yangyang |
---|---|
Other Authors: | Yin, Wotao |
Format: | Others |
Language: | English |
Published: | 2013 |
Subjects: | block multi-convex; block coordinate descent; Kurdyka-Łojasiewicz inequality; nonnegative matrix and tensor factorization; matrix completion; tensor completion |
Online Access: | http://hdl.handle.net/1911/72066 |
id |
ndltd-RICE-oai-scholarship.rice.edu-1911-72066 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-RICE-oai-scholarship.rice.edu-1911-72066 (last updated 2013-09-18T03:28:46Z)
Title: Block Coordinate Descent for Regularized Multi-convex Optimization
Author: Xu, Yangyang
Subjects: block multi-convex; block coordinate descent; Kurdyka-Łojasiewicz inequality; nonnegative matrix and tensor factorization; matrix completion; tensor completion
Contributor: Yin, Wotao
Dates: May 2013 (degree); 2013-09-16 (deposited)
Type/Format: thesis; text; application/pdf
Identifiers: http://hdl.handle.net/1911/72066; 123456789/ETD-2013-05-403
Language: eng |
collection |
NDLTD |
language |
English |
format |
Others |
sources |
NDLTD |
topic |
block multi-convex; block coordinate descent; Kurdyka-Łojasiewicz inequality; nonnegative matrix and tensor factorization; matrix completion; tensor completion |
spellingShingle |
block multi-convex; block coordinate descent; Kurdyka-Łojasiewicz inequality; nonnegative matrix and tensor factorization; matrix completion; tensor completion; Xu, Yangyang; Block Coordinate Descent for Regularized Multi-convex Optimization |
description |
This thesis considers regularized block multi-convex optimization, where the feasible set and objective function are generally non-convex but convex in each block of variables.
I review several of its applications and propose a generalized block coordinate descent (BCD) method. The generalized BCD offers three block-update schemes; based on the properties of each block subproblem, one can freely choose among them to update the corresponding block of variables. An appropriate choice of block-update scheme can often accelerate the algorithm and substantially reduce computing time.
Under certain conditions, I show that any limit point satisfies the Nash equilibrium conditions. Furthermore, I establish global convergence and estimate the asymptotic convergence rate by assuming a property based on the Kurdyka-Łojasiewicz inequality. As a consequence, this thesis gives a global linear convergence result for cyclic block coordinate descent applied to strongly convex optimization. The proposed algorithms are adapted to factorizing nonnegative matrices and tensors, and to completing them from incomplete observations. The algorithms were tested on synthetic data, hyperspectral data, and image sets from the CBCL, ORL, and Swimmer databases. Compared with existing state-of-the-art algorithms, the proposed algorithms demonstrate superior performance in both speed and solution quality. |
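The record does not include the thesis's algorithm itself. As a rough illustration of the idea it describes, the sketch below applies cyclic BCD to nonnegative matrix factorization, one of the applications the thesis targets, updating each block with a projected-gradient (prox-linear-style) step. The step sizes, the choice of this particular update scheme, and the function name `bcd_nmf` are assumptions for illustration, not the thesis's exact method.

```python
import numpy as np

def bcd_nmf(M, r, iters=200, seed=0):
    """Cyclic block coordinate descent for nonnegative matrix factorization,
    min_{X >= 0, Y >= 0} 0.5 * ||M - X @ Y||_F^2.
    The objective is non-convex jointly but convex in each block (X or Y),
    so each block subproblem is solved approximately by one projected
    gradient step with a 1/L step size (a generic prox-linear-style update).
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(iters):
        # Block 1: update X with Y fixed (a convex subproblem in X).
        G = (X @ Y - M) @ Y.T                # gradient of the objective w.r.t. X
        L = np.linalg.norm(Y @ Y.T, 2)       # Lipschitz constant: spectral norm of Y Y^T
        X = np.maximum(X - G / L, 0.0)       # gradient step, then project onto X >= 0
        # Block 2: update Y with X fixed (a convex subproblem in Y).
        G = X.T @ (X @ Y - M)                # gradient of the objective w.r.t. Y
        L = np.linalg.norm(X.T @ X, 2)       # Lipschitz constant: spectral norm of X^T X
        Y = np.maximum(Y - G / L, 0.0)       # gradient step, then project onto Y >= 0
    return X, Y
```

On data with an exact nonnegative low-rank factorization, the alternating updates drive the residual down; each block step decreases the objective because the step size matches the block Lipschitz constant.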
author2 |
Yin, Wotao |
author_facet |
Yin, Wotao Xu, Yangyang |
author |
Xu, Yangyang |
author_sort |
Xu, Yangyang |
title |
Block Coordinate Descent for Regularized Multi-convex Optimization |
title_short |
Block Coordinate Descent for Regularized Multi-convex Optimization |
title_full |
Block Coordinate Descent for Regularized Multi-convex Optimization |
title_fullStr |
Block Coordinate Descent for Regularized Multi-convex Optimization |
title_full_unstemmed |
Block Coordinate Descent for Regularized Multi-convex Optimization |
title_sort |
block coordinate descent for regularized multi-convex optimization |
publishDate |
2013 |
url |
http://hdl.handle.net/1911/72066 |
work_keys_str_mv |
AT xuyangyang blockcoordinatedescentforregularizedmulticonvexoptimization |
_version_ |
1716597533553197056 |