Adaptive learning in lasso models

Regression with L1 regularization (the Lasso) is a popular algorithm for recovering the sparsity pattern (also known as model selection) of linear models from observations contaminated by noise. We examine a scenario where a fraction of the zero covariates are highly correlated with non-zero covariates, making sparsity recovery difficult. We propose two methods that adaptively increment the regularization parameter to prune the Lasso solution set. We prove that the algorithms achieve consistent model selection with high probability while using fewer samples than traditional Lasso. The algorithms extend to a broad class of L1-regularized M-estimators for linear statistical models.
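The abstract's core idea, pruning the Lasso support by gradually raising the regularization parameter, can be sketched as follows. This is an illustrative reconstruction, not the thesis's actual algorithm: the coordinate-descent solver is the standard textbook one, and the geometric increment rule and stopping criterion (`growth`, support stabilization) are hypothetical choices made for the example.

```python
import numpy as np

def soft_threshold(x, t):
    """Soft-thresholding operator, the proximal map of the L1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso: min_b (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

def adaptive_prune(X, y, lam0=0.01, growth=1.5, max_rounds=10):
    """Increase lam geometrically, re-fit, and stop once the estimated
    support stops changing (a hypothetical stabilization rule)."""
    lam, support, b = lam0, None, None
    for _ in range(max_rounds):
        b = lasso_cd(X, y, lam)
        new_support = set(np.flatnonzero(np.abs(b) > 1e-6))
        if support is not None and new_support == support:
            break
        support, lam = new_support, lam * growth
    return b, support
```

On well-conditioned data (independent Gaussian covariates, two true non-zeros), the pruning loop typically settles on the correct support within a few rounds; the regime the abstract targets, where zero covariates are highly correlated with active ones, is precisely where a single fixed regularization level fails and such an adaptive schedule is motivated.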

Bibliographic Details
Main Author: Patnaik, Kaushik
Other Authors: Song, Le
Format: Thesis (application/pdf)
Language: en_US
Published: Georgia Institute of Technology, 2016
Subjects: Lasso; L1 regression; Adaptive methods; Active learning
Online Access:http://hdl.handle.net/1853/54353