The Oracle Inequalities on Simultaneous Lasso and Dantzig Selector in High-Dimensional Nonparametric Regression

During the last few years, a great deal of attention has been focused on the Lasso and the Dantzig selector in high-dimensional linear regression, where the number of variables can be much larger than the sample size. Under a sparsity scenario, several authors (see, e.g., Bickel et al., 2009; Bunea et al., 2007; Candès and Tao, 2007; Donoho et al., 2006; Koltchinskii, 2009; Meinshausen and Yu, 2009; Rosenbaum and Tsybakov, 2010; Tsybakov, 2006; van de Geer, 2008; Zhang and Huang, 2008) have discussed the relations between the Lasso and the Dantzig selector and derived sparsity oracle inequalities for the prediction risk together with bounds on the estimation loss. In this paper, we point out that some of these works overemphasize the role of certain sparsity conditions, and that assumptions based on such a sparsity condition may lead to weaker results. We give better assumptions, together with methods that avoid using the sparsity condition. Compared with the results of Bickel et al. (2009), more precise oracle inequalities for the prediction risk and bounds on the estimation loss are derived when the number of variables can be much larger than the sample size.
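For readers unfamiliar with the two estimators the abstract compares, the following is a minimal illustrative sketch, not the authors' method. The Lasso fit uses scikit-learn, and the Dantzig selector (minimize ‖β‖₁ subject to ‖Xᵀ(y − Xβ)‖∞ ≤ λ) is cast as a linear program solved with SciPy; the toy data, the penalty levels, and all variable names are assumptions chosen for the example.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.linear_model import Lasso

# Toy sparse high-dimensional setup with n < p (illustrative values only).
rng = np.random.default_rng(0)
n, p = 50, 100
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]           # 3-sparse true signal
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Lasso: minimize (1/(2n))||y - Xb||^2 + alpha * ||b||_1
lasso = Lasso(alpha=0.1).fit(X, y)

# Dantzig selector: minimize ||b||_1 s.t. ||X^T (y - X b)||_inf <= lam.
# Cast as an LP with b = u - v, u >= 0, v >= 0, so ||b||_1 = sum(u) + sum(v).
lam = 0.1 * n                              # constraint level (arbitrary choice)
G = X.T @ X
c = np.ones(2 * p)                         # LP objective: sum(u) + sum(v)
A_ub = np.vstack([np.hstack([G, -G]),      #  X^T X b <= X^T y + lam
                  np.hstack([-G, G])])     # -X^T X b <= -X^T y + lam
b_ub = np.concatenate([X.T @ y + lam, -(X.T @ y) + lam])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
beta_ds = res.x[:p] - res.x[p:]            # recover b from the split variables
```

Both estimators return sparse coefficient vectors of length p; the oracle inequalities discussed in the article bound how far such estimates can be from the truth in prediction risk and estimation loss.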


Bibliographic Details
Main Authors: Shiqing Wang, Limin Su (College of Mathematics and Information Sciences, North China University of Water Resources and Electric Power, Zhengzhou 450011, China)
Format: Article
Language: English
Published: Hindawi Limited, 2013-01-01
Series: Mathematical Problems in Engineering
ISSN: 1024-123X, 1563-5147
Online Access: http://dx.doi.org/10.1155/2013/571361