Error Bound of Mode-Based Additive Models
Due to their flexibility and interpretability, additive models are powerful tools for high-dimensional mean regression and variable selection. However, mean regression models based on the least-squares loss are sensitive to non-Gaussian noise, and there is also a need to improve the model's...
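The abstract contrasts mean regression under the least-squares loss with mode-based regression in terms of robustness to non-Gaussian noise. The minimal sketch below (not taken from the paper) illustrates that point by comparing the unbounded squared loss with a bounded, mode-induced Gaussian-kernel loss of the kind commonly used in modal regression; the bandwidth `sigma` and the toy residuals are illustrative assumptions.

```python
import numpy as np

def squared_loss(residuals):
    """Least-squares loss: each residual contributes quadratically,
    so a single large (non-Gaussian) error can dominate the average."""
    return np.mean(residuals ** 2)

def modal_loss(residuals, sigma=1.0):
    """Mode-induced Gaussian-kernel (Welsch-type) loss: bounded by 1 per
    sample, so gross outliers have limited influence on the fit."""
    return np.mean(1.0 - np.exp(-residuals ** 2 / (2.0 * sigma ** 2)))

# Toy residuals: mostly small Gaussian errors plus one gross outlier.
rng = np.random.default_rng(0)
residuals = np.append(rng.normal(scale=0.1, size=99), 50.0)

print("squared loss:", squared_loss(residuals))  # blown up by the outlier
print("modal loss:  ", modal_loss(residuals))    # outlier adds at most 1/100
```

With one gross outlier among 100 residuals, the squared loss is dominated by that single point, while the bounded modal loss changes by at most 1/100, which is the robustness property motivating mode-based additive models.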
Main Authors: Hao Deng, Jianghong Chen, Biqin Song, Zhibin Pan
Format: Article
Language: English
Published: MDPI AG, 2021-05-01
Series: Entropy
Subjects:
Online Access: https://www.mdpi.com/1099-4300/23/6/651
Similar Items
- Robust Variable Selection and Estimation Based on Kernel Modal Regression
  by: Changying Guo, et al.
  Published: (2019-04-01)
- Reproducing kernel functions and homogenizing transforms
  by: Nuray Yildirim Elif, et al.
  Published: (2021-01-01)
- Functional inverse regression and reproducing kernel Hilbert space
  by: Ren, Haobo
  Published: (2006)
- Reproducing kernel method for the solutions of non-linear partial differential equations
  by: Elif Nuray Yildirim, et al.
  Published: (2021-01-01)
- A method for approximate missing data from data error measured with l norm
  by: Benjawan Rodjanadid, et al.
  Published: (2021-06-01)