On Accuracy of PDF Divergence Estimators and Their Applicability to Representative Data Sampling
Generalisation error estimation is an important issue in machine learning. Cross-validation, traditionally used for this purpose, requires building multiple models and repeating the whole procedure many times in order to produce reliable error estimates. It is, however, possible to accurately estimate t...
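The abstract refers to the repeated model building that cross-validation needs before its error estimates stabilise. Below is a minimal sketch of that procedure, assuming scikit-learn with a synthetic dataset and a decision tree as stand-ins; these choices are illustrative assumptions, not taken from the paper.

```python
# Sketch of repeated k-fold cross-validation for generalisation error
# estimation. Dataset and estimator are illustrative assumptions only.
from sklearn.datasets import make_classification
from sklearn.model_selection import RepeatedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic data standing in for the sample being evaluated.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Each repeat refits the model on every fold: 10 splits x 20 repeats
# means 200 models are trained just to estimate the error.
cv = RepeatedKFold(n_splits=10, n_repeats=20, random_state=0)
scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=cv)

error_estimate = 1.0 - scores.mean()  # accuracy -> error rate
print(f"Estimated generalisation error: {error_estimate:.3f} "
      f"(+/- {scores.std():.3f}) from {scores.size} model fits")
```

The cost of those repeated fits is what motivates single-model alternatives such as the PDF divergence estimators studied in the article.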
| Main Authors: | Katarzyna Musial, Bogdan Gabrys, Marcin Budka |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2011-07-01 |
| Series: | Entropy |
| Online Access: | http://www.mdpi.com/1099-4300/13/7/1229/ |
Similar Items
- Principles of Bayesian Inference Using General Divergence Criteria
  by: Jack Jewson, et al.
  Published: (2018-06-01)
- Symmetric-Approximation Energy-Based Estimation of Distribution (SEED): A Continuous Optimization Algorithm
  by: Juan De Anda-Suarez, et al.
  Published: (2019-01-01)
- Quantifying Model Error in Bayesian Parameter Estimation
  by: White, Staci A.
  Published: (2015)
- The Use Of Kullback-Leibler Divergence In Opinion Retrieval
  by: Cen, Kun
  Published: (2008)