A Comparison of Variance and Renyi's Entropy with Application to Machine Learning
Main Author:
Language: EN
Published: Northern Illinois University, 2017
Subjects:
Online Access: http://pqdtopen.proquest.com/#viewpdf?dispub=10603911
Summary: This research explores parametric and nonparametric similarities and differences between variance and the information-theoretic measure of entropy, specifically Renyi's entropy. The history and known relationships of the two uncertainty measures are examined. Then, twenty discrete and continuous parametric families are tabulated with their respective variance and Renyi entropy functions in order to understand the behavior of these two measures of uncertainty. Finally, an algorithm for variable selection using Renyi's Quadratic Entropy and its kernel estimation is explored and compared to other popular selection methods on real data.
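The abstract's selection algorithm rests on the Parzen-window (kernel) estimate of Renyi's quadratic entropy, H2(X) = -log ∫ p(x)² dx, which has a closed form for a Gaussian kernel: the "information potential" (1/N²) Σᵢ Σⱼ G(xᵢ - xⱼ; 2σ²). The sketch below illustrates that standard estimator only; the function name and the kernel width σ are illustrative choices, not the dissertation's code.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Parzen-window estimate of Renyi's quadratic entropy H2(X).

    With a Gaussian kernel of width sigma, the integral of the squared
    density estimate reduces to the mean pairwise Gaussian kernel with
    variance 2*sigma**2 (the convolution of two kernels), so no numerical
    integration is needed. sigma is an illustrative default, not tuned.
    """
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    n = x.shape[0]
    diffs = x - x.T                       # all pairwise differences x_i - x_j
    s2 = 2.0 * sigma ** 2                 # variance of the convolved kernel
    kernel = np.exp(-diffs ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = kernel.sum() / n ** 2
    return -np.log(information_potential)
```

Consistent with the variance comparison in the abstract, a more spread-out sample yields a larger entropy estimate: with the same σ, `renyi_quadratic_entropy` returns a higher value for data drawn with standard deviation 3 than with standard deviation 1.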