Summary: This thesis is concerned with nonparametric regression and regularization. In particular, we investigate wavelet regression using a Lévy prior model. The use of this prior is motivated by statistical properties, such as heavy tails, that are common in many datasets of interest, for example financial time series. The Lévy process we propose captures the heavy tails of the wavelet coefficients of an unknown function. We study the Besov regularity of the wavelet coefficients and establish the connection between the parameters of the Lévy wavelet prior model and Besov spaces. First, we give a necessary and sufficient condition for the realizations of the prior model to fall into a certain class of Besov spaces. We show that the tempered stable distribution preserves its functional form across time scales, and we prove that this scaling behaviour can model the exponential decay of the wavelet coefficients across scales without imposing any prescribed structure on the coefficients' energy. We also introduce a Lévy wavelet mixture model to capture the sparseness of the wavelet coefficients and show that this sparse model leads to a thresholding rule. We then study the Lévy tempered stable prior model within a Bayesian framework. For the specified prior, we give a closed-form expression for the posterior Lévy measure of the wavelet coefficients and estimate the hyperparameters of the prior model both in a simulation study and on the S&P 500 time series. Finally, we turn to density estimation using a penalized likelihood approach. In particular, we study the wavelet Tsallis entropy and Fisher information and give closed-form expressions for these measures when the wavelet coefficients are driven by a tempered stable process. We then develop an entropic regularization based on the wavelet Tsallis entropy and show that the penalized maximum likelihood method improves the convergence of the estimates.