Estimating Linear Regression Using Integrated Likelihood Function.


Bibliographic Details
Main Authors: Zeng Yi Siou, 曾怡琇
Other Authors: Huang Yu Min
Format: Others
Language: zh-TW
Published: 2013
Online Access: http://ndltd.ncl.edu.tw/handle/73370160829649554047
Description
Summary: Master's thesis === Tunghai University (東海大學) === Department of Statistics === 101 === In linear regression modeling, the method of least squares is the standard way to find the optimal linear relation between a dependent variable and multiple independent variables (covariates), provided that the covariates are assumed to be given or deterministic in the model. In practice, the covariates may be collected from real data sources and by nature follow some distributions. The ordinary least squares estimates can be less efficient when the covariates are stochastic. In this study, we propose a new method to estimate the regression parameters. We estimate the parameters by maximizing the integrated likelihood function, that is, the joint marginal distribution of the dependent variable. We approximate the integrated likelihood function using selected Monte Carlo samples of the covariates, so that only the important probability weights are accumulated in the likelihood function. The maximum likelihood estimates are obtained by applying Newton-Raphson iterations to the approximated likelihood function. Simulation examples are given and the results are compared with the least squares estimates.
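
Note: The abstract describes approximating the integrated likelihood, i.e. the marginal density of the dependent variable with the stochastic covariates integrated out via Monte Carlo draws, and then maximizing that approximation. The following is a minimal, hypothetical Python sketch of that general idea, not the author's implementation: it assumes a single Bernoulli covariate with a known distribution, replaces the thesis's Newton-Raphson iterations with scipy's BFGS optimizer, and averages over all Monte Carlo draws rather than selecting only the important probability weights as the thesis does. All variable names and numerical settings are illustrative.

# Minimal sketch (not the author's code): maximize a Monte Carlo approximation of
# the integrated likelihood L(theta) = E_X[ f(y | X, theta) ], where the covariate
# distribution is assumed known and the covariate is integrated out by averaging
# the conditional density of y over Monte Carlo draws of the covariate.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# --- simulate data with a stochastic (here Bernoulli) covariate; settings are illustrative ---
n = 300
p = 0.3                                   # assumed-known covariate distribution Bernoulli(p)
b0_true, b1_true, sigma_true = 1.0, 2.0, 0.5
x_obs = rng.binomial(1, p, size=n)
y = b0_true + b1_true * x_obs + rng.normal(0.0, sigma_true, size=n)

# --- Monte Carlo draws of the covariate, used to integrate it out of the likelihood ---
M = 5000
x_mc = rng.binomial(1, p, size=M)

def neg_log_integrated_likelihood(theta):
    """Negative log of the Monte Carlo approximation of the integrated likelihood.

    For each y_i, the conditional normal density f(y_i | x, b0, b1, sigma) is
    averaged over the Monte Carlo covariate draws x_mc (no weight selection here,
    unlike the thesis), giving an approximation of the marginal density of y_i.
    """
    b0, b1, log_sigma = theta
    sigma = np.exp(log_sigma)             # log-parameterization keeps sigma > 0
    dens = norm.pdf(y[:, None], loc=b0 + b1 * x_mc[None, :], scale=sigma)  # shape (n, M)
    marginal = dens.mean(axis=1)
    return -np.sum(np.log(marginal + 1e-300))

# ordinary least squares on the observed covariates, used as a reference point and
# as a starting value for the numerical maximization
X = np.column_stack([np.ones(n), x_obs])
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
sigma_ols = (y - X @ beta_ols).std()

# BFGS is used here as a generic stand-in for the Newton-Raphson iterations
res = minimize(neg_log_integrated_likelihood,
               x0=np.array([beta_ols[0], beta_ols[1], np.log(sigma_ols)]),
               method="BFGS")
b0_hat, b1_hat, sigma_hat = res.x[0], res.x[1], np.exp(res.x[2])
print("integrated-likelihood estimates:", b0_hat, b1_hat, sigma_hat)
print("least squares estimates:        ", beta_ols[0], beta_ols[1], sigma_ols)

The least squares fit on the observed covariates is printed only as a reference point, mirroring the comparison mentioned in the abstract; the simulation designs actually studied in the thesis are not reproduced here.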