Summary: | Master's === National Dong Hwa University === Department of Applied Mathematics === 90 === In random design nonparametric regression,
the kernel method is the most popular estimator of the regression function. However, the kernel method has a drawback:
its efficiency is low when estimating within a
neighborhood of a jump point. A new edge-preserving smoother,
called the "edge-preserving M-smoother", was proposed by Chu, Glad, Godtliebsen, and Marron (1998). It is based on a robust
M-estimator and uses a local minimum property. In most cases the
edge-preserving M-smoother gives pleasing results.
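For orientation, the criterion behind such an M-smoother can be sketched roughly as follows; the Gaussian-type bounded loss shown here is one common choice used only for illustration, and the exact loss and notation in the thesis may differ:
\hat m(x) = \arg\min_{\theta} \sum_{i=1}^{n} K_h(x_i - x)\,\rho_g(Y_i - \theta), \qquad \rho_g(u) = 1 - \exp\!\left(-\frac{u^2}{2g^2}\right),
where K_h is a kernel with bandwidth h localizing in the design variable and g is the scale of the bounded loss. Because \rho_g is bounded, the criterion can have several local minima near a jump, which is the property that lets the smoother avoid blurring the edge.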
This thesis proposes two kinds of
edge-preserving M-smoother: the local maximum method and the
global maximum method, and compares their estimation efficiency at the jump points. Simulation studies demonstrate that the local maximum method has better estimation
efficiency than the global maximum method when the regression
function has many jumps. However, when the regression function has only a few jumps, the global maximum method can also reveal the positions of the jump points,
so in that case the global maximum method is also a good estimation method.
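As a rough illustration of the two selection rules, the following is a minimal sketch, not the thesis's actual implementation: it evaluates, at each point t, the kernel-weighted criterion C_t(theta) = sum_i K_h(x_i - t) exp(-(y_i - theta)^2 / (2 g^2)), whose maximization is equivalent to minimizing the bounded-loss criterion above, and then takes either its global maximizer or the local maximizer closest to the estimate at the neighbouring evaluation point. The function name and the "nearest previous estimate" rule are assumptions made for illustration.

import numpy as np

def edge_preserving_m_smoother(t_grid, x, y, h, g, method="global"):
    # method="global": take the global maximizer of C_t.
    # method="local":  take the local maximizer of C_t closest to the
    #                  estimate at the previous evaluation point
    #                  (an illustrative selection rule, not the thesis's).
    theta_grid = np.linspace(y.min(), y.max(), 401)
    est = np.empty(len(t_grid))
    prev = None
    for j, t in enumerate(t_grid):
        w = np.exp(-0.5 * ((x - t) / h) ** 2)            # Gaussian kernel weights K_h(x_i - t)
        resid = theta_grid[:, None] - y[None, :]         # residuals, shape (theta, data)
        crit = (w * np.exp(-0.5 * (resid / g) ** 2)).sum(axis=1)
        if method == "global" or prev is None:
            est[j] = theta_grid[np.argmax(crit)]
        else:
            # interior local maxima of the criterion on the theta grid
            peaks = np.where((crit[1:-1] > crit[:-2]) & (crit[1:-1] > crit[2:]))[0] + 1
            if peaks.size == 0:
                peaks = np.array([np.argmax(crit)])
            est[j] = theta_grid[peaks[np.argmin(np.abs(theta_grid[peaks] - prev))]]
        prev = est[j]
    return est

# Example on a noisy piecewise-constant regression function with one jump:
x = np.sort(np.random.uniform(0, 1, 300))
y = np.where(x < 0.5, 0.0, 2.0) + 0.3 * np.random.randn(x.size)
t = np.linspace(0, 1, 200)
fit_local  = edge_preserving_m_smoother(t, x, y, h=0.05, g=0.5, method="local")
fit_global = edge_preserving_m_smoother(t, x, y, h=0.05, g=0.5, method="global")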
In addition, to reduce the mean squared error of the edge-preserving M-smoother, we give recommendations and comparisons for the choice of the parameters h and g. At the end of the thesis, we choose the best pair of h and g: applied to the two kinds of edge-preserving M-smoother, it allows both methods to attain their maximal efficiency.
|