Preponderantly increasing/decreasing data in regression analysis


Bibliographic Details
Main Author: Darija Marković
Format: Article
Language: English
Published: Croatian Operational Research Society 2016-12-01
Series: Croatian Operational Research Review
Subjects:
Online Access: http://hrcak.srce.hr/index.php?show=clanak&id_clanak_jezik=257102&lang=en
Description
Summary: For the given data (w_i, x_i, y_i), i = 1, ..., n, and the given model function f(x; θ), where θ is a vector of unknown parameters, the goal of regression analysis is to obtain an estimator θ* of the unknown parameters θ such that the vector of residuals is minimized in some sense. The common approach to this minimization problem is the least-squares method, that is, minimizing the L2 norm of the vector of residuals. For nonlinear model functions, it is first necessary to find at least sufficient conditions on the data that guarantee the existence of a best least-squares estimator. In this paper we describe and examine in detail the property of preponderant increase/decrease of the data, which ensures the existence of the best estimator for certain important nonlinear model functions.
ISSN: 1848-0225
1848-9931
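
As a brief sketch of the minimization problem described in the summary (the weighted criterion below is a standard assumption made here for illustration, since the record lists weights w_i but does not state the criterion explicitly), the best least-squares estimator θ* is a global minimizer of the functional

  S(θ) = Σ_{i=1}^{n} w_i (y_i − f(x_i; θ))^2,   θ* ∈ argmin_θ S(θ),

and the existence question studied in the paper concerns conditions on the data under which this infimum is actually attained for nonlinear model functions f.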