A Gradient Boosting Algorithm Based on Gaussian Process Regression
Main Authors:
Other Authors:
Format: Others
Language: en_US
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/sa3vf5
Summary: Master's thesis === National Taiwan University === Graduate Institute of Information Management === 106 === Gaussian process regression (GPR) is an important model in the field of machine learning. The GPR model is flexible, robust, and easy to implement. However, it suffers from expensive computational costs: O(n^3) training time, O(n^2) training memory, and O(n) testing time, where n is the number of observations in the training data. In this work, we develop a fast approximation method to reduce the time and space complexity. The proposed method is related to the design of gradient boosting algorithms. We conduct experiments on a real-world dataset and demonstrate that the proposed method achieves prediction performance comparable to the standard GPR model and several state-of-the-art regression methods.