Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields
Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 96
Main Authors: | Hsiang-Jui Wang, 王湘叡 |
---|---|
Other Authors: | Chih-Jen Lin 林智仁 |
Format: | Others |
Language: | en_US |
Published: | 2008 |
Online Access: | http://ndltd.ncl.edu.tw/handle/68481133280783589228 |
id |
ndltd-TW-096NTU05392026 |
---|---|
record_format |
oai_dc |
spelling |
ndltd-TW-096NTU05392026 2016-05-11T04:16:25Z http://ndltd.ncl.edu.tw/handle/68481133280783589228 Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields 應用自動微分及截斷牛頓法於條件隨機場 Hsiang-Jui Wang 王湘叡 Master's, National Taiwan University, Graduate Institute of Computer Science and Information Engineering, academic year 96. Chih-Jen Lin 林智仁. 2008. Academic dissertation; thesis, 42 pages. en_US |
collection |
NDLTD |
language |
en_US |
format |
Others |
sources |
NDLTD |
description |
Master's === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === 96 === In recent years, the task of labeling sequential data has arisen in many fields. Conditional random fields are a popular model for this type of problem. Their Hessian matrix is not easy to derive in closed form, so optimization methods that use second-order information, such as Hessian-vector products, may seem unsuitable. Automatic differentiation is a technique for evaluating derivatives of a function without an explicitly derived gradient function. Moreover, computing Hessian-vector products by automatic differentiation requires only the gradient function, not the Hessian matrix. This thesis first studies the background of automatic differentiation, and then combines truncated Newton methods with automatic differentiation for training conditional random fields. |
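The main computational point in the abstract, that a Hessian-vector product can be obtained from the gradient function alone, can be illustrated with a short sketch. The following is an illustration in JAX, not code from the thesis itself (which is not included in this record); the toy regularized logistic objective and all variable names are assumptions made for the example.

```python
# Illustration only: Hessian-vector products via automatic differentiation.
# The toy objective f stands in for a CRF negative log-likelihood; only its
# gradient is ever formed by AD, never the full Hessian matrix.
import jax
import jax.numpy as jnp

def f(w, X, y):
    # Placeholder objective: L2-regularized logistic loss.
    z = X @ w
    return jnp.sum(jnp.logaddexp(0.0, -y * z)) + 0.5 * jnp.dot(w, w)

grad_f = jax.grad(f)  # gradient function obtained by automatic differentiation

def hvp(w, v, X, y):
    # Hessian-vector product H(w) @ v as the directional derivative of the
    # gradient in direction v (forward-over-reverse automatic differentiation).
    return jax.jvp(lambda u: grad_f(u, X, y), (w,), (v,))[1]

# Tiny usage example with random data.
X = jax.random.normal(jax.random.PRNGKey(0), (8, 3))
y = jnp.sign(jax.random.normal(jax.random.PRNGKey(1), (8,)))
w = jnp.zeros(3)
v = jnp.ones(3)
print(hvp(w, v, X, y))  # matches jax.hessian(f)(w, X, y) @ v
```

In a truncated Newton method, an hvp routine of this kind is what the inner conjugate-gradient solver calls at each iteration, so the Newton direction can be approximated without ever storing the Hessian matrix.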
author2 |
Chih-Jen Lin |
author_facet |
Chih-Jen Lin Hsiang-Jui Wang 王湘叡 |
author |
Hsiang-Jui Wang 王湘叡 |
spellingShingle |
Hsiang-Jui Wang 王湘叡 Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
author_sort |
Hsiang-Jui Wang |
title |
Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
title_short |
Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
title_full |
Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
title_fullStr |
Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
title_full_unstemmed |
Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields |
title_sort |
applying automatic differentiation and truncated newton methods to conditional random fields |
publishDate |
2008 |
url |
http://ndltd.ncl.edu.tw/handle/68481133280783589228 |
work_keys_str_mv |
AT hsiangjuiwang applyingautomaticdifferentiationandtruncatednewtonmethodstoconditionalrandomfields AT wángxiāngruì applyingautomaticdifferentiationandtruncatednewtonmethodstoconditionalrandomfields AT hsiangjuiwang yīngyòngzìdòngwēifēnjíjiéduànniúdùnfǎyútiáojiànsuíjīchǎng AT wángxiāngruì yīngyòngzìdòngwēifēnjíjiéduànniúdùnfǎyútiáojiànsuíjīchǎng |
_version_ |
1718265028346380288 |