Applying Automatic Differentiation and Truncated Newton Methods to Conditional Random Fields
Main Authors:
Other Authors:
Format: Others
Language: en_US
Published: 2008
Online Access: http://ndltd.ncl.edu.tw/handle/68481133280783589228
Summary: Master's thesis === National Taiwan University === Graduate Institute of Computer Science and Information Engineering === academic year 96 === In recent years, the problem of labeling sequential data has arisen in many fields. Conditional random fields are a popular model for this type of problem, but their Hessian matrix is not easy to derive in closed form. This difficulty suggests that optimization methods relying on second-order information, such as Hessian-vector products, may not be suitable. Automatic differentiation is a technique for evaluating derivatives of a function without an explicitly derived derivative formula. Moreover, computing Hessian-vector products by automatic differentiation requires only the gradient function, not the Hessian matrix. This thesis first presents the background of automatic differentiation, and then combines truncated Newton methods with automatic differentiation to solve conditional random fields.
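The key identity behind the abstract's claim is that a Hessian-vector product is the directional derivative of the gradient: H(x)v = d/dt grad(x + tv) at t = 0, so forward-mode automatic differentiation applied to the gradient function yields H(x)v without ever forming H. A minimal sketch of this idea, using dual numbers and a small hand-coded example objective (the function `grad_f` and all names here are illustrative, not from the thesis; in a CRF setting the gradient would itself come from the model's gradient routine):

```python
# Hypothetical sketch: Hessian-vector products via forward-mode
# automatic differentiation over a gradient function (dual numbers).

class Dual:
    """Dual number a + b*eps with eps**2 = 0 (forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule propagates the derivative component.
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

# Example objective f(x) = x0^2 * x1 + x1^2, with its gradient
# hand-coded here (a CRF implementation would supply this routine):
def grad_f(x):
    x0, x1 = x
    return [2 * x0 * x1, x0 * x0 + 2 * x1]

def hvp(grad, x, v):
    """H(x) @ v = d/dt grad(x + t*v) |_{t=0}, via dual numbers."""
    duals = [Dual(xi, vi) for xi, vi in zip(x, v)]
    return [g.dot for g in grad(duals)]

# At x = (1, 2) the Hessian is [[4, 2], [2, 2]], so H @ (1, 0) = (4, 2):
print(hvp(grad_f, [1.0, 2.0], [1.0, 0.0]))  # -> [4.0, 2.0]
```

Inside a truncated Newton method, such an `hvp` routine is exactly what the inner conjugate-gradient iteration needs, which is why the Hessian matrix itself never has to be derived.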