Linear‐Time Korean Morphological Analysis Using an Action‐based Local Monotonic Attention Mechanism

For Korean language processing, morphological analysis is a critical component that requires extensive work. Using a sequence‐to‐sequence model, morphological analysis can be conducted in an end‐to‐end manner without complicated feature engineering. However, the sequence‐to‐sequence model has a time complexity of O(n²) for an input of length n when it uses an attention mechanism to achieve high performance. In this study, we propose a linear‐time Korean morphological analysis model using a local monotonic attention mechanism that exploits monotonic alignment, a characteristic of Korean morphological analysis. The proposed model achieves a substantial speed improvement in a single‐threaded environment and maintains a high morpheme‐level F1‐measure even as a hard attention model in which the attention computation is eliminated.
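As a rough illustration of the idea in the title, the contrast is between full soft attention, which scores every encoder state at every decoding step (O(n²) overall for input length n), and a local monotonic variant that scores only a small window around a position that advances monotonically (O(n) overall). The NumPy sketch below is not the authors' implementation; the dot‐product scoring, the window size `w`, and the function names are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def full_attention(query, enc):
    # Soft attention over ALL n encoder states: O(n) per decoding
    # step, hence O(n^2) over a whole output of comparable length.
    scores = enc @ query            # (n,)
    return softmax(scores) @ enc    # weighted sum of encoder states

def local_monotonic_attention(query, enc, pos, w=2):
    # Attend only to a fixed window of at most 2*w+1 states around
    # the current (monotonically advancing) position `pos`, so each
    # step costs O(1) in n and the whole decode costs O(n).
    lo, hi = max(0, pos - w), min(len(enc), pos + w + 1)
    window = enc[lo:hi]
    scores = window @ query
    return softmax(scores) @ window

# Tiny demo: when the window covers the whole input, the local
# variant coincides with full attention.
rng = np.random.default_rng(0)
enc = rng.normal(size=(6, 4))   # 6 encoder states of dimension 4
q = rng.normal(size=4)          # decoder query vector
assert np.allclose(full_attention(q, enc),
                   local_monotonic_attention(q, enc, pos=3, w=10))
```

In the paper's setting the window position is driven by predicted actions rather than a simple counter, and because the alignment between a Korean sentence and its morpheme sequence is monotonic, restricting attention this way loses little accuracy while removing the quadratic cost.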

Bibliographic Details
Main Authors: Hyunsun Hwang, Changki Lee
Format: Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 2019-08-01
Series: ETRI Journal, vol. 42, no. 1, pp. 101-107
ISSN: 1225-6463
Subjects: deep learning; Korean morphological analysis; local attention mechanism; natural language processing; sequence-to-sequence learning
Online Access:https://doi.org/10.4218/etrij.2018-0456