Zero‐anaphora resolution in Korean based on deep language representation model: BERT

Abstract: It is necessary to achieve high performance in the task of zero anaphora resolution (ZAR) to completely understand texts in Korean, Japanese, Chinese, and various other languages. Deep-learning-based models are being employed for building ZAR systems, owing to the success of deep lea...


Bibliographic Details
Main Authors: Youngtae Kim, Dongyul Ra, Soojong Lim
Format: Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI), 2020-10-01
Series: ETRI Journal
Online Access:https://doi.org/10.4218/etrij.2019-0441