A Single Attention-Based Combination of CNN and RNN for Relation Classification


Bibliographic Details
Main Authors: Xiaoyu Guo, Hui Zhang, Haijun Yang, Lianyuan Xu, Zhiwen Ye
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8606107/
Description
Summary: As a vital task in natural language processing, relation classification aims to identify the types of relations that hold between entities in text. In this paper, we propose a novel Att-RCNN model that combines a recurrent neural network (RNN) and a convolutional neural network (CNN) to extract text features and classify relations. In this structure, the RNN extracts higher-level contextual representations of words, and the CNN derives sentence features for the relation classification task. On top of this network, Att-RCNN employs both word-level and sentence-level attention mechanisms to emphasize critical words and features and improve model performance. We evaluate the model on four distinct datasets: SemEval-2010 task 8, the two subtask datasets of SemEval-2018 task 7, and KBP37. Compared with previously published models, Att-RCNN achieves the best overall performance and the highest $F_{1}$ score, especially on the KBP37 dataset.
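The summary describes a pipeline of RNN-based contextual word representations, word-level attention, a CNN over the weighted sequence, and sentence-level attention before classification. The following NumPy sketch illustrates that flow on a toy sentence; all dimensions, random weights, the simple Elman recurrence, and the dot-product attention are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of the Att-RCNN flow from the abstract.
# Weights are random; this only shows the shape of the computation.
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

seq_len, emb_dim, hid_dim, n_filters, n_classes = 7, 16, 24, 32, 10

# Toy word embeddings for one sentence
E = rng.normal(size=(seq_len, emb_dim))

# --- RNN: simple Elman recurrence standing in for the paper's RNN ---
Wx = rng.normal(scale=0.1, size=(emb_dim, hid_dim))
Wh = rng.normal(scale=0.1, size=(hid_dim, hid_dim))
h, H = np.zeros(hid_dim), []
for t in range(seq_len):
    h = np.tanh(E[t] @ Wx + h @ Wh)
    H.append(h)
H = np.stack(H)                        # (seq_len, hid_dim) contextual reps

# --- Word-level attention: weight words by a learned query vector ---
q = rng.normal(size=hid_dim)
alpha = softmax(H @ q)                 # (seq_len,) word weights
Hw = H * alpha[:, None]                # re-weighted word representations

# --- CNN: window-3 convolution + ReLU over the weighted sequence ---
F = rng.normal(scale=0.1, size=(n_filters, 3 * hid_dim))
conv = np.stack([np.maximum(F @ np.concatenate(Hw[t:t + 3]), 0)
                 for t in range(seq_len - 2)])   # (seq_len-2, n_filters)

# --- Sentence-level attention over conv positions, then classify ---
u = rng.normal(size=n_filters)
beta = softmax(conv @ u)
s = (conv * beta[:, None]).sum(axis=0)           # sentence feature
W_out = rng.normal(scale=0.1, size=(n_filters, n_classes))
probs = softmax(s @ W_out)                       # class distribution
print(probs.shape, round(float(probs.sum()), 6))  # → (10,) 1.0
```

The attention steps are where the model "strengthens critical words and features": `alpha` rescales word representations before convolution, and `beta` rescales convolutional features before pooling into the sentence vector.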
ISSN: 2169-3536