Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification
Aspect-level sentiment classification (ASC) has received much attention in recent years. With the successful application of attention networks in many fields, attention-based ASC has aroused great interest. However, most previous methods did not analyze the contribution of words well, and the context-aspect term interaction was not well implemented, which largely limits the efficacy of such models.
Main Authors: | Qiuyue Zhang, Ran Lu, Qicai Wang, Zhenfang Zhu, Peiyu Liu |
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | Natural language processing; aspect-level; sentiment classification; attention mechanism |
Online Access: | https://ieeexplore.ieee.org/document/8890661/ |
id | doaj-d930b4a1b27348e0a260bcdd8879f679 |
record_format | Article |
spelling | doaj-d930b4a1b27348e0a260bcdd8879f679 (indexed 2021-03-30T00:41:48Z) |
Language: English (eng). Publisher: IEEE. Journal: IEEE Access, ISSN 2169-3536, vol. 7 (2019-01-01), pp. 160017-160028. DOI: 10.1109/ACCESS.2019.2951283. Article number: 8890661.
Title: Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification
Authors:
Qiuyue Zhang (https://orcid.org/0000-0001-7229-7200), School of Information Science and Engineering, Shandong Normal University, Jinan, China
Ran Lu (https://orcid.org/0000-0003-3851-7761), School of Information Science and Engineering, Shandong Normal University, Jinan, China
Qicai Wang (https://orcid.org/0000-0003-3250-2235), School of Information Science and Engineering, Shandong Normal University, Jinan, China
Zhenfang Zhu (https://orcid.org/0000-0002-7217-3109), School of Information Science and Electrical Engineering, Shandong Jiaotong University, Jinan, China
Peiyu Liu (https://orcid.org/0000-0002-2559-8913), School of Information Science and Engineering, Shandong Normal University, Jinan, China
Abstract: Aspect-level sentiment classification (ASC) has received much attention in recent years. With the successful application of attention networks in many fields, attention-based ASC has aroused great interest. However, most previous methods did not analyze the contribution of words well, and the context-aspect term interaction was not well implemented, which largely limits the efficacy of such models. In this paper, we propose a novel, efficient method that mainly adopts Multi-head Attention (MHA) networks. First, the word embedding and aspect term embedding are pre-trained by Bidirectional Encoder Representations from Transformers (BERT). Second, we make full use of MHA and convolutional operations to obtain hidden states, which is superior to traditional neural networks. Then, the interaction between context and aspect term is further implemented through average pooling and MHA. We conduct extensive experiments on three benchmark datasets, and the final results show that the Interactive Multi-head Attention Networks (IMAN) model consistently outperforms state-of-the-art methods on the ASC task.
Online access: https://ieeexplore.ieee.org/document/8890661/
Topics: Natural language processing; aspect-level; sentiment classification; attention mechanism |
collection | DOAJ |
language | English |
format | Article |
sources | DOAJ |
author | Qiuyue Zhang; Ran Lu; Qicai Wang; Zhenfang Zhu; Peiyu Liu |
spellingShingle | Qiuyue Zhang; Ran Lu; Qicai Wang; Zhenfang Zhu; Peiyu Liu; Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification; IEEE Access; Natural language processing; aspect-level; sentiment classification; attention mechanism |
author_facet | Qiuyue Zhang; Ran Lu; Qicai Wang; Zhenfang Zhu; Peiyu Liu |
author_sort | Qiuyue Zhang |
title | Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification |
title_short | Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification |
title_full | Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification |
title_fullStr | Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification |
title_full_unstemmed | Interactive Multi-Head Attention Networks for Aspect-Level Sentiment Classification |
title_sort | interactive multi-head attention networks for aspect-level sentiment classification |
publisher | IEEE |
series | IEEE Access |
issn | 2169-3536 |
publishDate | 2019-01-01 |
description | Aspect-level sentiment classification (ASC) has received much attention in recent years. With the successful application of attention networks in many fields, attention-based ASC has aroused great interest. However, most previous methods did not analyze the contribution of words well, and the context-aspect term interaction was not well implemented, which largely limits the efficacy of such models. In this paper, we propose a novel, efficient method that mainly adopts Multi-head Attention (MHA) networks. First, the word embedding and aspect term embedding are pre-trained by Bidirectional Encoder Representations from Transformers (BERT). Second, we make full use of MHA and convolutional operations to obtain hidden states, which is superior to traditional neural networks. Then, the interaction between context and aspect term is further implemented through average pooling and MHA. We conduct extensive experiments on three benchmark datasets, and the final results show that the Interactive Multi-head Attention Networks (IMAN) model consistently outperforms state-of-the-art methods on the ASC task. |
topic | Natural language processing; aspect-level; sentiment classification; attention mechanism |
url | https://ieeexplore.ieee.org/document/8890661/ |
work_keys_str_mv | AT qiuyuezhang interactivemultiheadattentionnetworksforaspectlevelsentimentclassification AT ranlu interactivemultiheadattentionnetworksforaspectlevelsentimentclassification AT qicaiwang interactivemultiheadattentionnetworksforaspectlevelsentimentclassification AT zhenfangzhu interactivemultiheadattentionnetworksforaspectlevelsentimentclassification AT peiyuliu interactivemultiheadattentionnetworksforaspectlevelsentimentclassification |
_version_ | 1724187962125582336 |
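The abstract above is built around multi-head attention (MHA), with context and aspect-term representations attending to each other. As a rough illustration of the underlying mechanism only (this is a generic sketch of scaled dot-product MHA, not the authors' IMAN implementation; the function names and toy vectors below are invented for this example), the core computation can be written in plain Python:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention; Q, K, V are lists of equal-length vectors."""
    d = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        w = softmax(scores)
        # weighted average of the value vectors
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out

def multi_head_attention(Q, K, V, num_heads):
    """Split the feature dimension into heads, attend per head, concatenate."""
    d_model = len(Q[0])
    assert d_model % num_heads == 0, "feature size must divide evenly into heads"
    d_head = d_model // num_heads
    heads = []
    for h in range(num_heads):
        s, e = h * d_head, (h + 1) * d_head
        heads.append(attention([q[s:e] for q in Q],
                               [k[s:e] for k in K],
                               [v[s:e] for v in V]))
    # concatenate head outputs along the feature axis
    return [sum((heads[h][i] for h in range(num_heads)), []) for i in range(len(Q))]

# Toy context-aspect interaction: context rows query the aspect representation.
context = [[1.0, 0.0, 0.0, 1.0], [0.0, 1.0, 1.0, 0.0]]
aspect = [[1.0, 1.0, 0.0, 0.0]]
result = multi_head_attention(context, aspect, aspect, num_heads=2)
```

Using the context rows as queries and the aspect rows as keys/values mirrors the context-aspect interaction the abstract mentions; a real implementation would add learned per-head projection matrices, an output projection, and the BERT-derived embeddings the paper describes.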