A Bi-Directional LSTM-CNN Model with Attention for Aspect-Level Text Classification
People increasingly share their opinions about products and services on the Internet, generating a large quantity of comment data with great business value. Comment sentences often discuss several aspects, and the sentiment toward each aspect can differ, so assigning a single overall sentiment polarity to the sentence is of little use. In this paper, we introduce the Attention-based Aspect-level Recurrent Convolutional Neural Network (AARCNN) to analyze comments at the aspect level. The model integrates an attention mechanism with target-information analysis, enabling it to concentrate on the important parts of the sentence and to make full use of the target information. It uses a bidirectional LSTM (Bi-LSTM) to build a memory of the sentence, then applies a CNN to extract attention from that memory and obtain an attentive sentence representation. Aspect embeddings are used to analyze the target information of the representation, and the model finally outputs the sentiment polarity through a softmax layer. Tested on multi-language datasets, the model demonstrated better performance than conventional deep learning methods.
Main Authors: Yonghua Zhu, Xun Gao, Weilin Zhang, Shenkai Liu, Yuanyuan Zhang
Author Affiliations: Shanghai Film Academy, Shanghai University, Shanghai 200444, China (Yonghua Zhu, Shenkai Liu); School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China (Xun Gao, Weilin Zhang); College of Information Technology, Zhejiang Chinese Medical University, Hangzhou 310053, China (Yuanyuan Zhang)
Format: Article
Language: English
Published: MDPI AG, 2018-11-01
Series: Future Internet
ISSN: 1999-5903
DOI: 10.3390/fi10120116
Subjects: attention mechanism; NLP; aspect-level sentiment classification
Online Access: https://www.mdpi.com/1999-5903/10/12/116
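The abstract describes a concrete pipeline: a Bi-LSTM builds a per-token memory of the sentence, a CNN over that memory yields attention weights for an attentive sentence representation, and an aspect embedding is combined with that representation before a softmax classifier. The sketch below illustrates one way such a pipeline can be wired up in PyTorch; the layer sizes, the `AARCNNSketch` class name, and the exact way attention scores are derived from the CNN features are assumptions for illustration, not the authors' released implementation.

```python
# Illustrative AARCNN-style pipeline: Bi-LSTM memory -> CNN-derived attention
# -> aspect embedding -> softmax. Dimensions and attention details are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AARCNNSketch(nn.Module):
    def __init__(self, vocab_size, n_aspects, n_classes=3,
                 embed_dim=300, hidden_dim=128, n_filters=64, kernel_size=3):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.aspect_embed = nn.Embedding(n_aspects, embed_dim)
        # Bi-LSTM builds the "memory" of the sentence: one hidden state per token.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        # A 1-D CNN over the memory produces per-token features that are turned
        # into attention scores (one assumed reading of "extracting attention
        # from memory").
        self.conv = nn.Conv1d(2 * hidden_dim, n_filters, kernel_size,
                              padding=kernel_size // 2)
        self.att_score = nn.Linear(n_filters, 1)
        # Classifier over the attentive sentence representation plus the aspect embedding.
        self.fc = nn.Linear(2 * hidden_dim + embed_dim, n_classes)

    def forward(self, tokens, aspect_ids):
        # tokens: (batch, seq_len) word indices; aspect_ids: (batch,) aspect indices.
        memory, _ = self.bilstm(self.word_embed(tokens))             # (B, T, 2H)
        conv_feats = torch.relu(self.conv(memory.transpose(1, 2)))   # (B, F, T)
        scores = self.att_score(conv_feats.transpose(1, 2))          # (B, T, 1)
        weights = F.softmax(scores, dim=1)                           # attention over tokens
        sent_repr = (weights * memory).sum(dim=1)                    # (B, 2H)
        combined = torch.cat([sent_repr, self.aspect_embed(aspect_ids)], dim=-1)
        return F.log_softmax(self.fc(combined), dim=-1)              # sentiment polarity


# Usage with toy dimensions:
model = AARCNNSketch(vocab_size=10000, n_aspects=5)
tokens = torch.randint(0, 10000, (2, 20))   # batch of 2 sentences, 20 tokens each
aspects = torch.tensor([1, 3])              # one target aspect per sentence
print(model(tokens, aspects).shape)         # torch.Size([2, 3])
```

The design choice worth noting is that the attention weights depend only on the Bi-LSTM memory and the CNN features here; the paper also conditions the analysis on the target (aspect) information, which this minimal sketch only incorporates at the classification stage.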