Improving discourse representations with node hierarchy attention
Long text representation for natural language processing tasks has captured researchers’ attention recently. Beyond the sentence level, finding a good representation for text often falls back on the bag-of-words model, which loses sequence order. Indeed, text is not arranged in a haphazard way; rather, in a coh...
Main Authors: | Erfaneh Gharavi, Hadi Veisi, Rupesh Silwal, Matthew S. Gerber |
---|---|
Format: | Article |
Language: | English |
Published: | Elsevier, 2021-03-01 |
Series: | Machine Learning with Applications |
Subjects: | |
Online Access: | http://www.sciencedirect.com/science/article/pii/S2666827020300153 |
Similar Items
- Lexicon-Enhanced LSTM With Attention for General Sentiment Analysis
  by: Xianghua Fu, et al. Published: (2018-01-01)
- Bi-Level Attention Model for Sentiment Analysis of Short Texts
  by: Wei Liu, et al. Published: (2019-01-01)
- Lexicon-Enhanced Attention Network Based on Text Representation for Sentiment Classification
  by: Wenkuan Li, et al. Published: (2019-09-01)
- A Joint Semantic Vector Representation Model for Text Clustering and Classification
  by: S. Momtazi, et al. Published: (2019-07-01)
- Temporal Representations of Citations for Understanding the Changing Roles of Scientific Publications
  by: Jiangen He, et al. Published: (2018-09-01)