A Topical Category-Aware Neural Text Summarizer
The advent of the sequence-to-sequence model and the attention mechanism has increased the comprehension and readability of automatically generated summaries. However, most previous studies on text summarization have focused on generating or extracting sentences only from an original text, even though every text has a latent topic category.
Main Authors: | So-Eon Kim, Nazira Kaibalina, Seong-Bae Park |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2020-08-01 |
Series: | Applied Sciences |
Subjects: | text summarization, class activation map, attention mechanism, topic category, text readability |
Online Access: | https://www.mdpi.com/2076-3417/10/16/5422 |
id |
doaj-fc67ff09e22e4fe7affc704bc4dd570f |
record_format |
Article |
spelling |
doaj-fc67ff09e22e4fe7affc704bc4dd570f 2020-11-25T03:48:29Z; eng; MDPI AG; Applied Sciences, ISSN 2076-3417, 2020-08-01, 10(16):5422; doi:10.3390/app10165422; A Topical Category-Aware Neural Text Summarizer; So-Eon Kim, Nazira Kaibalina, Seong-Bae Park (Department of Computer Science and Engineering, Kyung Hee University, Yongin 17104, Korea); https://www.mdpi.com/2076-3417/10/16/5422; text summarization, class activation map, attention mechanism, topic category, text readability |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
So-Eon Kim; Nazira Kaibalina; Seong-Bae Park |
title |
A Topical Category-Aware Neural Text Summarizer |
publisher |
MDPI AG |
series |
Applied Sciences |
issn |
2076-3417 |
publishDate |
2020-08-01 |
description |
The advent of the sequence-to-sequence model and the attention mechanism has increased the comprehension and readability of automatically generated summaries. However, most previous studies on text summarization have focused on generating or extracting sentences only from an original text, even though every text has a latent topic category. That is, even though a topic category could help improve summarization quality, there has been no effort to utilize such information in text summarization. Therefore, this paper proposes a novel topical category-aware neural text summarizer, which differs from legacy neural summarizers in that it reflects the topic category of the original text in the generated summary. The proposed summarizer adopts the class activation map (CAM) as a measure of the topical influence of the words in the original text. Since the CAM extracts the words relevant to a specific category from the text, it allows the attention mechanism to be influenced by the topic category. As a result, by combining the attention mechanism and the CAM, the proposed neural summarizer reflects both the topical information and the content of a text in the summary. Experiments on The New York Times Annotated Corpus show that the proposed model outperforms the legacy attention-based sequence-to-sequence model, which demonstrates that it is effective to reflect a topic category in automatic summarization. |
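The description explains that the summarizer computes a class activation map over the words of the source text and uses it to steer the attention mechanism, but it does not give the exact formulation. The PyTorch sketch below only illustrates that idea: the TokenCAM module, the global-average-pooling topic classifier, the additive log-CAM bias on the attention scores, and the `lam` weight are assumptions made for illustration, not the authors' implementation.

```python
# Minimal sketch (assumed formulation, not the paper's exact model) of a
# CAM-biased attention step for a sequence-to-sequence summarizer.

import torch
import torch.nn as nn
import torch.nn.functional as F


class TokenCAM(nn.Module):
    """Topic classifier whose linear head yields a per-token class activation map."""

    def __init__(self, vocab_size, hidden_size, num_topics):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.classifier = nn.Linear(hidden_size, num_topics, bias=False)

    def forward(self, token_ids):
        # token_ids: (batch, src_len)
        feats = self.embed(token_ids)                    # (batch, src_len, hidden)
        pooled = feats.mean(dim=1)                       # global average pooling
        topic_logits = self.classifier(pooled)           # (batch, num_topics)
        topic = topic_logits.argmax(dim=-1)              # predicted topic category
        # CAM: project every token feature onto the weight vector of the
        # predicted topic; large values mark topically influential words.
        w = self.classifier.weight[topic]                # (batch, hidden)
        cam = torch.einsum("bth,bh->bt", feats, w)       # (batch, src_len)
        return F.softmax(cam, dim=-1), topic_logits


def cam_aware_attention(dec_state, enc_outputs, cam, lam=1.0):
    """Dot-product attention whose scores are biased by the CAM (additive, assumed)."""
    # dec_state: (batch, hidden), enc_outputs: (batch, src_len, hidden)
    scores = torch.einsum("bh,bth->bt", dec_state, enc_outputs)   # content-based scores
    scores = scores + lam * torch.log(cam + 1e-8)                 # topical bias from the CAM
    weights = F.softmax(scores, dim=-1)                           # (batch, src_len)
    context = torch.einsum("bt,bth->bh", weights, enc_outputs)    # attention context vector
    return context, weights


if __name__ == "__main__":
    batch, src_len, hidden, vocab, topics = 2, 12, 32, 1000, 5
    src = torch.randint(0, vocab, (batch, src_len))
    cam, _ = TokenCAM(vocab, hidden, topics)(src)
    enc_outputs = torch.randn(batch, src_len, hidden)   # stand-in encoder states
    dec_state = torch.randn(batch, hidden)              # stand-in decoder state
    context, weights = cam_aware_attention(dec_state, enc_outputs, cam)
    print(context.shape, weights.shape)                 # torch.Size([2, 32]) torch.Size([2, 12])
```

In this sketch the CAM acts as a prior over source positions: words that the topic classifier considers characteristic of the predicted category receive proportionally more attention mass at each decoding step, which is one plausible reading of "combining the attention mechanism and the CAM" in the abstract.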
topic |
text summarization, class activation map, attention mechanism, topic category, text readability |
url |
https://www.mdpi.com/2076-3417/10/16/5422 |
_version_ |
1724498828073107456 |