Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics
Master's === National Taiwan University === Graduate Institute of Networking and Multimedia === 106 === Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but their attention weights are learned independently. Because slots and intents are strongly related, this the...
Main Authors: | Chih-Wen Goo, 古志文 |
---|---|
Other Authors: | 陳縕儂 |
Format: | Others |
Language: | en_US |
Published: | 2018 |
Online Access: | http://ndltd.ncl.edu.tw/handle/7mq4ge |
id | ndltd-TW-106NTU05641006 |
record_format | oai_dc |
spelling | ndltd-TW-106NTU056410062019-05-30T03:50:41Z http://ndltd.ncl.edu.tw/handle/7mq4ge Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics 雙層語意控制之自然語言理解與自動對話摘要 Chih-Wen Goo 古志文 碩士 國立臺灣大學 資訊網路與多媒體研究所 106 (abstract duplicated in the description field) 陳縕儂 2018 學位論文 ; thesis 45 en_US |
collection | NDLTD |
language | en_US |
format | Others |
sources | NDLTD |
description | Master's === National Taiwan University === Graduate Institute of Networking and Multimedia === 106 === Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but their attention weights are learned independently. Because slots and intents are strongly related, this thesis proposes a slot gate that learns the relationship between the intent and slot attention vectors in order to obtain better semantic-frame results through global optimization. Experiments show that the proposed model significantly outperforms state-of-the-art baselines on the benchmark ATIS, Snips, and AMI datasets.
Furthermore, can the gating mechanism be extended to multiple sentences? This thesis explores that question through the summarization task. Abstractive summarization has been widely studied, but prior work mainly focused on summarizing single-speaker documents (news, scientific publications, etc.). Dialogues contain different interactions between speakers, usually characterized as dialogue acts, and these interactive signals may provide informative cues for better dialogue summarization. This thesis leverages dialogue acts in a neural summarization model, where a sentence gate is designed to model the relationship between dialogue acts and summaries. Experiments show that the proposed model significantly improves abstractive summarization performance over state-of-the-art baselines on the AMI meeting corpus, demonstrating the usefulness of interactive signals. |
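The slot gate described in the abstract can be sketched roughly as follows. This is an illustrative NumPy reconstruction, not the thesis's actual implementation: the additive tanh formulation, the parameter names, and the dimensions are assumptions made for the sketch.

```python
import numpy as np

def slot_gate(slot_context, intent_context, w, v):
    """Illustrative slot gate: a scalar measuring how well the slot
    attention context for one time step aligns with the sentence-level
    intent context (assumed formulation: g = sum(v * tanh(c_S + W @ c_I))).

    slot_context   : (d,)   slot attention vector for time step i
    intent_context : (d,)   intent attention vector for the utterance
    w              : (d, d) trainable projection of the intent context
    v              : (d,)   trainable combination weights
    """
    return float(np.sum(v * np.tanh(slot_context + w @ intent_context)))

# Toy usage with random parameters
rng = np.random.default_rng(0)
d = 8
c_slot = rng.standard_normal(d)
c_intent = rng.standard_normal(d)
W = rng.standard_normal((d, d))
v = rng.standard_normal(d)

g = slot_gate(c_slot, c_intent, W, v)

# The scalar gate then weights the slot context before it joins the
# hidden state that feeds the slot tagger's output layer.
h_i = rng.standard_normal(d)
gated = h_i + c_slot * g
print(g, gated.shape)
```

A larger gate value indicates stronger agreement between the slot and intent attention, letting the intent prediction inform each slot-tagging decision; the sentence gate for summarization applies the same idea at the utterance level with dialogue acts.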
author2 | 陳縕儂 |
author_facet | 陳縕儂 Chih-Wen Goo 古志文 |
author | Chih-Wen Goo 古志文 |
spellingShingle | Chih-Wen Goo 古志文 Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
author_sort | Chih-Wen Goo |
title | Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
title_short | Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
title_full | Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
title_fullStr | Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
title_full_unstemmed | Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics |
title_sort | natural language understanding and dialogue summarization with gating mechanisms in two-level semantics |
publishDate | 2018 |
url | http://ndltd.ncl.edu.tw/handle/7mq4ge |
work_keys_str_mv | AT chihwengoo naturallanguageunderstandinganddialoguesummarizationwithgatingmechanismsintwolevelsemantics AT gǔzhìwén naturallanguageunderstandinganddialoguesummarizationwithgatingmechanismsintwolevelsemantics AT chihwengoo shuāngcéngyǔyìkòngzhìzhīzìrányǔyánlǐjiěyǔzìdòngduìhuàzhāiyào AT gǔzhìwén shuāngcéngyǔyìkòngzhìzhīzìrányǔyánlǐjiěyǔzìdòngduìhuàzhāiyào |
_version_ | 1719195422294016000 |