Natural Language Understanding and Dialogue Summarization with Gating Mechanisms in Two-Level Semantics

Bibliographic Details
Main Author: Chih-Wen Goo (古志文)
Other Authors: 陳縕儂
Format: Others
Language: en_US
Published: 2018
Online Access: http://ndltd.ncl.edu.tw/handle/7mq4ge
Description
Summary: Master's thesis === National Taiwan University === Graduate Institute of Networking and Multimedia === 106 === Attention-based recurrent neural network models for joint intent detection and slot filling have achieved state-of-the-art performance, but they learn the two sets of attention weights independently. Considering that slots and intents are strongly related, this thesis proposes a slot gate that focuses on learning the relationship between the intent and slot attention vectors in order to obtain better semantic frame results through global optimization. The experiments show that the proposed model significantly improves performance over state-of-the-art baselines on the benchmark ATIS, Snips, and AMI datasets. Furthermore, can the gating mechanism be extended to multiple sentences? To investigate this, the thesis turns to the summarization task. Abstractive summarization has been widely studied, but prior work mainly focused on summarizing single-speaker documents (e.g., news, scientific publications). In dialogues, there are different interactions between speakers, which are usually characterized as dialogue acts. These interactive signals may provide informative cues for better summarizing dialogues. This thesis therefore leverages dialogue acts in a neural summarization model, where a sentence gate is designed to model the relationship between dialogue acts and summaries. The experiments show that the proposed model significantly improves abstractive summarization performance compared to state-of-the-art baselines on the AMI meeting corpus, demonstrating the usefulness of interactive signals.
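
For readers who want a concrete picture of the gating idea described in the abstract, the following is a minimal NumPy sketch of how a slot gate could fuse a slot attention vector with the sentence-level intent attention vector before slot classification. The variable names, shapes, and the fusion step (h_i + c_slot_i * g) are illustrative assumptions based on the abstract, not the thesis' exact formulation.

    import numpy as np

    def slot_gate(slot_context, intent_context, v, W):
        # Illustrative slot gate: a scalar measuring how much the intent
        # context should influence this token's slot prediction.
        # g = sum( v * tanh(c_slot + W @ c_intent) )
        # v and W are hypothetical trainable gate parameters.
        return float(np.sum(v * np.tanh(slot_context + W @ intent_context)))

    # Toy example with hypothetical shapes.
    d = 8
    rng = np.random.default_rng(0)
    h_i = rng.standard_normal(d)        # hidden state of token i
    c_slot_i = rng.standard_normal(d)   # slot attention context for token i
    c_intent = rng.standard_normal(d)   # sentence-level intent attention context
    v = rng.standard_normal(d)          # gate vector (hypothetical)
    W = rng.standard_normal((d, d))     # gate matrix (hypothetical)

    g = slot_gate(c_slot_i, c_intent, v, W)
    slot_features = h_i + c_slot_i * g  # gated fusion fed to the slot classifier
    print("gate value:", g)
    print("gated slot feature shape:", slot_features.shape)

The same gating pattern would generalize to the sentence gate described for dialogue summarization, with a sentence-level context and a dialogue-act context playing the roles of the slot and intent vectors; the exact wiring in the thesis may differ.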