Double attention recurrent convolution neural network for answer selection

Answer selection is a key step in many question answering (QA) applications. This paper proposes a new deep model with two kinds of attention for answer selection: the double attention recurrent convolution neural network (DARCNN). Double attention refers to self-attention and cross-attention, a design inspired by the Transformer from machine translation. Self-attention can directly model dependencies between words regardless of their distance, but it does not distinguish a word's nearby neighbours from more distant words; the authors therefore design a decay self-attention that prioritizes local words in a sentence. In addition, cross-attention is established to model the interaction between the question and a candidate answer: applied to the outputs of self-attention and decay self-attention, it yields two kinds of interactive information. Finally, the feature vectors of the question and answer are combined by element-wise multiplication, and a multilayer perceptron predicts the matching score. Experimental results on four Chinese and English QA datasets show that DARCNN outperforms other answer selection models, demonstrating the effectiveness of self-attention, decay self-attention and cross-attention in answer selection tasks.
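The abstract only sketches the attention design, so the following NumPy snippet is an illustrative reconstruction rather than the authors' code: it contrasts plain scaled dot-product self-attention with a decay variant that down-weights distant words. The Gaussian distance penalty and the sigma parameter are assumptions; the paper may define the decay differently.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # X: (seq_len, d) word vectors, e.g. BiLSTM outputs.
    # Every word attends to every other word, whatever the distance.
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)          # (seq_len, seq_len)
    return softmax(scores) @ X

def decay_self_attention(X, sigma=2.0):
    # Same scores, minus an (assumed) Gaussian penalty on the token
    # distance |i - j|, so nearby words keep most of the attention weight.
    n, d = X.shape
    pos = np.arange(n)
    dist = np.abs(pos[:, None] - pos[None, :])
    scores = X @ X.T / np.sqrt(d) - dist**2 / (2.0 * sigma**2)
    return softmax(scores) @ X

X = np.random.randn(6, 8)                  # toy sentence: 6 words, d = 8
assert decay_self_attention(X).shape == X.shape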

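Likewise, a minimal sketch of the cross-attention and scoring stages described above. The max-pooling step, the hidden-layer size and the randomly initialized demo weights are assumptions (in the real model the weights would be learned); only the element-wise multiplication and the MLP scorer are stated in the abstract.

import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(Q, A):
    # Q: (m, d) question tokens, A: (n, d) answer tokens.
    # Each sentence attends over the other, producing answer-aware
    # question vectors and question-aware answer vectors.
    d = Q.shape[-1]
    scores = Q @ A.T / np.sqrt(d)          # (m, n)
    q_ctx = softmax(scores, axis=1) @ A    # (m, d)
    a_ctx = softmax(scores.T, axis=1) @ Q  # (n, d)
    return q_ctx, a_ctx

def match_score(q_ctx, a_ctx, hidden=32, seed=0):
    # Pool each side to a fixed-size feature vector (max-pooling is an
    # assumption), combine by element-wise multiplication, then score
    # with a one-hidden-layer MLP. Weights here are random stand-ins.
    rng = np.random.default_rng(seed)
    q_vec, a_vec = q_ctx.max(axis=0), a_ctx.max(axis=0)
    d = q_vec.shape[0]
    W1, b1 = rng.normal(size=(hidden, d)), np.zeros(hidden)
    w2, b2 = rng.normal(size=hidden), 0.0
    h = np.tanh(W1 @ (q_vec * a_vec) + b1)
    return 1.0 / (1.0 + np.exp(-(w2 @ h + b2)))  # sigmoid matching score

Q, A = np.random.randn(7, 8), np.random.randn(12, 8)
print(match_score(*cross_attention(Q, A)))       # a score in (0, 1)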

Bibliographic Details
Main Authors: Ganchao Bao, Yuan Wei, Xin Sun, Hongli Zhang
Format: Article
Language: English
Published: The Royal Society, 2020-05-01
Series: Royal Society Open Science
ISSN: 2054-5703
DOI: 10.1098/rsos.191517
Subjects: answer selection; attention mechanism; bidirectional LSTM; convolutional neural network; Siamese network
Online Access: https://royalsocietypublishing.org/doi/pdf/10.1098/rsos.191517