Shared Representation Generator for Relation Extraction With Piecewise-LSTM Convolutional Neural Networks
Traditional distant supervision for relation extraction suffers from the noise it introduces into the training data. In this paper, we present a shared representation generator that de-emphasizes noisy expressions by extracting the features that instances of a relation share. Unlike the widely used attention mechanism, which computes a weighted sum over instances, we generate the bag representation in multi-instance learning directly through a feature transformation that retains only the semantics relevant to predicting the relation. We introduce a generator loss into the objective function to improve the quality of the shared representation, and the structure of the proposed generator is flexible and scalable. To capture more structural information, the piecewise convolutional neural network (PCNN) is widely used to divide the output of the convolutional layer into three segments, but this segmentation breaks the consistency and internal relationships of the sentence. To alleviate this issue, we encode each sentence with a piecewise-LSTM convolutional neural network (PLSTM-CNN), which applies a BiLSTM after the pooling layer of the PCNN. The experimental results show that our approach achieves significant improvements in relation extraction over the baselines.
Main Authors: | Danfeng Yan, Bo Hu |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | Distant supervision; relation extraction; shared representation; generator loss; BiLSTM |
Online Access: | https://ieeexplore.ieee.org/document/8611069/ |
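The abstract describes generating the bag representation in multi-instance learning through a feature transformation rather than an attention-weighted sum, with an extra generator loss added to the objective. The paper's exact architecture and loss are not given in this record, so the following is only a minimal sketch of that idea: a hypothetical PyTorch module that transforms a pooled bag of sentence encodings into a shared representation, with an assumed L2-style generator loss against a learned relation embedding standing in for the unspecified generator loss.

```python
import torch
import torch.nn as nn

class SharedRepresentationGenerator(nn.Module):
    """Illustrative sketch only: map a bag of sentence encodings to one bag
    representation via a feature transformation instead of an attention-weighted
    sum. The paper's actual generator architecture and generator loss are not
    specified in the abstract; the choices below (mean-pool + two-layer MLP,
    L2 distance to a learned relation embedding) are assumptions."""

    def __init__(self, sent_dim: int, num_relations: int):
        super().__init__()
        # Feature transformation that generates the shared bag representation.
        self.transform = nn.Sequential(
            nn.Linear(sent_dim, sent_dim),
            nn.Tanh(),
            nn.Linear(sent_dim, sent_dim),
        )
        # One reference embedding per relation, used only for the assumed
        # generator loss below.
        self.rel_embed = nn.Embedding(num_relations, sent_dim)
        self.classifier = nn.Linear(sent_dim, num_relations)

    def forward(self, bag: torch.Tensor, relation: torch.Tensor):
        # bag: [num_sentences, sent_dim] encodings of one bag
        # relation: scalar id of the bag's distantly supervised label
        pooled = bag.mean(dim=0)            # summarize the bag
        shared = self.transform(pooled)     # generated bag representation
        logits = self.classifier(shared)
        # Assumed generator loss: pull the generated representation toward
        # the embedding of the labelled relation.
        gen_loss = torch.dist(shared, self.rel_embed(relation))
        return logits, gen_loss


# Usage sketch: total objective = classification loss + weighted generator loss.
model = SharedRepresentationGenerator(sent_dim=230, num_relations=53)
bag = torch.randn(4, 230)                   # 4 sentences in one bag
label = torch.tensor(7)
logits, gen_loss = model(bag, label)
loss = nn.functional.cross_entropy(logits.unsqueeze(0), label.unsqueeze(0)) + 0.1 * gen_loss
loss.backward()
```

The 0.1 weight on the generator loss is likewise illustrative; the record does not state how the two terms are balanced.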
id |
doaj-f253329e689f4e53aec96c1bc659adba |
---|---|
record_format |
Article |
spelling |
doaj-f253329e689f4e53aec96c1bc659adba | 2021-03-29T22:18:32Z | eng | IEEE | IEEE Access, ISSN 2169-3536 | 2019-01-01 | vol. 7, pp. 31672-31680 | DOI 10.1109/ACCESS.2019.2892724 | article no. 8611069 | Shared Representation Generator for Relation Extraction With Piecewise-LSTM Convolutional Neural Networks | Danfeng Yan (State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing, China); Bo Hu (https://orcid.org/0000-0003-2485-7589; State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, Beijing, China) | https://ieeexplore.ieee.org/document/8611069/ | Distant supervision; relation extraction; shared representation; generator loss; BiLSTM |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Danfeng Yan; Bo Hu |
title |
Shared Representation Generator for Relation Extraction With Piecewise-LSTM Convolutional Neural Networks |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2019-01-01 |
description |
Traditional distant supervision for relation extraction suffers from the noise it introduces into the training data. In this paper, we present a shared representation generator that de-emphasizes noisy expressions by extracting the features that instances of a relation share. Unlike the widely used attention mechanism, which computes a weighted sum over instances, we generate the bag representation in multi-instance learning directly through a feature transformation that retains only the semantics relevant to predicting the relation. We introduce a generator loss into the objective function to improve the quality of the shared representation, and the structure of the proposed generator is flexible and scalable. To capture more structural information, the piecewise convolutional neural network (PCNN) is widely used to divide the output of the convolutional layer into three segments, but this segmentation breaks the consistency and internal relationships of the sentence. To alleviate this issue, we encode each sentence with a piecewise-LSTM convolutional neural network (PLSTM-CNN), which applies a BiLSTM after the pooling layer of the PCNN. The experimental results show that our approach achieves significant improvements in relation extraction over the baselines. |
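The description says the PLSTM-CNN sentence encoder applies a BiLSTM after the pooling layer of a PCNN, whose piecewise max-pooling splits the convolutional output into three segments at the two entity positions. How the pooled segments are fed to the BiLSTM is not specified in this record, so the sketch below assumes they form a length-3 sequence; all dimensions and hyper-parameters are illustrative, not the paper's.

```python
import torch
import torch.nn as nn

class PLSTMCNNEncoder(nn.Module):
    """Sketch of a piecewise-LSTM CNN sentence encoder as described in the
    abstract: a PCNN-style convolution with piecewise max-pooling over three
    segments (split at the two entity positions), followed by a BiLSTM.
    Treating the three pooled segment vectors as a length-3 sequence for the
    BiLSTM is our assumption; hyper-parameters are illustrative."""

    def __init__(self, emb_dim: int = 60, num_filters: int = 230, window: int = 3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=window, padding=window // 2)
        self.bilstm = nn.LSTM(num_filters, num_filters // 2, bidirectional=True, batch_first=True)

    def forward(self, embeddings: torch.Tensor, e1_pos: int, e2_pos: int) -> torch.Tensor:
        # embeddings: [seq_len, emb_dim] word (+ position) embeddings of one sentence,
        # with 0 <= e1_pos < e2_pos < seq_len - 1 so that all three segments are non-empty.
        x = self.conv(embeddings.t().unsqueeze(0))      # [1, num_filters, seq_len]
        x = x.squeeze(0).t()                            # [seq_len, num_filters]
        # Piecewise max-pooling: three segments split at the entity positions.
        segments = [x[: e1_pos + 1], x[e1_pos + 1 : e2_pos + 1], x[e2_pos + 1 :]]
        pooled = torch.stack([seg.max(dim=0).values for seg in segments])  # [3, num_filters]
        # BiLSTM over the three pooled segments, intended to recover the ordering
        # information that plain piecewise pooling discards.
        out, _ = self.bilstm(pooled.unsqueeze(0))       # [1, 3, num_filters]
        return out.reshape(-1)                          # flattened sentence representation


encoder = PLSTMCNNEncoder()
sent = torch.randn(20, 60)          # 20 tokens, 60-dim embeddings
rep = encoder(sent, e1_pos=4, e2_pos=11)
print(rep.shape)                    # torch.Size([690]) with the defaults above
```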
topic |
Distant supervision; relation extraction; shared representation; generator loss; BiLSTM |
url |
https://ieeexplore.ieee.org/document/8611069/ |
_version_ |
1724191909409193984 |