Shared Representation Generator for Relation Extraction With Piecewise-LSTM Convolutional Neural Networks

Bibliographic Details
Main Authors: Danfeng Yan, Bo Hu
Format: Article
Language: English
Published: IEEE 2019-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/8611069/
Description
Summary: Traditional distant supervision for relation extraction suffers from the problem of introducing noise. In this paper, we present a shared representation generator that de-emphasizes noisy expressions by extracting features common to a relation. Unlike the widely used attention mechanism, which computes a weighted sum, we directly generate the bag representation in multi-instance learning through feature transformation, retaining only the semantics relevant to predicting the relation. We introduce a generator loss into the objective function to improve the quality of the shared representation, and the structure of the proposed generator is flexible and scalable. To capture more structural information, the piecewise convolutional neural network (PCNN) is widely used to divide the output of the convolutional layer into three segments, but this approach breaks the consistency and internal relationships of the sentence. To alleviate this issue, we encode the sentence with a piecewise-LSTM convolutional neural network (PLSTM-CNN), which applies a BiLSTM after the pooling layer of the PCNN. The experimental results show significant improvements in relation extraction compared with the baselines.
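
As a concrete illustration of the architecture described in the summary, the following PyTorch sketch shows one plausible reading of the PLSTM-CNN encoder (a 1-D convolution, piecewise max pooling around the two entities, then a BiLSTM over the three pooled segments) and of a shared representation generator that maps a bag's instance representations to a bag representation by feature transformation rather than an attention-weighted sum. All class names, layer sizes, and wiring choices here are illustrative assumptions; the paper's exact architecture, hyperparameters, and generator loss are not given in this record.

```python
# Minimal sketch, assuming the BiLSTM runs over the three piecewise-pooled
# segment vectors of a PCNN, and that the generator replaces attention with a
# simple feature transformation of the bag's instance representations.
import torch
import torch.nn as nn


class PLSTMCNNEncoder(nn.Module):
    def __init__(self, vocab_size=20000, word_dim=50, pos_dim=5,
                 max_len=120, num_filters=230, kernel_size=3, lstm_hidden=100):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Two position-embedding tables: relative distance to head and tail entity.
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim)
        in_dim = word_dim + 2 * pos_dim
        self.conv = nn.Conv1d(in_dim, num_filters, kernel_size, padding=kernel_size // 2)
        # BiLSTM over the three piecewise-pooled segments (a length-3 sequence).
        self.bilstm = nn.LSTM(num_filters, lstm_hidden, batch_first=True, bidirectional=True)

    def forward(self, tokens, pos1, pos2, head_idx, tail_idx):
        # tokens, pos1, pos2: (batch, seq_len); head_idx, tail_idx: (batch,)
        x = torch.cat([self.word_emb(tokens), self.pos1_emb(pos1), self.pos2_emb(pos2)], dim=-1)
        h = torch.relu(self.conv(x.transpose(1, 2)))       # (batch, num_filters, seq_len)
        segments = []
        for b in range(h.size(0)):
            i, j = sorted((head_idx[b].item(), tail_idx[b].item()))
            # Split the convolution output into three pieces around the two entities
            # and max-pool each piece, as in PCNN's piecewise pooling.
            pieces = [h[b, :, : i + 1], h[b, :, i + 1: j + 1], h[b, :, j + 1:]]
            pooled = [p.max(dim=-1).values if p.size(-1) > 0 else h.new_zeros(h.size(1))
                      for p in pieces]                      # three (num_filters,) vectors
            segments.append(torch.stack(pooled))            # (3, num_filters)
        seg = torch.stack(segments)                         # (batch, 3, num_filters)
        out, _ = self.bilstm(seg)                           # (batch, 3, 2 * lstm_hidden)
        return out.reshape(out.size(0), -1)                 # sentence representation


class BagRepresentationGenerator(nn.Module):
    """Sketch of the shared representation generator: instead of an
    attention-weighted sum, a bag's instance representations are passed through
    a feature transformation whose output serves as the bag representation."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.transform = nn.Sequential(nn.Linear(in_dim, out_dim), nn.Tanh())

    def forward(self, instance_reprs):
        # instance_reprs: (bag_size, in_dim) sentence representations of one bag.
        # Mean-then-transform is an illustrative choice only; the published
        # generator structure and its generator loss term may differ.
        return self.transform(instance_reprs.mean(dim=0))
```

In this sketch the generator is deliberately simple (pool then transform); the abstract's point is that the bag representation is produced by a learned transformation trained with an additional generator loss, rather than by weighting individual sentences, so any feed-forward structure of sufficient capacity could play that role.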
ISSN: 2169-3536