Embedding Logic Rules Into Recurrent Neural Networks
Incorporating prior knowledge into recurrent neural networks (RNNs) is of great importance for many natural language processing tasks. However, most prior knowledge takes the form of structured knowledge, which is difficult to exploit within existing RNN frameworks. By extracting logic rules from the structured knowledge and embedding the extracted rules into the RNN, this paper proposes an effective framework for incorporating prior information into RNN models.
Main Authors: | Bingfeng Chen, Zhifeng Hao, Xiaofeng Cai, Ruichu Cai, Wen Wen, Jian Zhu, Guangqiang Xie |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2019-01-01 |
Series: | IEEE Access |
Subjects: | RNN; logic rules; sentiment classification; named entity recognition |
Online Access: | https://ieeexplore.ieee.org/document/8610074/ |
id |
doaj-ad373f848d954e51b66d2899fc3d0bdd |
---|---|
record_format |
Article |
spelling |
doaj-ad373f848d954e51b66d2899fc3d0bdd | 2021-03-29T22:36:10Z | eng | IEEE | IEEE Access | ISSN 2169-3536 | 2019-01-01 | vol. 7, pp. 14938-14946 | DOI 10.1109/ACCESS.2019.2892140 | article 8610074
Embedding Logic Rules Into Recurrent Neural Networks
Authors: Bingfeng Chen (ORCID: 0000-0002-9449-5424), Zhifeng Hao (ORCID: 0000-0002-9257-2895), Xiaofeng Cai, Ruichu Cai (ORCID: 0000-0001-8972-167X), Wen Wen, Jian Zhu, Guangqiang Xie; all with the Department of Computer Science, Guangdong University of Technology, Guangzhou, China
Abstract: Incorporating prior knowledge into recurrent neural networks (RNNs) is of great importance for many natural language processing tasks. However, most prior knowledge takes the form of structured knowledge, which is difficult to exploit within existing RNN frameworks. By extracting logic rules from the structured knowledge and embedding the extracted rules into the RNN, this paper proposes an effective framework for incorporating prior information into RNN models. First, we demonstrate that commonly used prior knowledge, including knowledge graphs, social graphs, and syntactic dependencies, can be decomposed into sets of logic rules. Second, we present a technique for embedding a set of logic rules into the RNN by way of feedback masks. Finally, we apply the proposed approach to sentiment classification and named entity recognition tasks. Extensive experimental results verify the effectiveness of the embedding approach. The encouraging results suggest that the proposed approach has potential applications in other NLP tasks.
URL: https://ieeexplore.ieee.org/document/8610074/
Keywords: RNN; logic rules; sentiment classification; named entity recognition |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Bingfeng Chen; Zhifeng Hao; Xiaofeng Cai; Ruichu Cai; Wen Wen; Jian Zhu; Guangqiang Xie |
spellingShingle |
Bingfeng Chen; Zhifeng Hao; Xiaofeng Cai; Ruichu Cai; Wen Wen; Jian Zhu; Guangqiang Xie | Embedding Logic Rules Into Recurrent Neural Networks | IEEE Access | RNN; logic rules; sentiment classification; named entity recognition |
author_facet |
Bingfeng Chen; Zhifeng Hao; Xiaofeng Cai; Ruichu Cai; Wen Wen; Jian Zhu; Guangqiang Xie |
author_sort |
Bingfeng Chen |
title |
Embedding Logic Rules Into Recurrent Neural Networks |
title_short |
Embedding Logic Rules Into Recurrent Neural Networks |
title_full |
Embedding Logic Rules Into Recurrent Neural Networks |
title_fullStr |
Embedding Logic Rules Into Recurrent Neural Networks |
title_full_unstemmed |
Embedding Logic Rules Into Recurrent Neural Networks |
title_sort |
embedding logic rules into recurrent neural networks |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2019-01-01 |
description |
Incorporating prior knowledge into recurrent neural networks (RNNs) is of great importance for many natural language processing tasks. However, most prior knowledge takes the form of structured knowledge, which is difficult to exploit within existing RNN frameworks. By extracting logic rules from the structured knowledge and embedding the extracted rules into the RNN, this paper proposes an effective framework for incorporating prior information into RNN models. First, we demonstrate that commonly used prior knowledge, including knowledge graphs, social graphs, and syntactic dependencies, can be decomposed into sets of logic rules. Second, we present a technique for embedding a set of logic rules into the RNN by way of feedback masks. Finally, we apply the proposed approach to sentiment classification and named entity recognition tasks. Extensive experimental results verify the effectiveness of the embedding approach. The encouraging results suggest that the proposed approach has potential applications in other NLP tasks. |
topic |
RNN; logic rules; sentiment classification; named entity recognition |
url |
https://ieeexplore.ieee.org/document/8610074/ |
work_keys_str_mv |
AT bingfengchen embeddinglogicrulesintorecurrentneuralnetworks AT zhifenghao embeddinglogicrulesintorecurrentneuralnetworks AT xiaofengcai embeddinglogicrulesintorecurrentneuralnetworks AT ruichucai embeddinglogicrulesintorecurrentneuralnetworks AT wenwen embeddinglogicrulesintorecurrentneuralnetworks AT jianzhu embeddinglogicrulesintorecurrentneuralnetworks AT guangqiangxie embeddinglogicrulesintorecurrentneuralnetworks |
_version_ |
1724191305366503424 |
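The abstract describes embedding logic rules into an RNN "by way of feedback masks." The record does not reproduce the paper's formulation, but the general idea can be sketched as a rule-derived binary mask that gates which hidden units feed back into the next recurrent update. Everything below (the toy named-entity rule, the mask placement, and the sizes) is a hypothetical illustration, not the authors' actual method.

```python
# Hypothetical sketch: a logic rule realized as a feedback mask on an
# RNN hidden state. A binary mask, computed from the rule at each step,
# decides which hidden units carry information forward.
import numpy as np

rng = np.random.default_rng(0)

hidden_size, input_size = 8, 4
W_h = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
W_x = rng.normal(scale=0.1, size=(hidden_size, input_size))
b = np.zeros(hidden_size)

def rule_mask(token_tag, size):
    """Toy logic rule: if the current token is tagged as an entity,
    keep all hidden units; otherwise zero out the second half, which
    we arbitrarily reserve for entity-related features."""
    mask = np.ones(size)
    if token_tag != "ENTITY":
        mask[size // 2:] = 0.0
    return mask

def rnn_step(h_prev, x, token_tag):
    # The mask is applied to the recurrent feedback path, so the rule
    # controls what part of the previous state influences the update.
    m = rule_mask(token_tag, len(h_prev))
    return np.tanh(W_h @ (m * h_prev) + W_x @ x + b)

h = np.zeros(hidden_size)
for tag in ["OTHER", "ENTITY", "OTHER"]:
    x = rng.normal(size=input_size)
    h = rnn_step(h, x, tag)
print(h.shape)  # (8,)
```

A real implementation would derive the masks from rules extracted from a knowledge graph, social graph, or dependency parse, and would apply them inside a trainable cell (e.g., an LSTM); this sketch only shows the gating mechanism itself.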