Capsule Networks With Word-Attention Dynamic Routing for Cultural Relics Relation Extraction

Bibliographic Details
Main Authors: Min Zhang, Guohua Geng
Format: Article
Language:English
Published: IEEE 2020-01-01
Series:IEEE Access
Online Access:https://ieeexplore.ieee.org/document/9095302/
Description
Summary:Online museums and online cultural relic information provide abundant data for relation extraction research. However, for relation extraction tasks that must model spatial information, the spatially insensitive convolutional neural networks and long short-term memory networks used in most current work still struggle with rich text structures, so models encode sentences poorly and lack expressive power over the text. To address this issue, we propose WAtt-Capsnet, a framework based on capsule networks with word-attention dynamic routing that captures richer instantiation features for relation extraction from online cultural relic data. We further present a combination embedding that captures the rich internal structure of Chinese sentences by combining word embeddings, part-of-speech tags, character embeddings and word positions. More importantly, to reduce the decay of useful information in long sentences, we propose a routing algorithm based on a word-attention mechanism that focuses on informative words. Experimental results demonstrate that the proposed method achieves strong performance on the relation extraction task of online cultural relic data.
ISSN:2169-3536
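
The abstract describes the architecture only at a high level. The sketch below (in PyTorch, not taken from the paper) illustrates one plausible reading of its two main ideas: a combination embedding that concatenates word, character, part-of-speech and position features, and a dynamic-routing step whose logits are initialised and biased by word-level attention so that informative words weigh more in the relation capsules. The class names, layer sizes, linear attention scorer and normalisation choice are all assumptions made for illustration, not the authors' implementation.

# Illustrative sketch under the assumptions stated above; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    # Standard capsule squashing nonlinearity.
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)


class CombinationEmbedding(nn.Module):
    # Concatenates word, character, part-of-speech and position embeddings into
    # one token representation (character features are simplified here to a
    # single id per token for brevity).
    def __init__(self, vocab, chars, pos_tags, max_len,
                 w_dim=50, c_dim=20, p_dim=10, d_dim=10):
        super().__init__()
        self.word = nn.Embedding(vocab, w_dim)
        self.char = nn.Embedding(chars, c_dim)
        self.pos = nn.Embedding(pos_tags, p_dim)
        self.dist = nn.Embedding(2 * max_len, d_dim)  # position relative to entities

    def forward(self, w_ids, c_ids, p_ids, d_ids):
        return torch.cat([self.word(w_ids), self.char(c_ids),
                          self.pos(p_ids), self.dist(d_ids)], dim=-1)


class WordAttentionRouting(nn.Module):
    # Dynamic routing in which a word-level attention score initialises and
    # biases the routing logits, so informative words contribute more to the
    # relation capsules.
    def __init__(self, in_dim, out_caps, out_dim, iters=3):
        super().__init__()
        self.W = nn.Parameter(0.01 * torch.randn(out_caps, in_dim, out_dim))
        self.attn = nn.Linear(in_dim, 1)   # scores each word's informativeness
        self.iters = iters

    def forward(self, u):                  # u: (batch, words, in_dim)
        # Prediction vectors u_hat[b, j, i, :] for word i and output capsule j.
        u_hat = torch.einsum('bni,jid->bjnd', u, self.W)
        a = self.attn(u).squeeze(-1)       # (batch, words) attention scores
        b = a.unsqueeze(1).expand(-1, u_hat.size(1), -1).clone()
        for _ in range(self.iters):
            c = F.softmax(b, dim=-1)                           # weights over words
            v = squash((c.unsqueeze(-1) * u_hat).sum(dim=2))   # (batch, out_caps, out_dim)
            b = b + (u_hat * v.unsqueeze(2)).sum(dim=-1)       # agreement update
        return v


# Toy forward pass: 2 sentences of 10 tokens, 5 relation classes.
emb = CombinationEmbedding(vocab=5000, chars=3000, pos_tags=40, max_len=10)
rout = WordAttentionRouting(in_dim=90, out_caps=5, out_dim=16)   # 90 = 50+20+10+10
rand = lambda high: torch.randint(0, high, (2, 10))
caps = rout(emb(rand(5000), rand(3000), rand(40), rand(20)))
print(caps.shape)   # torch.Size([2, 5, 16]): one capsule vector per relation class

Note that the original dynamic-routing formulation normalises the softmax over output capsules; normalising over words, as above, is a common variant in text capsule networks and is only one possible reading of the abstract.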