Capsule Networks With Word-Attention Dynamic Routing for Cultural Relics Relation Extraction
Online museums and online cultural relic information provide abundant data for relation extraction research. However, when modelling spatial information for relation extraction, the spatially insensitive convolutional neural networks and long short-term memory networks used in most current work struggle with rich text structures, making it difficult for models to encode sentences effectively and limiting their expressive power. To address this issue, we propose WAtt-Capsnet, a framework based on capsule networks with word-attention dynamic routing, for relation extraction on online cultural relic data, capturing richer instantiation features. We further present a combination embedding that captures the characteristics of Chinese sentences by combining word embeddings, part-of-speech tags, character embeddings and word positions, exploiting the rich internal structure of sentences. More importantly, to reduce the decay of useful information in long sentences, we propose a routing algorithm based on a word-attention mechanism that focuses on informative words. Experimental results demonstrate that the proposed method performs well on the relation extraction task for online cultural relic data.
Main Authors: | Min Zhang, Guohua Geng |
---|---|
Format: | Article |
Language: | English |
Published: | IEEE, 2020-01-01 |
Series: | IEEE Access |
Subjects: | Capsule networks; cultural relics; dynamic routing; relation extraction; word-attention mechanism |
Online Access: | https://ieeexplore.ieee.org/document/9095302/ |
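Judging from the abstract, the core idea is to weight capsule dynamic routing by word-level attention so that informative words contribute more to the class capsules. Below is a minimal, speculative sketch of that idea in NumPy; the shapes, the function names (`word_attention_routing`, `squash`), and the exact point where attention enters the routing update are illustrative assumptions, not the paper's actual WAtt-Capsnet equations.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Capsule non-linearity: short vectors shrink toward 0, long ones toward unit norm.
    sq = np.sum(v * v, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * v / np.sqrt(sq + eps)

def word_attention_routing(u_hat, word_att, iters=3):
    """u_hat: (num_words, num_classes, dim) prediction vectors from word capsules.
    word_att: (num_words,) attention scores over words.
    Returns class-capsule vectors of shape (num_classes, dim)."""
    n, c, _ = u_hat.shape
    a = np.exp(word_att - word_att.max())
    a = a / a.sum()                      # softmax over words (attention weights)
    b = np.zeros((n, c))                 # routing logits
    for _ in range(iters):
        cpl = np.exp(b - b.max(axis=1, keepdims=True))
        cpl = cpl / cpl.sum(axis=1, keepdims=True)      # coupling: softmax over classes
        # Assumed fusion: scale each word's contribution by its attention weight.
        s = np.einsum('n,nc,ncd->cd', a, cpl, u_hat)
        v = squash(s)
        b = b + np.einsum('ncd,cd->nc', u_hat, v)       # agreement update
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(12, 5, 8))      # 12 words, 5 relation classes, 8-dim capsules
att = rng.normal(size=12)
v = word_attention_routing(u_hat, att)
print(v.shape)                           # prints (5, 8)
```

The attention softmax here is computed once over the sentence and reused at every routing iteration; the paper may instead recompute or learn it jointly with the coupling coefficients.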
id |
doaj-061b3ee10ff9429f98cfe28d33eabca4 |
record_format |
Article |
spelling |
Min Zhang (ORCID: 0000-0001-9605-9194) and Guohua Geng (ORCID: 0000-0002-4234-2119), School of Information Science and Technology, Northwest University, Xi'an, China. "Capsule Networks With Word-Attention Dynamic Routing for Cultural Relics Relation Extraction." IEEE Access, vol. 8, pp. 94236-94244, 2020-01-01. ISSN 2169-3536. DOI: 10.1109/ACCESS.2020.2995447. Article number 9095302. https://ieeexplore.ieee.org/document/9095302/ |
collection |
DOAJ |
language |
English |
format |
Article |
sources |
DOAJ |
author |
Min Zhang; Guohua Geng |
title |
Capsule Networks With Word-Attention Dynamic Routing for Cultural Relics Relation Extraction |
publisher |
IEEE |
series |
IEEE Access |
issn |
2169-3536 |
publishDate |
2020-01-01 |
description |
Online museums and online cultural relic information provide abundant data for relation extraction research. However, when modelling spatial information for relation extraction, the spatially insensitive convolutional neural networks and long short-term memory networks used in most current work struggle with rich text structures, making it difficult for models to encode sentences effectively and limiting their expressive power. To address this issue, we propose WAtt-Capsnet, a framework based on capsule networks with word-attention dynamic routing, for relation extraction on online cultural relic data, capturing richer instantiation features. We further present a combination embedding that captures the characteristics of Chinese sentences by combining word embeddings, part-of-speech tags, character embeddings and word positions, exploiting the rich internal structure of sentences. More importantly, to reduce the decay of useful information in long sentences, we propose a routing algorithm based on a word-attention mechanism that focuses on informative words. Experimental results demonstrate that the proposed method performs well on the relation extraction task for online cultural relic data. |
topic |
Capsule networks; cultural relics; dynamic routing; relation extraction; word-attention mechanism |
url |
https://ieeexplore.ieee.org/document/9095302/ |