Chinese Knowledge Base Question Answering by Attention-Based Multi-Granularity Model
Chinese knowledge base question answering (KBQA) aims to answer questions using facts stored in a knowledge base. The task can be divided into two subtasks: topic entity extraction and relation selection. During topic entity extraction, an entity extraction model is built to locate topic entities in questions, and a Levenshtein Ratio entity linker is proposed for effective entity linking. All subject-predicate-object (SPO) triples relevant to the topic entity are then retrieved from the knowledge base as candidates. For relation selection, an attention-based multi-granularity interaction model (ABMGIM) is proposed, with two main contributions. First, a multi-granularity approach to text embedding: a nested character-level and word-level scheme concatenates the pre-trained embedding of each character with the embedding of the word containing it. Second, a hierarchical matching model is applied for question representation in relation selection, and attention mechanisms provide a fine-grained alignment between characters. Experimental results show that the model achieves competitive performance on a public dataset, demonstrating its effectiveness.
Main Authors: | Cun Shen, Tinglei Huang, Xiao Liang, Feng Li, Kun Fu |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2018-04-01 |
Series: | Information |
Subjects: | knowledge base question answering; topic entity extraction; relation selection; multi-granularity embeddings; attention mechanism |
Online Access: | http://www.mdpi.com/2078-2489/9/4/98 |
id | doaj-b58c123c0208443fac1e6bf44a2c016e |
---|---|
DOI | 10.3390/info9040098 |
ISSN | 2078-2489 |
Author affiliations | Key Laboratory of Technology in Geo-Spatial Information Processing and Application System, Institute of Electronics, Chinese Academy of Sciences, Beijing 100190, China (all five authors) |
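The abstract names a Levenshtein Ratio entity linker for matching question mentions to knowledge-base entity names. The paper's implementation details are not given in this record; the following is a minimal sketch of the general technique, in which the ratio definition, the `threshold` value, and the toy entity list are illustrative assumptions.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic edit distance via dynamic programming (two-row variant)."""
    if len(a) < len(b):
        a, b = b, a
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def levenshtein_ratio(a: str, b: str) -> float:
    """Similarity in [0, 1]; 1.0 means identical strings."""
    if not a and not b:
        return 1.0
    return 1.0 - levenshtein(a, b) / max(len(a), len(b))

def link_entity(mention: str, kb_entities: list[str], threshold: float = 0.5):
    """Return the KB entity name most similar to the mention, or None
    if even the best match falls below the (assumed) threshold."""
    best = max(kb_entities, key=lambda e: levenshtein_ratio(mention, e))
    return best if levenshtein_ratio(mention, best) >= threshold else None
```

Character-level edit distance is a natural fit for Chinese entity linking, since mentions often differ from canonical KB names by only a character or two.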
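The nested character/word embedding described in the abstract can be illustrated as follows. The lookup tables, dimensions, and example question here are toy assumptions, not the paper's actual configuration (which would load real pre-trained vectors).

```python
# Toy lookup tables standing in for pre-trained embeddings (illustrative
# values only; the paper would use trained character and word vectors).
CHAR_DIM, WORD_DIM = 4, 6
CHARS = "中国首都是哪里"                  # "Where is the capital of China?"
WORDS = ["中国", "首都", "是", "哪里"]   # the same question, word-segmented
char_emb = {c: [float(i)] * CHAR_DIM for i, c in enumerate(CHARS)}
word_emb = {w: [float(i)] * WORD_DIM for i, w in enumerate(WORDS)}

def multi_granularity_embed(words):
    """Embed each character by concatenating its character vector with the
    vector of the word that contains it (nested char/word granularity)."""
    return [char_emb[ch] + word_emb[w] for w in words for ch in w]

seq = multi_granularity_embed(WORDS)
# 7 characters in the question, each mapped to a (4 + 6)-dimensional vector.
```

The resulting sequence keeps character-level granularity (one vector per character) while injecting word-level context, which is what lets the model's attention align individual characters against candidate relations.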