Learning attention-based representations from multiple patterns for relation prediction in knowledge graphs


Bibliographic Details
Main Authors: Lourenço, V. (Author), Paes, A. (Author)
Format: Article
Language: English
Published: Elsevier B.V. 2022
Subjects:
Online Access: View Fulltext in Publisher
LEADER 02600nam a2200325Ia 4500
001 10.1016-j.knosys.2022.109232
008 220718s2022 CNT 000 0 und d
022 |a 0950-7051 (ISSN) 
245 1 0 |a Learning attention-based representations from multiple patterns for relation prediction in knowledge graphs 
260 0 |b Elsevier B.V.  |c 2022 
856 |z View Fulltext in Publisher  |u https://doi.org/10.1016/j.knosys.2022.109232 
520 3 |a Knowledge bases, and their representations in the form of knowledge graphs (KGs), are naturally incomplete. Since scientific and industrial applications have extensively adopted them, there is a high demand for solutions that complete their information. Several recent works tackle this challenge by learning embeddings for entities and relations, then employing them to predict new relations among the entities. Despite their advances, most of those methods focus only on the local neighbors of a relation to learn the embeddings. As a result, they may fail to capture the KGs’ context information, neglecting long-term dependencies and the propagation of entities’ semantics. In this manuscript, we propose ÆMP (Attention-based Embeddings from Multiple Patterns), a novel model for learning contextualized representations by: (i) acquiring entities’ context information through an attention-enhanced message-passing scheme, which captures their local semantics while focusing on different aspects of their neighborhood; and (ii) capturing the semantic context by leveraging the paths between entities and their relationships. Our empirical findings draw insights into how attention mechanisms can improve entities’ context representations and how combining entity and semantic path contexts improves the general representation of entities and the relation predictions. Experimental results on several large and small knowledge graph benchmarks show that ÆMP either outperforms or competes with state-of-the-art relation prediction methods. © 2022 Elsevier B.V. 
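For illustration only, the attention-enhanced message-passing idea summarized in the abstract can be sketched roughly as follows. This is a minimal NumPy toy with invented entity names, random embeddings, and a single dot-product attention head; it is an assumption-laden sketch of the general technique, not the authors' ÆMP implementation.

# Minimal sketch (not the authors' code) of one attention-enhanced
# message-passing step over a toy knowledge graph. Entity names,
# dimensions, and the scoring function are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Toy KG: (head, relation, tail) triples.
triples = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
    ("Paris", "located_in", "France"),
]

entities = sorted({h for h, _, _ in triples} | {t for _, _, t in triples})
relations = sorted({r for _, r, _ in triples})
ent_emb = {e: rng.normal(size=DIM) for e in entities}
rel_emb = {r: rng.normal(size=DIM) for r in relations}

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def attention_message_passing(entity):
    """Aggregate neighbor messages weighted by attention scores."""
    # Neighbors reached via outgoing triples (h, r, t) with h == entity.
    neigh = [(r, t) for h, r, t in triples if h == entity]
    if not neigh:
        return ent_emb[entity]
    # Message for each neighbor: relation embedding plus tail embedding.
    msgs = np.stack([rel_emb[r] + ent_emb[t] for r, t in neigh])
    # Attention score: dot product between the entity and each message,
    # so different aspects of the neighborhood get different weights.
    scores = msgs @ ent_emb[entity]
    weights = softmax(scores)
    # Contextualized representation: self embedding plus attended messages.
    return ent_emb[entity] + weights @ msgs

print(attention_message_passing("Paris"))

In the paper's terms, repeating such a step and combining the result with representations of multi-hop paths between entities would yield the two context sources the abstract describes; the path component is omitted here for brevity.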
650 0 4 |a Attention mechanism 
650 0 4 |a Attention mechanisms 
650 0 4 |a Context information 
650 0 4 |a Embeddings 
650 0 4 |a Entity contexts 
650 0 4 |a Forecasting 
650 0 4 |a High demand 
650 0 4 |a Knowledge graph 
650 0 4 |a Knowledge graphs 
650 0 4 |a Long-term dependencies 
650 0 4 |a Message passing 
650 0 4 |a Multiple patterns 
650 0 4 |a Representation learning 
650 0 4 |a Semantics 
700 1 |a Lourenço, V.  |e author 
700 1 |a Paes, A.  |e author 
773 |t Knowledge-Based Systems