Pre-Trained Embeddings for Enhancing Multi-Hop Reasoning
Language
EN
Conference paper
This document was published in
IJCAI 2023 Workshop KBCG, International Joint Conference on Artificial Intelligence 2023 Workshop on Knowledge-Based Compositional Generalization, 2023-08-21, Macao.
Abstract (English)
Knowledge graphs are an efficient way to represent heterogeneous data from multiple sources or disciplines by utilizing nodes and their relations. Nevertheless, they are frequently incomplete in terms of the subject they represent. Link prediction methods are used to discover additional links (or even to create new ones) between entities present in the Knowledge Graph (KG). In order to achieve this, multi-hop reasoning models have demonstrated good predictive performance and the ability to generate interpretable decisions, thereby enabling their application in high-stakes domains such as finance and public health. A multi-hop reasoning model usually has two tasks: 1) construct an accurate representation of the entities and relationships of the KG; 2) use these representations to explore the reasoning paths in the KG that support the newly predicted links. In this paper, we investigate how the performance of a multi-hop reasoning model changes when using pre-trained embeddings for the KG's nodes and relations. The experiments conducted on three benchmark datasets, respectively WN18RR, NELL-995 and FB15K-237, suggest that using pre-trained embeddings improves: (i) the predictive performance of multi-hop reasoning models for all three datasets, (ii) the number of newly predicted links, and (iii) the quality of paths used as explanations.
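As a rough illustration of the idea summarized above, the sketch below shows how pre-trained entity and relation embeddings (e.g., produced offline by a separate KG embedding model) might be used to initialize the representation layer of a path-exploring policy network, in contrast to a from-scratch baseline. The class name, dataset sizes, file handling, and scoring architecture here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed setup, not the paper's code): initializing a
# multi-hop reasoning agent's entity/relation embeddings from pre-trained
# KG embeddings instead of learning them from scratch.
import numpy as np
import torch
import torch.nn as nn

# WN18RR-scale sizes; embedding dimension is an arbitrary choice here.
NUM_ENTITIES, NUM_RELATIONS, EMB_DIM = 40943, 11, 200

# Stand-ins for pre-trained embeddings (in practice, loaded from the output
# of an embedding model such as TransE or ComplEx trained on the same KG).
pretrained_entity = np.random.randn(NUM_ENTITIES, EMB_DIM).astype(np.float32)
pretrained_relation = np.random.randn(NUM_RELATIONS, EMB_DIM).astype(np.float32)

class PathPolicy(nn.Module):
    """Toy policy network that scores candidate next entities at each hop."""

    def __init__(self, ent_init=None, rel_init=None, freeze=False):
        super().__init__()
        if ent_init is not None:
            # Pre-trained initialization (optionally frozen or fine-tuned).
            self.ent_emb = nn.Embedding.from_pretrained(
                torch.from_numpy(ent_init), freeze=freeze)
            self.rel_emb = nn.Embedding.from_pretrained(
                torch.from_numpy(rel_init), freeze=freeze)
        else:
            # Baseline: random initialization learned during training.
            self.ent_emb = nn.Embedding(NUM_ENTITIES, EMB_DIM)
            self.rel_emb = nn.Embedding(NUM_RELATIONS, EMB_DIM)
        self.score = nn.Sequential(
            nn.Linear(3 * EMB_DIM, EMB_DIM), nn.ReLU(), nn.Linear(EMB_DIM, 1))

    def forward(self, current_entity, query_relation, candidate_entities):
        # Score each candidate next entity given the current node and the query.
        cur = self.ent_emb(current_entity).unsqueeze(1)    # (B, 1, D)
        rel = self.rel_emb(query_relation).unsqueeze(1)    # (B, 1, D)
        cand = self.ent_emb(candidate_entities)            # (B, K, D)
        feats = torch.cat(
            [cur.expand_as(cand), rel.expand_as(cand), cand], dim=-1)
        return self.score(feats).squeeze(-1)               # (B, K) logits

# Pre-trained initialization vs. the from-scratch baseline being compared.
policy = PathPolicy(pretrained_entity, pretrained_relation, freeze=False)
logits = policy(torch.tensor([0]), torch.tensor([1]), torch.tensor([[2, 3, 4]]))
print(logits.shape)  # torch.Size([1, 3])
```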
Keywords (English)
Link Prediction
XAI
Knowledge Graphs
Research units