27 May 2022

ACL 2022

Authors: Angus Brayne, Maciej Wiatrak, Dane Corneil

Abstract

In the real world, many relational facts require context; for instance, a politician holds a given elected position only for a particular timespan. This context (the timespan) is typically ignored in knowledge graph link prediction tasks, or is leveraged by models designed specifically to make use of it (i.e., n-ary link prediction models). Here, we show that the task of n-ary link prediction is easily performed using language models, applied with a basic method for constructing cloze-style query sentences. We introduce a pre-training methodology based on an auxiliary entity-linked corpus that outperforms other popular pre-trained models like BERT, even with a smaller model. This methodology also enables n-ary link prediction without access to any n-ary training set, which can be invaluable in circumstances where expensive and time-consuming curation of n-ary knowledge graphs is not feasible. We achieve state-of-the-art performance on the primary n-ary link prediction dataset WD50K and on WikiPeople facts that include literals, which are typically ignored by knowledge graph embedding methods.
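
To make the cloze-style query construction concrete, below is a minimal illustrative sketch of how an n-ary fact (a base triple plus a temporal qualifier) might be verbalised as a masked sentence and scored with an off-the-shelf masked language model. The sentence template, model name, and example fact are assumptions for illustration only; they are not the exact templates or the entity-linked pre-trained model described in the paper.

```python
# Illustrative sketch only (not the paper's exact templates or model):
# verbalise an n-ary fact as a cloze-style query and rank candidate
# entities with an off-the-shelf masked language model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")  # placeholder model

# An n-ary fact: a base (subject, relation, object) triple plus a qualifier
# supplying temporal context; the object slot is masked for link prediction.
subject = "Barack Obama"
relation = "held the position of"
qualifier = "from 2009 to 2017"
query = f"{subject} {relation} {fill_mask.tokenizer.mask_token} {qualifier}."

# Top single-token predictions for the masked slot; multi-token entities
# would require a different decoding or scoring strategy.
for prediction in fill_mask(query, top_k=5):
    print(f"{prediction['token_str']:>12}  {prediction['score']:.4f}")
```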

