We propose KGT5-context, a simple sequence-to-sequence model for link prediction (LP) in knowledge graphs (KG). Our work expands on KGT5, a recent LP model that exploits textual features of the KG, has small model size, and is scalable. To reach good predictive performance, however, KGT5 relies on an ensemble with a knowledge graph embedding model, which itself is excessively large and costly to use. In this short paper, we show empirically that adding contextual information — i.e., information about the direct neighborhood of the query entity — alleviates the need for a separate KGE model to obtain good performance. The resulting KGT5-context model is simple, reduces model size significantly, and obtains state-of-the-art performance in our experimental study.
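To make the core idea concrete, below is a minimal sketch of how a query entity's direct neighborhood could be verbalized into the input of a KGT5-style sequence-to-sequence link predictor. The exact template, task prefix, and neighbor-truncation policy are assumptions for illustration; the abstract does not specify the verbalization used by KGT5-context.

```python
# Hypothetical sketch: contextualized query verbalization for a
# KGT5-style text-to-text link predictor. The input template and the
# truncation of the neighborhood are illustrative assumptions, not
# the paper's exact format.

def verbalize_query(head: str, relation: str,
                    neighbors: list[tuple[str, str]],
                    max_neighbors: int = 10) -> str:
    """Build a text input for the tail-prediction query (head, relation, ?),
    augmented with one-hop facts about the query entity as context."""
    # Core query, as in plain KGT5-style models: entity and relation mentions.
    parts = [f"predict tail: {head} | {relation}"]
    # Contextual information: facts from the direct neighborhood of the
    # query entity. Real KGs can have very high-degree entities, so some
    # truncation or sampling would be needed in practice.
    for rel, tail in neighbors[:max_neighbors]:
        parts.append(f"context: {rel} | {tail}")
    return " ".join(parts)


if __name__ == "__main__":
    neighborhood = [
        ("born in", "Ulm"),
        ("field of work", "physics"),
    ]
    print(verbalize_query("Albert Einstein", "educated at", neighborhood))
```

The resulting string would be fed to the encoder, with the decoder generating the target entity's textual mention; this keeps the model small, since no entity embedding table is required.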
Adrian Kochsiek, Apoorv Saxena, Inderjeet Nair, and Rainer Gemulla. 2023. Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction. In Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023), pages 131–138, Toronto, Canada. Association for Computational Linguistics.
@inproceedings{kochsiek-etal-2023-friendly,
    title = "Friendly Neighbors: Contextualized Sequence-to-Sequence Link Prediction",
    author = "Kochsiek, Adrian and
      Saxena, Apoorv and
      Nair, Inderjeet and
      Gemulla, Rainer",
    editor = "Can, Burcu and
      Mozes, Maximilian and
      Cahyawijaya, Samuel and
      Saphra, Naomi and
      Kassner, Nora and
      Ravfogel, Shauli and
      Ravichander, Abhilasha and
      Zhao, Chen and
      Augenstein, Isabelle and
      Rogers, Anna and
      Cho, Kyunghyun and
      Grefenstette, Edward and
      Voita, Lena",
    booktitle = "Proceedings of the 8th Workshop on Representation Learning for NLP (RepL4NLP 2023)",
    month = jul,
    year = "2023",
    address = "Toronto, Canada",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2023.repl4nlp-1.11/",
    doi = "10.18653/v1/2023.repl4nlp-1.11",
    pages = "131--138",
    abstract = "We propose KGT5-context, a simple sequence-to-sequence model for link prediction (LP) in knowledge graphs (KG). Our work expands on KGT5, a recent LP model that exploits textual features of the KG, has small model size, and is scalable. To reach good predictive performance, however, KGT5 relies on an ensemble with a knowledge graph embedding model, which itself is excessively large and costly to use. In this short paper, we show empirically that adding contextual information {---} i.e., information about the direct neighborhood of the query entity {---} alleviates the need for a separate KGE model to obtain good performance. The resulting KGT5-context model is simple, reduces model size significantly, and obtains state-of-the-art performance in our experimental study."
}