What is it about?

This work introduces a retrieve-and-read framework for knowledge graph (KG) link prediction. Conventional message-passing aggregates over the entire KG, which leads to superfluous computation and limits scalability to large KGs. Our framework instead first retrieves a subgraph context relevant to the query, and then jointly reasons over the query and that context with a reader. As the reader, we propose a novel Transformer-based GNN that combines graph-structured attention with cross-attention between the query and the context for deep fusion.
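To make the two stages concrete, here is a minimal, illustrative sketch (not the paper's implementation): retrieval reduced to collecting the k-hop neighborhood of the query entity from a triple list, and the reader's query-context fusion reduced to a single head of scaled dot-product cross-attention. All function names, parameters, and the toy graph are assumptions for illustration only.

```python
import math
from collections import defaultdict

def retrieve_subgraph(triples, query_entity, hops=2):
    """Retrieve step (illustrative): collect the k-hop neighborhood
    of the query entity instead of using the whole KG."""
    adj = defaultdict(list)
    for h, r, t in triples:
        adj[h].append((r, t))
        adj[t].append((r, h))  # treat edges as undirected for retrieval
    frontier, seen, context = {query_entity}, {query_entity}, []
    for _ in range(hops):
        nxt = set()
        for e in frontier:
            for r, n in adj[e]:
                context.append((e, r, n))
                if n not in seen:
                    seen.add(n)
                    nxt.add(n)
        frontier = nxt
    return context

def cross_attention(query_vec, context_vecs):
    """Read step (illustrative): one head of scaled dot-product
    attention of the query embedding over context embeddings."""
    d = len(query_vec)
    scores = [sum(q * c for q, c in zip(query_vec, cv)) / math.sqrt(d)
              for cv in context_vecs]
    m = max(scores)                      # stabilize the softmax
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum of context embeddings, attended by the query.
    return [sum(w * cv[i] for w, cv in zip(weights, context_vecs))
            for i in range(d)]
```

In the actual model, this fusion is interleaved with graph-structured self-attention inside a Transformer-based GNN; the sketch only shows why a retrieved subgraph, rather than the full KG, is what the reader attends over.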


Why is it important?

It addresses key limitations of message-passing GNNs for KG link prediction. Following the standard message-passing paradigm over the entire KG leads to superfluous computation, over-smoothing of node representations, and limited expressive power. Our Transformer-based GNN reader is more expressive and has the potential to scale better.

Read the Original

This page is a summary of: A Retrieve-and-Read Framework for Knowledge Graph Link Prediction, October 2023, ACM (Association for Computing Machinery),
DOI: 10.1145/3583780.3614769.
You can read the full text via the DOI above.

