techminis

A naukri.com initiative

Image Credit: Arxiv

Learning Efficient and Generalizable Graph Retriever for Knowledge-Graph Question Answering

  • Large Language Models (LLMs) suffer from outdated knowledge and hallucinations, hindering their reliability.
  • Retrieval-Augmented Generation helps ground LLMs with external knowledge, but current pipelines mostly use unstructured text, limiting interpretability and structured reasoning.
  • Knowledge graphs offer a more structured and compact alternative to unstructured text, representing facts as relational triples.
  • Recent studies have integrated knowledge graphs with LLMs for Knowledge Graph Question Answering (KGQA), often using a retrieve-then-reason paradigm.
  • Graph-based retrievers in KGQA have shown strong empirical performance but struggle to generalize beyond their training distribution.
  • A new framework called RAPL is proposed for efficient and effective graph retrieval in KGQA.
  • RAPL addresses limitations through a two-stage labeling strategy, a model-agnostic graph transformation approach, and a path-based reasoning strategy.
  • The two-stage labeling strategy combines heuristic signals with parametric models to provide causally grounded supervision.
  • The model-agnostic graph transformation captures intra- and inter-triple interactions, enhancing representational capacity.
  • The path-based reasoning strategy enables learning from rational knowledge injections and supports downstream reasoners with structured inputs.
  • Empirically, RAPL outperforms state-of-the-art methods by 2.66%-20.34% and narrows the performance gap both across different LLM-based reasoners and in cross-dataset settings.
  • The framework highlights superior retrieval capability and generalizability in KGQA.
  • Codes for RAPL are available at: https://github.com/tianyao-aka/RAPL.
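To make the retrieve-then-reason paradigm concrete, here is a minimal, illustrative Python sketch: facts are stored as relational triples, a retriever collects triples reachable from the question's topic entities, and the evidence is serialized into a prompt for an LLM reasoner. This is a toy example under assumed data and function names, not RAPL's implementation:

```python
# Hypothetical toy knowledge graph: facts as (head, relation, tail) triples.
TRIPLES = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Warsaw", "capital_of", "Poland"),
]

def retrieve(question_entities, triples, hops=2):
    """Retrieve step: collect triples reachable from the topic entities
    within a fixed number of hops (a simple breadth-first expansion)."""
    frontier, selected = set(question_entities), []
    for _ in range(hops):
        next_frontier = set()
        for h, r, t in triples:
            if h in frontier and (h, r, t) not in selected:
                selected.append((h, r, t))
                next_frontier.add(t)
        frontier = next_frontier
    return selected

def to_prompt(question, triples):
    """Reason step: serialize the retrieved triples as structured input
    for a downstream LLM reasoner."""
    facts = "\n".join(f"({h}, {r}, {t})" for h, r, t in triples)
    return f"Facts:\n{facts}\nQuestion: {question}"

evidence = retrieve({"Marie Curie"}, TRIPLES)
prompt = to_prompt("Which country was Marie Curie born in?", evidence)
```

Grounding the reasoner in retrieved triples rather than free text is what gives the KGQA pipeline its interpretability: each answer can be traced back to explicit facts.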
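One simple way to picture the inter-triple interactions mentioned above is to lift the graph to the triple level: each triple becomes a node, and two triple-nodes are linked when they share an entity. The sketch below illustrates that general idea only; RAPL's actual transformation is described in the paper, and all names here are hypothetical:

```python
from itertools import combinations

# Hypothetical toy data; each element is a (head, relation, tail) triple.
TRIPLES = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Warsaw", "capital_of", "Poland"),
]

def triples_to_triple_graph(triples):
    """Build a triple-level graph: each triple is a node (its head,
    relation, and tail form the intra-triple context), and an edge links
    two triple-nodes whenever they share an entity (inter-triple)."""
    nodes = list(triples)
    edges = []
    for i, j in combinations(range(len(nodes)), 2):
        ents_i = {nodes[i][0], nodes[i][2]}
        ents_j = {nodes[j][0], nodes[j][2]}
        if ents_i & ents_j:
            edges.append((i, j))
    return nodes, edges

nodes, edges = triples_to_triple_graph(TRIPLES)
```

Because this rewiring happens before any learned component sees the graph, a transformation of this kind is model-agnostic: any message-passing network can then operate on the triple-level graph.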
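Path-based reasoning can likewise be illustrated with a small sketch: enumerate entity-relation paths from a topic entity to a candidate answer and hand them to the reasoner as structured evidence. The helper `find_paths` below is a hypothetical depth-first search, not the paper's algorithm:

```python
# Hypothetical toy data; each element is a (head, relation, tail) triple.
TRIPLES = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Warsaw", "capital_of", "Poland"),
]

def find_paths(triples, start, goal, max_len=3):
    """DFS over triples to enumerate entity-relation paths start -> goal,
    returned as lists of triples (the structured input for a reasoner)."""
    paths = []
    def dfs(node, path, visited):
        if node == goal and path:
            paths.append(path)
            return
        if len(path) >= max_len:
            return
        for h, r, t in triples:
            if h == node and t not in visited:
                dfs(t, path + [(h, r, t)], visited | {t})
    dfs(start, [], {start})
    return paths

paths = find_paths(TRIPLES, "Marie Curie", "Poland")
```

Feeding such paths, rather than an unordered bag of triples, gives the downstream reasoner an explicit chain of evidence to follow.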
