Chapters:
1. Augment LLMs with Knowledge Graphs
2. Subgraph Retrievers
3. Agents for Integrating LLM and KG
4. New Idea by MIT & HK University
5. Example of Decoding on Graphs
6. Implementation: DoG Prompt
7. Linear Graph Forms
8. Graph-Aware Constrained Decoding
9. Harvard Med: Agents for LLM on KG
Description:
Watch a 29-minute research presentation exploring Decoding on Graphs (DoG), a framework developed by MIT and the University of Hong Kong that enhances Large Language Models' capabilities through Knowledge Graph integration. Learn how DoG employs "well-formed chains", sequences of interconnected fact triplets, to improve question-answering tasks by ensuring LLMs generate responses that align with Knowledge Graph structures. Discover the implementation of graph-aware constrained decoding using trie data structures and beam search, which enables multiple reasoning paths while maintaining accuracy. Explore practical applications through examples, including Harvard Medical implementations, and understand how this framework outperforms existing methods in complex multi-hop reasoning scenarios. Delve into key concepts including subgraph retrievers, LLM-KG integration agents, linear graph forms, and the constrained decoding mechanisms that make this approach both faithful and effective.
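The trie-plus-beam-search mechanism mentioned above can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: the function names, the toy triplets, and the scoring function are all invented for this sketch, and real DoG operates over LLM token sequences rather than whole triplet elements.

```python
def build_trie(sequences):
    """Store linearized KG triplets in a nested-dict trie.

    Only paths present in the trie can ever be decoded, which is
    what keeps generation aligned with the Knowledge Graph.
    """
    root = {}
    for seq in sequences:
        node = root
        for tok in seq:
            node = node.setdefault(tok, {})
        node["<end>"] = {}  # sentinel marking a complete chain
    return root


def allowed_next(trie, prefix):
    """Tokens the constrained decoder may emit after `prefix`."""
    node = trie
    for tok in prefix:
        if tok not in node:
            return set()  # prefix left the graph: nothing is allowed
        node = node[tok]
    return set(node)


def beam_search(trie, score_fn, beam_width=2):
    """Expand beams step by step, keeping only trie-valid continuations.

    `score_fn(prefix, tok)` stands in for the LLM's log-probability of
    emitting `tok` after `prefix` (a hypothetical stand-in here).
    """
    beams = [((), 0.0)]
    finished = []
    while beams:
        candidates = []
        for prefix, score in beams:
            for tok in allowed_next(trie, prefix):
                if tok == "<end>":
                    finished.append((prefix, score))
                else:
                    candidates.append((prefix + (tok,), score + score_fn(prefix, tok)))
        # keep only the top-scoring valid partial chains
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
    return sorted(finished, key=lambda b: b[1], reverse=True)


# Toy usage: two linearized triplets from an (invented) subgraph.
trie = build_trie([
    ("Paris", "capital_of", "France"),
    ("Paris", "located_in", "Europe"),
])
best_chain, _ = beam_search(trie, lambda p, t: 1.0 if t == "capital_of" else 0.5)[0]
```

Because every candidate token is filtered through `allowed_next`, the decoder can explore several reasoning paths in parallel (the beams) yet can never emit a triplet that does not exist in the retrieved subgraph.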

Decoding on Graphs: Empowering LLMs with Knowledge Graphs Through Well-Formed Chains

Discover AI