SIGIR 2024 W1.4 [fp] MTMS: Multi-teacher Multi-stage Knowledge Distillation
Description:
Learn about an innovative approach to machine reading comprehension in this 21-minute conference presentation from SIGIR 2024. Explore the MTMS (Multi-teacher Multi-stage Knowledge Distillation) framework for reasoning-based machine reading comprehension, presented by researchers Zhao Zhuo, Xie Zhiwen, Zhou Guangyou, and Huang Xiangji. Dive into how multiple teacher models and staged knowledge distillation can enhance the reasoning capabilities of reading comprehension systems, as demonstrated in this Association for Computing Machinery (ACM) session on Question Answering and Summarisation.

Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension

Association for Computing Machinery (ACM)