SIGIR 2024 M3.3 [fp] Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models
Description:
Explore an approach to improving multimodal large language models in this 14-minute conference talk from SIGIR 2024. The paper, "Self-Improving Teacher Cultivates Better Student: Distillation Calibration for Multimodal Large Language Models" by Xinwei Li, Li Lin, Shuai Wang, and Chen Qian, describes how a self-improving teacher model, paired with a calibrated distillation process, can cultivate a stronger student model. The talk offers insight into how self-improvement and distillation calibration can enhance the performance and capabilities of multimodal AI systems.
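The description does not detail the authors' calibration technique. As general background on the teacher-student setup the talk builds on, here is a minimal sketch of standard knowledge distillation with temperature-scaled soft targets; the function name and the temperature and alpha parameters are illustrative assumptions, not the method proposed in the paper.

```python
# Illustrative only: generic teacher-student distillation with temperature
# scaling. This is NOT the calibration method proposed in the SIGIR 2024 paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Combine soft-label KL distillation with hard-label cross-entropy."""
    # Soften both distributions; a higher temperature exposes more of the
    # teacher's relative preferences over non-target classes.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 so gradient magnitudes stay comparable
    # across different temperature settings.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```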