Results from comprehensive fine-tuning and distillation from gpt-4o to gpt-4o-mini
Notes on OpenAI Evals and Gemini fine-tuning
Description:
Explore the differences between OpenAI fine-tuning and distillation in this 23-minute video tutorial. Learn why these techniques are valuable and how distillation differs from fine-tuning. Follow along with a free Colab notebook to implement a simple distillation approach, covering model selection, training-question setup, evaluation, dataset generation, and fine-tuning. Discover advanced techniques for creating larger datasets and review comprehensive results from fine-tuning and distilling GPT-4o to GPT-4o-mini. Gain insights on OpenAI Evals and Gemini fine-tuning, with additional resources provided for synthetic data preparation and model distillation from scratch.
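As a rough illustration of the dataset-generation step described above, the sketch below formats (question, teacher-answer) pairs into the JSONL chat format that OpenAI's fine-tuning API accepts. The helper name, the example pairs, and the output filename are assumptions for illustration; in the actual notebook the answers would come from the teacher model (e.g. gpt-4o) and the file would then be uploaded to fine-tune the student (e.g. gpt-4o-mini).

```python
import json

def build_distillation_dataset(qa_pairs, out_path):
    """Write (question, teacher_answer) pairs as chat fine-tuning JSONL.

    Each record follows the "messages" schema used for chat-model
    fine-tuning: a user turn (the question) and an assistant turn
    (the teacher model's answer, which the student learns to imitate).
    """
    with open(out_path, "w", encoding="utf-8") as f:
        for question, answer in qa_pairs:
            record = {
                "messages": [
                    {"role": "user", "content": question},
                    {"role": "assistant", "content": answer},
                ]
            }
            f.write(json.dumps(record) + "\n")

# Hypothetical teacher outputs, standing in for responses
# collected from the larger model earlier in the pipeline.
pairs = [
    ("What is model distillation?",
     "Training a smaller student model on a larger teacher model's outputs."),
]
build_distillation_dataset(pairs, "distill_train.jsonl")
```

The resulting file would then be uploaded and referenced in a fine-tuning job targeting the smaller student model.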
OpenAI Fine-tuning vs Distillation - Techniques and Implementation