Explore the challenges and potential solutions in meta-learning through this comprehensive seminar on theoretical machine learning. Delve into hyperparameter optimization, automatic model selection, and program induction as Ke Li, a Member of the School of Mathematics at the Institute for Advanced Study, presents "Meta-Learning: Why It's Hard and What We Can Do." Gain insights into design considerations, proof frameworks, and optimization-based meta-learning. Examine objective functions, methods for preventing overfitting, and forward dynamics in both deterministic and stochastic settings. Understand the original and new formulations of the meta-learning problem, with practical examples and illustrations. Investigate the role of neural networks, parameter intervals, and gradients in meta-learning experiments. Learn about empirical learning techniques and strategies to improve and accelerate the learning process.
Meta-Learning - Why It’s Hard and What We Can Do - Ke Li
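To make the idea of optimization-based meta-learning mentioned in the description concrete, here is a minimal, self-contained sketch of a MAML-style inner/outer gradient loop on a toy family of 1-D linear regression tasks. This is an illustration only, not the method presented in the talk; the toy task, step sizes, and variable names are all assumptions.

```python
# Minimal sketch of optimization-based meta-learning (MAML-style) on a toy
# family of 1-D linear regression tasks y = a * x. We meta-learn a scalar
# initialization w0 that adapts quickly after one inner gradient step.
# All quantities here are illustrative and not taken from the talk.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 0.1, 0.05      # inner-loop and outer-loop step sizes (assumed values)
w0 = 0.0                     # meta-learned initialization for the scalar weight

for step in range(500):
    a = rng.uniform(-2.0, 2.0)           # sample a task: the true slope
    x = rng.normal(size=20)
    c = np.mean(x ** 2)

    # Inner loop: one gradient step on the task loss L(w) = mean((w*x - a*x)^2),
    # whose gradient is 2*c*(w - a).
    grad_inner = 2.0 * c * (w0 - a)
    w_adapted = w0 - alpha * grad_inner

    # Outer loop: gradient of the post-adaptation loss with respect to the
    # initialization w0, differentiating through the inner update.
    grad_outer = 2.0 * c * (w_adapted - a) * (1.0 - 2.0 * alpha * c)
    w0 -= beta * grad_outer

print("meta-learned initialization w0 =", w0)
```

The key point the sketch shows is the bi-level structure: the inner update adapts to a sampled task, while the outer update differentiates through that adaptation step to improve the shared initialization.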