Explore the limitations of differentiable programming techniques in machine learning through this in-depth video analysis. Delve into the chaos-based failure mode that affects a range of differentiable systems, from recurrent neural networks to numerical physics simulations. Examine how this failure is tied to the spectrum of the Jacobian of the unrolled system, and learn criteria for predicting when differentiation-based optimization algorithms are likely to falter. Investigate examples in policy learning, meta-learning optimizers, and disk packing to understand the practical implications. Discover potential solutions and consider the advantages of black-box, gradient-free methods in overcoming these challenges.
Gradients Are Not All You Need - Machine Learning Research Paper Explained
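To make the chaos-based failure mode concrete, here is a minimal JAX sketch (my own illustration, not code from the paper or the video). It differentiates through an unrolled logistic map: in the chaotic regime the per-step Jacobian r(1 - 2x_t) often has magnitude greater than 1, so backpropagation, which multiplies these per-step Jacobians together, produces gradients that grow exponentially with the number of unrolled steps. The function name, map, and parameter values are all illustrative choices.

```python
# Minimal sketch (assumption: logistic map as a stand-in for any unrolled
# chaotic system) showing why gradients through long unrolls explode.
import jax
import jax.numpy as jnp

def unrolled_loss(r, x0, steps):
    # Iterate the logistic map x_{t+1} = r * x_t * (1 - x_t), which is
    # chaotic for r near 4, and treat the final state as a scalar loss.
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Gradient of the final state with respect to the parameter r.
# Backprop multiplies the per-step Jacobians r * (1 - 2 * x_t); when their
# magnitudes exceed 1 on average (positive Lyapunov exponent), the product,
# and hence the gradient, blows up exponentially in `steps`.
grad_fn = jax.grad(unrolled_loss)

for steps in (10, 50, 100):
    g = grad_fn(3.9, 0.5, steps)
    print(f"steps={steps:4d}  dloss/dr={g:.3e}")
```

Running this shows the gradient magnitude exploding as the unroll length grows, even though the loss itself stays bounded in [0, 1]. This is the criterion the description alludes to: when the spectrum of the per-step Jacobian sits above 1, differentiation-based optimization through the unroll becomes unreliable, and black-box, gradient-free estimators can be the safer choice.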