Explore the mathematical foundations of deep learning in this comprehensive lecture by Simone Brugiapaglia of Concordia University. Delve into two case studies: rating impossibility theorems in cognitive science applications and practical existence theorems in scientific computing. Examine the limitations of deep learning in generalizing outside the training set for identity effect classification tasks. Discover how universal approximation results for deep neural networks combine with compressed sensing and high-dimensional polynomial approximation theory to yield sufficient conditions for accurate function approximation. Gain insight into ongoing research and open questions spanning AI, computational mathematics, deep neural networks, and approximation theory. Enhance your understanding of the mathematical challenges and potential advances in deep learning through this in-depth presentation.
The Mathematical Foundations of Deep Learning: From Rating Impossibility to Practical Existence Theorems