1. Introduction
2. Collaborators
3. AI Index Report
4. AI Subfields
5. Impact of Deep Learning
6. Computational Mathematics and Deep Learning
7. Deep Learning Skepticism
8. Mathematical Problems for the Next Century
9. Presentation Structure
10. Deep Neural Networks
11. Research Question
12. Can Deep Learning Generalize?
13. The Connectionist
14. Notation
15. General Results
16. Tau
17. Training History
18. Approximation
19. Approximation with Orthogonal Polynomials
20. Approximation Techniques
21. In Practice
22. Compressed Sensing
23. Recap
24. Research Directions
25. Conclusion
Description:
Explore the mathematical foundations of deep learning in this comprehensive lecture by Simone Brugiapaglia from Concordia University. Delve into two case studies: rating impossibility theorems in cognitive science applications and practical existence theorems in scientific computing. Examine the limitations of deep learning in generalizing outside training sets for identity effect classification tasks. Discover how universal approximation results for deep neural networks combine with compressed sensing and high-dimensional polynomial approximation theory to yield sufficient conditions for accurate function approximation. Gain insights into ongoing research and open questions in the field, covering topics such as AI subfields, computational mathematics, deep neural networks, and approximation techniques. Enhance your understanding of the mathematical challenges and potential advancements in deep learning through this in-depth presentation.
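
The description mentions combining compressed sensing with high-dimensional polynomial approximation. As a rough, self-contained illustration of that idea (not material from the lecture itself), the Python sketch below recovers a function with a sparse Legendre expansion from a small number of random samples using orthogonal matching pursuit; the basis size, sample count, and coefficient values are arbitrary choices made for the demo.

    import numpy as np
    from numpy.polynomial import legendre

    rng = np.random.default_rng(0)

    # Target: a function that is sparse in the Legendre basis
    # (coefficients chosen purely for illustration).
    N = 40                                     # number of candidate basis functions
    true_coef = np.zeros(N)
    true_coef[[1, 5, 12]] = [1.0, -0.7, 0.3]   # 3-sparse expansion

    def f(x):
        return legendre.legval(x, true_coef)

    # Few random samples in [-1, 1]: m << N, the compressed-sensing regime.
    m = 25
    x = rng.uniform(-1.0, 1.0, m)
    y = f(x)
    A = legendre.legvander(x, N - 1)           # m x N sampling matrix
    norms = np.linalg.norm(A, axis=0)
    An = A / norms                             # normalize columns before greedy selection

    def omp(A, y, sparsity):
        """Orthogonal matching pursuit: greedily select `sparsity` columns of A."""
        support, resid = [], y.copy()
        coef_s = np.zeros(0)
        for _ in range(sparsity):
            j = int(np.argmax(np.abs(A.T @ resid)))
            if j not in support:
                support.append(j)
            coef_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            resid = y - A[:, support] @ coef_s
        coef = np.zeros(A.shape[1])
        coef[support] = coef_s
        return coef

    est = omp(An, y, sparsity=3) / norms       # undo the column normalization
    print("recovered support:", np.nonzero(np.abs(est) > 1e-8)[0])
    print("max coefficient error:", float(np.abs(est - true_coef).max()))

The lecture's practical existence theorems concern deep neural networks that can emulate such sparse polynomial approximations; this snippet only illustrates the sparse-recovery ingredient, not the network construction itself.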

The Mathematical Foundations of Deep Learning: From Rating Impossibility to Practical Existence Theorems

Centre de recherches mathématiques - CRM