1. Introduction
2. Deep Equilibrium Models
3. Depth
4. Agenda
5. Explicit Layers
6. Implicit Layers
7. SAT Optimization
8. Implicit Function Theorem
9. Takeaway
10. Proposed Class of Structured Layers
11. Weight-Tied Input-Injected Networks
12. Expanding Depth to Capture Both Layers
13. Deep Networks
14. Summary
15. Stacking Layers
16. Do they exist?
17. Equilibrium point
18. Residual point
19. Sequence modeling
20. Small-scale benchmarks
Description:
Explore the integration of constraints into deep learning architectures through structured layers in this 50-minute lecture by Zico Kolter of CMU and the Bosch Center for AI. Delve into topics such as deep equilibrium models, explicit and implicit layers, SAT optimization, and the implicit function theorem. Examine weight-tied input-injected networks, the expansion of depth to capture different layer types, and the concept of equilibrium points in deep networks. Gain insights into sequence modeling and small-scale benchmarks as part of the "Emerging Challenges in Deep Learning" series presented at the Simons Institute.
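
To make the equilibrium idea concrete, below is a minimal sketch, not taken from the lecture, of a weight-tied input-injected layer f(z, x) = tanh(Wz + Ux + b) iterated to its fixed point z* = f(z*, x), which a deep equilibrium model treats as the output of an effectively infinite-depth network. The tanh nonlinearity, the dimensions, and the contraction-style rescaling of W are illustrative assumptions, and the backward pass through the equilibrium is omitted.

    # Minimal sketch of a weight-tied, input-injected layer driven to equilibrium.
    import numpy as np

    rng = np.random.default_rng(0)
    d_z, d_x = 64, 32

    W = rng.standard_normal((d_z, d_z))
    W *= 0.9 / np.linalg.norm(W, 2)   # spectral norm 0.9, so the update is a contraction in z
    U = rng.standard_normal((d_z, d_x)) / np.sqrt(d_x)
    b = np.zeros(d_z)

    def f(z, x):
        # One layer; the same W, U, b are reused at every "depth" (weight tying),
        # and the input x is injected at every step.
        return np.tanh(W @ z + U @ x + b)

    def forward_equilibrium(x, tol=1e-6, max_iter=500):
        # Plain fixed-point iteration z_{k+1} = f(z_k, x) until the residual is small.
        z = np.zeros(d_z)
        for _ in range(max_iter):
            z_next = f(z, x)
            if np.linalg.norm(z_next - z) < tol:
                break
            z = z_next
        return z_next

    x = rng.standard_normal(d_x)
    z_star = forward_equilibrium(x)
    print("equilibrium residual:", np.linalg.norm(f(z_star, x) - z_star))

In practice, deep equilibrium models replace the plain iteration with a faster root-finder (for example Broyden's method or Anderson acceleration) and differentiate through the fixed point via the implicit function theorem rather than backpropagating through every iteration.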

Integrating Constraints into Deep Learning Architectures with Structured Layers

Simons Institute