– Coding a 2×2 linear transformation & Gilbert Strang
– Coding a 2×2 linear transformation w/ PyTorch
– Hyperbolic tangent
– Rotation + squashing + rotation: ooooh, a neural net
– Rectified linear unit (ReLU)
– Shoutout to @vcubingx and his animation
– Spiky transformation: what happens here?
– A *very deep* neural net
– A deep net with tanh
– Summary of today's lesson
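The 2×2 linear transformation from the opening chapters can be sketched in PyTorch as a rotation matrix applied to a batch of 2-D points. This is a minimal illustration of the idea, not the lecture's actual notebook code; the angle and the sample points are arbitrary choices.

```python
import math
import torch

# Hypothetical sketch: a 2x2 rotation matrix is a linear transformation;
# applying it to points rotates them about the origin.
theta = math.pi / 4  # 45-degree rotation (illustrative choice)
R = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])

points = torch.tensor([[1.0, 0.0],
                       [0.0, 1.0]])  # the two unit basis vectors
rotated = points @ R.T               # row vectors: x' = x @ R^T
print(rotated)                       # each basis vector rotated by 45 degrees
```

Any 2×2 matrix works the same way; a rotation is just the easiest case to verify by eye, since lengths and angles between the points are preserved.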
Description:
Explore the fundamentals of neural networks in this comprehensive lecture focusing on rotation and squashing operations. Delve into affine transformations and non-linearities, gaining intuitive understanding through visual explanations. Learn to implement 2×2 linear transformations using both Jupyter and PyTorch, and discover the power of activation functions like hyperbolic tangent and ReLU. Witness the construction of deep neural networks step-by-step, from basic components to complex architectures. Gain practical coding experience and theoretical insights, concluding with a thorough summary of key concepts in neural network design and functionality.