1. Introduction by MIT's Prof. Alan Edelman
2. Agenda of Lecture (0-1:30): Transformations and Automatic Differentiation
3. General Linear Transformation
4. Shear Transformation
5. Non-Linear Transformation (Warp)
6. Rotation
7. Composed Transformation (Rotate followed by Warp)
8. More Transformations (xy, rθ)
9. Linear and Non-Linear Transformations
10. Linear Combinations of Images
11. Functions in Maths and in Julia (short form, anonymous, and long form)
12. Automatic Differentiation of Univariates
13. Scalar-Valued Multivariate Functions
14. Automatic Differentiation: Scalar-Valued and Multivariate Functions
15. Minimizing the "Loss Function" in Machine Learning
16. Transformations: Vector-Valued Multivariate Functions
17. Automatic Differentiation of Transformations
18. But what is a transformation, really?
19. Significance of Determinants in Scaling
20. Resource: Automatic Differentiation in 10 Minutes with Julia
Description:
Explore transformations and automatic differentiation in this comprehensive lecture from MIT's Computational Thinking Spring 2021 series. Delve into general linear transformations, shear transformations, non-linear warping, rotations, and composite transformations. Learn about linear combinations of images and various function representations in mathematics and Julia programming. Discover automatic differentiation techniques for univariate and multivariate functions, and their applications in machine learning. Investigate vector-valued multivariate functions and their role in transformations. Gain insights into the significance of determinants in scaling and access additional resources for mastering automatic differentiation with Julia in just 10 minutes.
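A minimal Julia sketch of the ideas listed in the chapters above, assuming the ForwardDiff.jl package (the lecture may use a different automatic differentiation tool): a univariate derivative, a multivariate gradient, the Jacobian of a shear transformation treated as a vector-valued function, and the determinant of that Jacobian, which measures how the transformation scales area.

    using ForwardDiff, LinearAlgebra

    # Scalar-valued univariate function and its derivative
    f(x) = x^2 + 3x
    df = ForwardDiff.derivative(f, 2.0)          # 2x + 3 at x = 2  ->  7.0

    # Scalar-valued multivariate function and its gradient
    g(v) = v[1]^2 + sin(v[2])
    grad_g = ForwardDiff.gradient(g, [1.0, 0.5]) # [2.0, cos(0.5)]

    # A transformation: vector-valued multivariate function (a shear)
    shear(v) = [v[1] + 0.3 * v[2], v[2]]
    J = ForwardDiff.jacobian(shear, [1.0, 2.0])  # [1.0 0.3; 0.0 1.0]

    # Determinant of the Jacobian: the local area-scaling factor
    det(J)                                       # 1.0, a shear preserves area

The determinant of 1.0 for the shear illustrates the point of chapter 19: the determinant of a transformation's Jacobian tells how it scales area.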

Transformations and Automatic Differentiation in Computational Thinking - Lecture 3

The Julia Programming Language