Plans for the future: ML for us is a compiler problem
Our main compiler problem: automatic differentiation (AD)
AD normally needs expression trees
Can we do better? Our answer: Zygote.jl
Taking derivatives via Julia IR (Intermediate Representation)
Benchmarking more complex examples
Speed, but at what cost?
Defining custom gradients
Convenient error messages
We have fully dynamic AD
In Julia, we can just hack the compiler with different tricks when we need to
Demo of a simple derivative (see the sketch after this outline)
Question: what is that arrow?
Q&A: can you differentiate functions that map a number to a number (scalar to scalar)?
Comment: removing the stack in some cases
Q&A: what are Zygote.jl's limitations right now?
Q&A: what is the relation of Zygote.jl to Cassette.jl?
Q&A: can we use Zygote.jl to differentiate a function from one parameter to many parameters?
Announcement of Flux.jl at the hackathon
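The "Demo of a simple derivative" chapter above presumably shows Zygote's basic gradient call; the following is a minimal sketch in that spirit, assuming Zygote.jl's exported gradient function (the functions f and pow are illustrative, not taken from the talk):

    using Zygote

    # Derivative of a scalar polynomial: d/dx (3x^2 + 2x + 1) = 6x + 2
    f(x) = 3x^2 + 2x + 1
    gradient(f, 5)                 # (32,)

    # Zygote also differentiates through ordinary Julia control flow
    function pow(x, n)
        r = one(x)
        for _ in 1:n
            r *= x
        end
        return r
    end
    gradient(x -> pow(x, 3), 2.0)  # (12.0,)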
Description:
Explore a 30-minute conference talk on Flux, an innovative machine learning library for the Julia programming language. Discover how Flux combines ease of use with state-of-the-art performance through Julia's advanced features and modern compiler technology. Learn about various applications of Flux, including image recognition, speech recognition with CUDA, and reinforcement learning. Delve into the library's internals, its integration with ONNX, and its ability to export models to browsers using FluxJS.jl. Gain insights into the future plans for Flux, focusing on automatic differentiation as a compiler problem. Understand the development of Zygote.jl, a novel approach to automatic differentiation that improves upon traditional methods. Witness demonstrations, benchmarks, and discussions on custom gradients, error messages, and the flexibility of Julia's compiler. Engage with a Q&A session covering topics such as Zygote.jl's capabilities, limitations, and its relationship with other Julia packages.
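As a rough illustration of the custom-gradients topic mentioned above, here is a sketch using Zygote's @adjoint macro; clamp_log is a made-up example function, not something from the talk:

    using Zygote
    using Zygote: @adjoint

    # A hypothetical numerically guarded log whose gradient we supply by hand
    clamp_log(x) = log(max(x, 1e-12))

    # Forward value plus a pullback: the output cotangent ȳ maps to ȳ / x
    @adjoint clamp_log(x) = clamp_log(x), ȳ -> (ȳ / max(x, 1e-12),)

    gradient(clamp_log, 2.0)   # (0.5,)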
Flux - The Elegant Machine Learning Library for Julia