Factorial Switching Linear Dynamical System (FSLDS)
Control Theory
Conditional Random Fields (CRFs)
Recurrent Neural Networks
Sequential Data
Simplest recurrent network
Recurrent network unfolded in time
Vanishing and exploding gradients
Speech recognition with recurrent networks
Speech recognition with stacked LSTMs
Recurrent network language models
Recurrent encoder-decoder
Encoder-Recurrent-Decoder Networks
Summary
Description:
Explore advanced concepts in time series analysis through this comprehensive lecture by Professor Chris Williams from the University of Edinburgh. Delve into state-space models, including hidden Markov models and Kalman filters, and their practical applications. Learn about parameter estimation, likelihood-based inference, and forecasting techniques for time series data. Discover the intricacies of recurrent neural network models and their applications in sequential data processing. Gain insights into advanced topics such as Factorial Switching Linear Dynamical Systems, Control Theory, and Conditional Random Fields. Examine the challenges of vanishing and exploding gradients in recurrent networks and explore solutions through LSTM architectures. Investigate the applications of recurrent networks in speech recognition and language modeling, and understand the principles behind encoder-decoder networks for sequence-to-sequence tasks.
Time Series Class - Part 2 - Professor Chris Williams, University of Edinburgh
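To make the recurrent-network topics in the outline ("Simplest recurrent network", "Recurrent network unfolded in time", "Vanishing and exploding gradients") concrete, here is a minimal sketch, not taken from the lecture materials, of an Elman-style recurrent network unrolled in time. The variable names, dimensions, and NumPy setup are illustrative assumptions; the sketch simply shows how repeated multiplication by the recurrent Jacobian makes gradients shrink or grow over many time steps.

```python
# Illustrative sketch (not lecture code): the simplest recurrent network
# unfolded in time, plus the product of per-step Jacobians whose norm
# shrinks or grows roughly geometrically with sequence length.
import numpy as np

rng = np.random.default_rng(0)
T, d_in, d_h = 20, 3, 5                          # sequence length, input dim, hidden dim (assumed)

W_x = rng.normal(scale=0.5, size=(d_h, d_in))    # input-to-hidden weights
W_h = rng.normal(scale=0.5, size=(d_h, d_h))     # hidden-to-hidden (recurrent) weights
b = np.zeros(d_h)

x = rng.normal(size=(T, d_in))                   # a random input sequence
h = np.zeros(d_h)
hs = []
for t in range(T):                               # unfold the recurrence over time
    h = np.tanh(W_x @ x[t] + W_h @ h + b)        # h_t = tanh(W_x x_t + W_h h_{t-1} + b)
    hs.append(h)

# Backpropagation through time multiplies Jacobians of the form
#   dh_t/dh_{t-1} = diag(1 - h_t^2) @ W_h
# at every step; the norm of their product typically decays toward zero
# or blows up as the number of steps grows.
J = np.eye(d_h)
for t in reversed(range(1, T)):
    J = J @ (np.diag(1.0 - hs[t] ** 2) @ W_h)
print("norm of dh_T/dh_1 over", T - 1, "steps:", np.linalg.norm(J))
```

Rerunning the sketch with a larger or smaller scale for W_h makes the printed norm explode or vanish, which is the behaviour that motivates the LSTM architectures covered later in the lecture.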