Week 47: Recurrent neural networks and autoencoders
Contents
Plan for week 47
Reading recommendations
TensorFlow examples
What is a recurrent NN?
Why RNNs?
RNNs in more detail
RNNs in more detail, part 2
RNNs in more detail, part 3
RNNs in more detail, part 4
RNNs in more detail, part 5
RNNs in more detail, part 6
RNNs in more detail, part 7
The mathematics of RNNs, the basic architecture
Gating mechanism: Long Short-Term Memory (LSTM)
Implementing a memory cell in a neural network
LSTM details
Basic layout (all figures from Raschka et al.)
LSTM details
Comparing with a standard RNN
LSTM details I
LSTM details II
LSTM details III
Forget gate
The forget gate
Basic layout
Input gate
Short summary
Forget and input
Basic layout
Output gate
Summary of LSTM
LSTM implementation using TensorFlow
And the corresponding one with PyTorch
Dynamical ordinary differential equation
The Runge-Kutta-4 code
Using the above data to train an RNN
RNNs in more detail