Week 45, Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs)
Contents
Plans for week 45
Material for the lab sessions, additional ways to present classification results and other practicalities
Material for Lecture Monday November 4
Convolutional Neural Networks (recognizing images)
What is the Difference
Neural Networks vs CNNs
Why CNNs for images, sound files, medical images from CT scans etc.?
Regular NNs don’t scale well to full images
3D volumes of neurons
Layers used to build CNNs
CNNs in brief
A deep CNN model ("From Raschka et al":"https://github.com/rasbt/machine-learning-book")
Key Idea
Building convolutional neural networks in Tensorflow and Keras
Setting it up
The MNIST dataset again
Strong correlations
Layers of a CNN
Systematic reduction
Prerequisites: Collect and pre-process data
Importing Keras and Tensorflow
Running with Keras
Final part
Final visualization
The CIFAR10 data set
Verifying the data set
Set up the model
Add Dense layers on top
Compile and train the model
Finally, evaluate the model
Building our own CNN code
List of contents:
Schedulers
Usage of schedulers
Cost functions
Usage of cost functions
Activation functions
Usage of activation functions
Convolution
Layers
Convolution2DLayer: convolution in a hidden layer
Backpropagation in the convolutional layer
Demonstration
Pooling Layer
Flattening Layer
Fully Connected Layers
Optimized Convolution2DLayer
The Convolutional Neural Network (CNN)
Usage of CNN code
Additional Remarks
Remarks on the speed
Convolution using separable kernels
Convolution in the Fourier domain
From FFNNs and CNNs to recurrent neural networks (RNNs)
Feedback connections
Vanishing gradients
Recurrent neural networks (RNNs): Overarching view
Sequential data only?
Differential equations
A simple example
RNNs
What kinds of behaviour can RNNs exhibit?
Basic layout, "Figures from Sebastian Raschka et al, Machine Learning with Scikit-Learn and PyTorch":"https://sebastianraschka.com/blog/2022/ml-pytorch-book.html"
Solving differential equations with RNNs
Two first-order differential equations
Velocity only
Linking with RNNs
Minor rewrite
RNNs in more detail
RNNs in more detail, part 2
RNNs in more detail, part 3
RNNs in more detail, part 4
RNNs in more detail, part 5
RNNs in more detail, part 6
RNNs in more detail, part 7
Backpropagation through time
The backward pass is linear
The problem of exploding or vanishing gradients
Mathematical setup
Backpropagation through time in figures, part 1
Backpropagation through time, part 2
Backpropagation through time, part 3
Backpropagation through time, part 4
Backpropagation through time in equations
Chain rule again
Gradients of loss functions
Summary of RNNs
Summary of a typical RNN
Four effective ways to learn an RNN and preparing for next week
Gating mechanism: Long Short Term Memory (LSTM)
Implementing a memory cell in a neural network
LSTM details
Basic layout
More LSTM details
The forget gate
Input gate
Forget and input
Output gate