Week 45, Convolutional Neural Networks (CNNs)
Contents
Plans for week 45
Material for the lab sessions
Material for Lecture Monday November 3
Convolutional Neural Networks (recognizing images), reminder from last week
What is the Difference
Neural Networks vs CNNs
Why CNNs for images, sound files, medical images from CT scans etc.?
Regular NNs don’t scale well to full images
3D volumes of neurons
More on Dimensionalities
Further remarks
Layers used to build CNNs
Transforming images
CNNs in brief
A deep CNN model (from Raschka et al, https://github.com/rasbt/machine-learning-book)
Key Idea
Mathematics of CNNs
Convolution Examples: Polynomial multiplication
Efficient Polynomial Multiplication
Further simplification
A more efficient way of coding the above Convolution
Commutative process
Toeplitz matrices
Fourier series and Toeplitz matrices
Generalizing the above one-dimensional case
Memory considerations
Padding
New vector
Rewriting as dot products
Cross correlation
Two-dimensional objects
CNNs in more detail, simple example
The convolution stage
Finding the number of parameters
New image (or volume)
Parameters to train, common settings
Examples of CNN setups
Summarizing: Performing a general discrete convolution (from Raschka et al, https://github.com/rasbt/machine-learning-book)
Pooling
Pooling arithmetic
Pooling types (from Raschka et al, https://github.com/rasbt/machine-learning-book)
Building convolutional neural networks using TensorFlow and Keras
Setting it up
The MNIST dataset again
Strong correlations
Layers of a CNN
Systematic reduction
Prerequisites: Collect and pre-process data
Importing Keras and TensorFlow
Running with Keras
Final part
Final visualization
The CIFAR10 data set
Verifying the data set
Set up the model
Add Dense layers on top
Compile and train the model
Finally, evaluate the model
Building code using Pytorch
Figure 3: A deep CNN model (from Raschka et al, https://github.com/rasbt/machine-learning-book)
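The contents above list convolution illustrated through polynomial multiplication. As a minimal sketch of that idea (assuming NumPy; not code from the notes themselves), the coefficients of a polynomial product are exactly the discrete convolution of the two coefficient vectors:

```python
import numpy as np

# Polynomial multiplication as a discrete convolution:
# p(x) = 1 + 2x + 3x^2  and  q(x) = 4 + 5x
p = np.array([1, 2, 3])
q = np.array([4, 5])

# Each output coefficient c_k = sum_i p_i * q_{k-i},
# i.e. p(x) * q(x) = 4 + 13x + 22x^2 + 15x^3
product = np.convolve(p, q)
print(product)  # [ 4 13 22 15]
```

Multiplying the polynomials by hand gives the same coefficients, which is why convolution layers can be analyzed with the same machinery (e.g. the Toeplitz-matrix formulation mentioned above).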