Week 44, Convolutional Neural Networks (CNN)
Contents
Plan for week 44
Lab sessions on Tuesday and Wednesday
Material for Lecture Monday October 28
Convolutional Neural Networks (recognizing images)
What is the Difference?
Neural Networks vs CNNs
Why CNNs for images, sound files, medical images from CT scans, etc.?
Regular NNs don’t scale well to full images
3D volumes of neurons
More on Dimensionalities
Further remarks
Layers used to build CNNs
Transforming images
CNNs in brief
A deep CNN model (from Raschka et al., https://github.com/rasbt/machine-learning-book)
Key Idea
How to do image compression before the era of deep learning
The SVD example
Mathematics of CNNs
Convolution Examples: Polynomial multiplication
Efficient Polynomial Multiplication
Further simplification
A more efficient way of coding the above convolution
Commutative process
Toeplitz matrices
Fourier series and Toeplitz matrices
Generalizing the above one-dimensional case
Memory considerations
Padding
New vector
Rewriting as dot products
Cross correlation
Two-dimensional objects
CNNs in more detail, simple example
The convolution stage
Finding the number of parameters
New image (or volume)
Parameters to train, common settings
Examples of CNN setups
Summarizing: Performing a general discrete convolution (from Raschka et al., https://github.com/rasbt/machine-learning-book)
Pooling
Pooling arithmetic
Pooling types (from Raschka et al., https://github.com/rasbt/machine-learning-book)
Building convolutional neural networks in TensorFlow and Keras
Setting it up
The MNIST dataset again
Strong correlations
Layers of a CNN
Systematic reduction
Prerequisites: Collect and pre-process data
Importing Keras and TensorFlow
Running with Keras
Final part
Final visualization
The CIFAR10 data set
Verifying the data set
Set up the model
Add Dense layers on top
Compile and train the model
Finally, evaluate the model
Building our own CNN code
List of contents:
Schedulers
Usage of schedulers
Cost functions
Usage of cost functions
Activation functions
Usage of activation functions
Convolution
Layers
Convolution2DLayer: convolution in a hidden layer
Backpropagation in the convolutional layer
Demonstration
Pooling Layer
Flattening Layer
Fully Connected Layers
Optimized Convolution2DLayer
The Convolutional Neural Network (CNN)
Usage of CNN code
Additional Remarks
Remarks on the speed
Convolution using separable kernels
Convolution in the Fourier domain
Week 44, Convolutional Neural Networks (CNN)
Morten Hjorth-Jensen
Department of Physics, University of Oslo, Norway
October 28
© 1999-2024, Morten Hjorth-Jensen. Released under CC Attribution-NonCommercial 4.0 license