Week 42: Constructing a Neural Network code, with an introduction to TensorFlow
Contents
Plan for week 42
Lecture Thursday October 19
Review of the backpropagation algorithm
Setting up the backpropagation algorithm
Setting up a Multi-layer perceptron model for classification
Defining the cost function
Example: binary classification problem
The Softmax function
Developing a code for doing neural networks with backpropagation
Collect and pre-process data
Train and test datasets
Define model and architecture
Layers
Weights and biases
Feed-forward pass
Matrix multiplications
Choose cost function and optimizer
Optimizing the cost function
Regularization
Matrix multiplication
Improving performance
Full object-oriented implementation
Evaluate model performance on test data
Adjust hyperparameters
Visualization
scikit-learn implementation
Visualization
Testing our code for the XOR, OR and AND gates
The AND and XOR Gates
Representing the Data Sets
Setting up the Neural Network
The Code using Scikit-Learn
Building neural networks in TensorFlow and Keras
TensorFlow
Using Keras
Collect and pre-process data
The Breast Cancer Data, now with Keras
Fine-tuning neural network hyperparameters
Hidden layers
Which activation function should I use?
Is the Logistic activation function (Sigmoid) our choice?
The derivative of the Logistic function
The ReLU function family
Which activation function should we use?
More on activation functions, output layers
Batch Normalization
Dropout
Gradient Clipping
A very nice website on Neural Networks
A top-down perspective on Neural networks
Limitations of supervised learning with deep networks
The AND and XOR Gates
The AND gate is defined as
\( x_1 \)   \( x_2 \)   \( y \)
   0           0          0
   0           1          0
   1           0          0
   1           1          1
And finally we have the XOR gate
\( x_1 \)   \( x_2 \)   \( y \)
   0           0          0
   0           1          1
   1           0          1
   1           1          0