Week 40: Gradient descent methods (continued) and the start of Neural networks
Contents
Lecture Monday September 29, 2025
Suggested readings and videos
Lab sessions Tuesday and Wednesday
Logistic Regression, from last week
Classification problems
Optimization and Deep learning
Basics
Two parameters
Maximum likelihood
The cost function rewritten
Minimizing the cross entropy
A more compact expression
Extending to more predictors
Including more classes
More classes
Optimization, the central part of any Machine Learning algorithm
Revisiting our Logistic Regression case
The equations to solve
Solving using Newton-Raphson's method
Example code for Logistic Regression
Synthetic data generation
Using Scikit-learn
Using the correlation matrix
Discussing the correlation data
Other measures in classification studies
Introduction to Neural networks
Artificial neurons
Neural network types
Feed-forward neural networks
Convolutional Neural Network
Recurrent neural networks
Other types of networks
Multilayer perceptrons
Why multilayer perceptrons?
Illustration of a single perceptron model and a multi-perceptron model
Examples of XOR, OR and AND gates
Does Logistic Regression do a better job?
Adding Neural Networks
Mathematical model
Matrix-vector notation
Matrix-vector notation and activation
Activation functions
Activation functions: logistic and hyperbolic
Relevance
Lecture Monday September 29, 2025
Logistic regression and gradient descent, with examples of how to code them (see the sketch after this list)
Start with the basics of Neural Networks, setting up the main steps from the simple perceptron model to the multilayer perceptron model (see the feed-forward sketch after this list)
Video of lecture at https://youtu.be/MS3Tv8FVArs
Whiteboard notes at https://github.com/CompPhysics/MachineLearning/blob/master/doc/HandWrittenNotes/2025/FYSSTKweek40.pdf
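
As a companion to the first bullet point above, here is a minimal sketch of logistic regression trained with plain gradient descent on synthetic data. The data generation, learning rate, and iteration count are illustrative choices, not the exact setup used in the lecture.

import numpy as np

# Synthetic data: two Gaussian blobs, one per class (illustrative choice)
rng = np.random.default_rng(2025)
n = 100
X = np.vstack([rng.normal(-1.0, 1.0, size=(n, 2)),
               rng.normal( 1.0, 1.0, size=(n, 2))])
y = np.hstack([np.zeros(n), np.ones(n)])

# Prepend a column of ones so the intercept is part of the parameter vector
Xb = np.hstack([np.ones((2 * n, 1)), X])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Plain gradient descent on the cross-entropy cost
beta = np.zeros(Xb.shape[1])
eta = 0.1              # learning rate, illustrative value
for _ in range(1000):  # fixed iteration count, illustrative value
    p = sigmoid(Xb @ beta)          # predicted probabilities
    grad = Xb.T @ (p - y) / len(y)  # gradient of the cross entropy
    beta -= eta * grad

print("Training accuracy:", np.mean((sigmoid(Xb @ beta) > 0.5) == y))

The gradient here is the standard cross-entropy gradient X^T(p - y)/n for the logistic model; Newton-Raphson, also covered in the notes, would replace the fixed learning rate with the inverse Hessian.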
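For the second bullet point, here is a minimal sketch of the feed-forward step in a one-hidden-layer perceptron model with sigmoid activations, matching the matrix-vector notation of the notes. The weights and biases are hand-picked values known to reproduce the XOR gate; in practice training would find such weights.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hand-picked weights and biases that make this 2-2-1 network act as an XOR gate
# (illustrative values, not trained weights)
W1 = np.array([[20.0, 20.0],    # hidden unit 1: approximates an OR gate
               [20.0, 20.0]])   # hidden unit 2: approximates an AND gate
b1 = np.array([-10.0, -30.0])
W2 = np.array([20.0, -20.0])    # output: h1 AND NOT h2, i.e. XOR
b2 = -10.0

def feed_forward(x):
    # One feed-forward pass: a1 = sigma(W1 x + b1), output = sigma(W2 a1 + b2)
    a1 = sigmoid(W1 @ x + b1)
    return sigmoid(W2 @ a1 + b2)

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", round(float(feed_forward(np.array(x, dtype=float)))))

This illustrates why the multilayer perceptron succeeds where a single perceptron fails: XOR is not linearly separable, but the hidden layer maps the inputs into a space where it is.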