Week 40: Gradient descent methods (continued) and start of Neural networks
Contents
Lecture Monday September 29, 2025
Suggested readings and videos
Lab sessions Tuesday and Wednesday
Logistic Regression, from last week
Classification problems
Optimization and Deep learning
Basics
Two parameters
Maximum likelihood
The cost function rewritten
Minimizing the cross entropy
A more compact expression
Extending to more predictors
Including more classes
More classes
Optimization, the central part of any Machine Learning algorithm
Revisiting our Logistic Regression case
The equations to solve
Solving using the Newton-Raphson method
Example code for Logistic Regression
Synthetic data generation
Using Scikit-learn
Using the correlation matrix
Discussing the correlation data
Other measures in classification studies
Introduction to Neural networks
Artificial neurons
Neural network types
Feed-forward neural networks
Convolutional Neural Network
Recurrent neural networks
Other types of networks
Multilayer perceptrons
Why multilayer perceptrons?
Illustration of a single perceptron model and a multi-perceptron model
Examples of XOR, OR and AND gates
Does Logistic Regression do a better job?
Adding Neural Networks
Mathematical model
Matrix-vector notation
Matrix-vector notation and activation
Activation functions
Activation functions, Logistic and Hyperbolic ones
Relevance
Lab sessions Tuesday and Wednesday
Work on project 1 and discussions on how to structure your report
No weekly exercises for week 40, project work only
Video on how to write scientific reports, recorded during one of the lab sessions: https://youtu.be/tVW1ZDmZnwM
A general guideline can be found at https://github.com/CompPhysics/MachineLearning/blob/master/doc/Projects/EvaluationGrading/EvaluationForm.md