For exercise sessions: Why Linear Regression (aka Ordinary Least Squares and family), a repeat from last week
We first need a reminder from last week about linear regression: fitting a continuous function with a parameterization that is linear in the parameters \( \boldsymbol{\beta} \).
- Method of choice for fitting a continuous function!
- Gives an excellent introduction to central Machine Learning features, with understandable pedagogical links to other methods like Neural Networks, Support Vector Machines, etc.
- Analytical expression for the fitting parameters \( \boldsymbol{\beta} \) (see the closed-form expressions and the code sketch after this list)
- Analytical expressions for statistical properties like mean values, variances, confidence intervals and more
- Analytical relation with probabilistic interpretations
- Easy to introduce basic concepts like the bias-variance tradeoff, cross-validation, resampling and regularization techniques, and many other ML topics
- Easy to code! And links well with classification problems, logistic regression, and neural networks
- Allows for easy hands-on understanding of gradient descent methods (see the gradient-descent sketch after this list)
- and many more features
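As a reminder of the closed-form results referred to above: with design matrix \( \boldsymbol{X} \), data \( \boldsymbol{y} \), and noise variance \( \sigma^2 \), the standard OLS expressions read (the Ridge variant adds a penalty parameter \( \lambda \))

$$
\hat{\boldsymbol{\beta}}_{\mathrm{OLS}}=\left(\boldsymbol{X}^T\boldsymbol{X}\right)^{-1}\boldsymbol{X}^T\boldsymbol{y},\qquad
\mathrm{Var}\left(\hat{\boldsymbol{\beta}}\right)=\sigma^2\left(\boldsymbol{X}^T\boldsymbol{X}\right)^{-1},\qquad
\hat{\boldsymbol{\beta}}_{\mathrm{Ridge}}=\left(\boldsymbol{X}^T\boldsymbol{X}+\lambda\boldsymbol{I}\right)^{-1}\boldsymbol{X}^T\boldsymbol{y}.
$$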
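A minimal sketch of these expressions in code, assuming a noisy quadratic as synthetic data and a second-degree polynomial design matrix (both illustrative choices, not fixed by the notes):

```python
import numpy as np

# Synthetic data: a noisy quadratic (illustrative choice)
rng = np.random.default_rng(2024)
n = 100
x = np.linspace(0, 1, n)
y = 2.0 + 3.0 * x - 5.0 * x**2 + 0.1 * rng.standard_normal(n)

# Design matrix for a degree-2 polynomial: columns 1, x, x^2
X = np.column_stack([x**p for p in range(3)])

# OLS: beta = (X^T X)^{-1} X^T y, solved stably via np.linalg.solve
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: beta = (X^T X + lambda I)^{-1} X^T y
lmbda = 1e-3
beta_ridge = np.linalg.solve(X.T @ X + lmbda * np.eye(X.shape[1]), X.T @ y)

# Variance of the OLS parameters: sigma^2 (X^T X)^{-1},
# with sigma^2 estimated from the residuals
residuals = y - X @ beta_ols
sigma2 = residuals @ residuals / (n - X.shape[1])
var_beta = sigma2 * np.linalg.inv(X.T @ X)

print("OLS   beta:", beta_ols)
print("Ridge beta:", beta_ridge)
print("std(beta) :", np.sqrt(np.diag(var_beta)))
```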
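And for the hands-on gradient-descent point above, a sketch that minimizes the same OLS cost iteratively; the learning rate and iteration count are illustrative and would need tuning for other data:

```python
import numpy as np

def gradient_descent_ols(X, y, eta=0.1, n_iters=100_000):
    """Minimize the OLS cost (1/n) * ||y - X @ beta||^2 with plain gradient descent."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iters):
        grad = (2.0 / n) * X.T @ (X @ beta - y)  # gradient of the quadratic cost
        beta -= eta * grad
    return beta

# Same illustrative quadratic data as in the previous sketch
rng = np.random.default_rng(2024)
x = np.linspace(0, 1, 100)
y = 2.0 + 3.0 * x - 5.0 * x**2 + 0.1 * rng.standard_normal(100)
X = np.column_stack([x**p for p in range(3)])

beta_gd = gradient_descent_ols(X, y)
beta_closed = np.linalg.solve(X.T @ X, X.T @ y)
print("gradient descent:", beta_gd)
print("closed form:     ", beta_closed)  # the two should agree closely
```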
For more discussions of Ridge and Lasso regression, Wessel van Wieringen's article is highly recommended.
Similarly, the article by Mehta et al. is also recommended.