Week 36: Linear Regression and Statistical Interpretations
Contents
Plans for week 36
Material for lecture Monday September 2
Important technicalities: More on Rescaling data
Test Function for what happens with OLS, Ridge and Lasso
Linking the regression analysis with a statistical interpretation
Assumptions made
Expectation value and variance
Expectation value and variance for \( \boldsymbol{\beta} \)
Deriving OLS from a probability distribution
Independent and Identically Distributed (iid)
Maximum Likelihood Estimation (MLE)
A new Cost Function
More basic Statistics and Bayes' theorem
Marginal Probability
Conditional Probability
Bayes' Theorem
Interpretations of Bayes' Theorem
Example of Usage of Bayes' theorem
Doing it correctly
Bayes' Theorem and Ridge and Lasso Regression
Ridge and Bayes
Lasso and Bayes
Material for the active learning sessions Tuesday and Wednesday
Linear Regression and the SVD
What does it mean?
And finally \( \boldsymbol{X}\boldsymbol{X}^T \)
Code for SVD and Inversion of Matrices
Inverse of Rectangular Matrix
Ridge and LASSO Regression
From OLS to Ridge and Lasso
Deriving the Ridge Regression Equations
Note on Scikit-Learn
Comparison with OLS
SVD analysis
Interpreting the Ridge results
More interpretations
Deriving the Lasso Regression Equations
Simple example to illustrate Ordinary Least Squares, Ridge and Lasso Regression
Ridge Regression
Lasso Regression
Yet another Example
The OLS case
The Ridge case
Writing the Cost Function
Lasso case
The first Case
Simple code for solving the above problem
With Lasso Regression
Another Example, now with a polynomial fit
Conditional Probability
The conditional probability, if \( p(Y) > 0 \), is
$$ p(X\vert Y)= \frac{p(X,Y)}{p(Y)}=\frac{p(X,Y)}{\sum_{i=0}^{n-1}p(Y\vert X=x_i)p(x_i)}. $$
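The definition above can be checked numerically for discrete variables. The sketch below, using a small hypothetical joint distribution \( p(X,Y) \) (the numbers are illustrative, not from the lecture), computes the marginal \( p(Y) \) by summing the joint over \( X \) and then forms the conditional \( p(X\vert Y) \) by dividing column-wise:

```python
import numpy as np

# Hypothetical joint distribution p(X, Y) over two discrete variables;
# rows index X = x_0, x_1, x_2 and columns index Y = y_0, y_1.
p_xy = np.array([[0.10, 0.20],
                 [0.30, 0.15],
                 [0.05, 0.20]])

# Marginal p(Y): sum the joint over X, i.e. the law of total probability
# p(Y) = sum_i p(Y | X = x_i) p(x_i) = sum_i p(x_i, Y).
p_y = p_xy.sum(axis=0)

# Conditional p(X | Y) = p(X, Y) / p(Y), well defined here since p(Y) > 0.
p_x_given_y = p_xy / p_y

# Each column of p(X | Y) is itself a probability distribution over X,
# so each column must sum to one.
print(p_x_given_y.sum(axis=0))
```

Note that the division `p_xy / p_y` relies on NumPy broadcasting to divide each column of the joint by the corresponding marginal entry.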