February 12-16: Optimization and gradient methods
Contents
Plans for the week of February 12-16
Top-down start
Motivation
Simple example and demonstration
Simple example and demonstration
Exercise 1: Find the local energy for the harmonic oscillator
Variance in the simple model
Computing the derivatives
Expressions for finding the derivatives of the local energy
Derivatives of the local energy
Exercise 2: General expression for the derivative of the energy
Python program for two electrons in two dimensions
Using Broyden's algorithm in scipy
Brief reminder on Newton-Raphson's method
The equations
Simple geometric interpretation
Extending to more than one variable
Steepest descent
More on Steepest descent
The ideal
The sensitivity of gradient descent
Convex functions
Convex function
Conditions on convex functions
More on convex functions
Some simple problems
Standard steepest descent
Gradient method
Steepest descent method
Steepest descent method
Final expressions
Conjugate gradient method
Conjugate gradient method
Conjugate gradient method
Conjugate gradient method
Conjugate gradient method and iterations
Conjugate gradient method
Conjugate gradient method
Conjugate gradient method
Broyden–Fletcher–Goldfarb–Shanno algorithm
Codes from numerical recipes
Finding the minimum of the harmonic oscillator model in one dimension
Functions to observe
Conjugate gradient method
An example is given by the eigenvectors of a symmetric matrix \( \hat{A} \), which satisfy
$$ \begin{equation*} \hat{v}_i^T\hat{A}\hat{v}_j= \lambda_j\hat{v}_i^T\hat{v}_j, \end{equation*} $$
which is zero unless \( i=j \), since the eigenvectors of a symmetric matrix are mutually orthogonal.
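This property can be checked numerically. The sketch below (an illustration, not part of the original notes; the example matrix is an arbitrary choice) verifies that the eigenvectors of a small symmetric matrix are mutually conjugate with respect to that matrix:

```python
# Illustrative check: for a symmetric matrix A with eigenvectors v_i,
# v_i^T A v_j = lambda_j v_i^T v_j, which vanishes unless i = j.
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # arbitrary symmetric example matrix

eigvals, V = np.linalg.eigh(A)    # columns of V are orthonormal eigenvectors

for i in range(2):
    for j in range(2):
        vAv = V[:, i] @ A @ V[:, j]
        if i == j:
            # diagonal terms reproduce the eigenvalue lambda_j
            assert np.isclose(vAv, eigvals[j])
        else:
            # off-diagonal terms vanish: the eigenvectors are A-conjugate
            assert np.isclose(vAv, 0.0)

print("eigenvectors are mutually conjugate with respect to A")
```

The same A-conjugacy condition is what the conjugate gradient method imposes on its successive search directions, without ever computing the eigenvectors explicitly.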