BFGS optimization problem, setting up the equations

We are given the following unconstrained optimization problem:

$$x^* = \arg\min_{x} f(x),$$

where $f: \mathbb{R}^n \to \mathbb{R}$ is a differentiable objective function, and $x \in \mathbb{R}^n$ is the vector of decision variables.
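As a concrete illustration (this particular objective is not part of the original problem statement, just an example of the setup), consider a convex quadratic:

$$f(x) = \tfrac{1}{2}\, x^\top A x - b^\top x,$$

with $A \in \mathbb{R}^{n \times n}$ symmetric positive definite and $b \in \mathbb{R}^n$. Its gradient is $\nabla f(x) = A x - b$, so the unique minimizer $x^*$ solves the linear system $A x^* = b$.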

The first-order necessary conditions for optimality are given by:

$$\nabla f(x^*) = 0,$$

where $\nabla f(x)$ denotes the gradient of $f(x)$.
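To make this condition concrete, here is a minimal sketch (an illustrative choice, not the derivation developed in this text) that applies an off-the-shelf BFGS implementation, SciPy's `scipy.optimize.minimize` with `method="BFGS"`, to the Rosenbrock test function and checks that the gradient nearly vanishes at the returned point. The test function, starting point, and tolerance are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Starting point for the unconstrained problem min_x f(x).
x0 = np.array([-1.2, 1.0])

# BFGS builds an approximation of the (inverse) Hessian from successive
# gradient differences; we supply the analytic gradient via `jac`.
result = minimize(rosen, x0, jac=rosen_der, method="BFGS")

# First-order necessary condition: the gradient should (approximately)
# vanish at the computed minimizer x*.
grad_at_solution = rosen_der(result.x)
print("x* =", result.x)
print("||grad f(x*)|| =", np.linalg.norm(grad_at_solution))
assert np.linalg.norm(grad_at_solution) < 1e-4  # illustrative tolerance
```

In practice the solver stops once the gradient norm falls below its termination tolerance, which is the numerical counterpart of the condition $\nabla f(x^*) = 0$ above.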