Example 3

We start with a new scalar, where now the vector \boldsymbol{y} is replaced by the vector \boldsymbol{x} and the matrix \boldsymbol{A} is a square matrix of dimension n\times n,

\alpha = \boldsymbol{x}^T\boldsymbol{A}\boldsymbol{x},

with \boldsymbol{x} a vector of length n.

We write out explicitly the sums involved in the calculation of \alpha,

\alpha = \sum_{i=0}^{n-1}\sum_{j=0}^{n-1}x_i a_{ij}x_j,
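The equivalence between the double sum and the matrix form x^T A x can be checked numerically; the following is a minimal NumPy sketch, where the matrix \boldsymbol{A} and vector \boldsymbol{x} are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A = rng.standard_normal((n, n))  # arbitrary square matrix
x = rng.standard_normal(n)       # arbitrary vector of length n

# explicit double sum: alpha = sum_i sum_j x_i a_ij x_j
alpha_sum = sum(x[i] * A[i, j] * x[j] for i in range(n) for j in range(n))

# compact matrix form: alpha = x^T A x
alpha_mat = x @ A @ x

print(np.isclose(alpha_sum, alpha_mat))  # True
```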

Taking the derivative of \alpha with respect to a given component x_k, we obtain the two sums

\frac{\partial \alpha}{\partial x_k} = \sum_{i=0}^{n-1}a_{ik}x_i+\sum_{j=0}^{n-1}a_{kj}x_j,

for all k=0,1,2,\dots,n-1. We identify these sums as

\frac{\partial \alpha}{\partial \boldsymbol{x}} = \boldsymbol{x}^T\left(\boldsymbol{A}^T+\boldsymbol{A}\right).
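The gradient formula x^T(A^T + A) can be verified against a finite-difference approximation of \partial\alpha/\partial x_k; this sketch uses an arbitrary (not necessarily symmetric) matrix and a hypothetical step size eps.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))  # general square matrix, A != A^T
x = rng.standard_normal(n)

alpha = x @ A @ x                 # the scalar alpha = x^T A x
grad_analytic = x @ (A.T + A)     # the formula x^T (A^T + A)

# forward finite difference in each component x_k
eps = 1e-6
grad_fd = np.array([
    ((x + eps * np.eye(n)[k]) @ A @ (x + eps * np.eye(n)[k]) - alpha) / eps
    for k in range(n)
])

print(np.allclose(grad_analytic, grad_fd, atol=1e-4))  # True
```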

If the matrix \boldsymbol{A} is symmetric, that is, \boldsymbol{A}=\boldsymbol{A}^T, we have

\frac{\partial \alpha}{\partial \boldsymbol{x}} = 2\boldsymbol{x}^T\boldsymbol{A}.
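For the symmetric case, the simplification x^T(A^T + A) = 2x^T A can be checked directly; here a symmetric matrix is constructed as B + B^T from an arbitrary B.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
B = rng.standard_normal((n, n))
A = B + B.T                    # symmetric by construction: A == A^T
x = rng.standard_normal(n)

grad_general = x @ (A.T + A)   # general formula x^T (A^T + A)
grad_symmetric = 2 * x @ A     # simplified formula 2 x^T A

print(np.allclose(grad_general, grad_symmetric))  # True
```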