The back propagation equations for a neural network

For the derivative of the cost function with respect to the output-layer weights we thus have

$$ \frac{\partial{\cal C}(\boldsymbol{\Theta}^L)}{\partial w_{ij}^L} = \left(a_j^L - y_j\right)a_j^L(1-a_j^L)a_i^{L-1}. $$

Defining

$$ \delta_j^L = a_j^L(1-a_j^L)\left(a_j^L - y_j\right) = \sigma'(z_j^L)\frac{\partial {\cal C}}{\partial a_j^L}, $$

and using the Hadamard (element-wise) product of two vectors, we can write this in vector form as

$$ \boldsymbol{\delta}^L = \sigma'(\boldsymbol{z}^L)\circ\frac{\partial {\cal C}}{\partial \boldsymbol{a}^L}. $$
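The equations above can be sketched numerically. The following is a minimal NumPy illustration, assuming a sigmoid activation and the quadratic cost $ {\cal C} = \tfrac{1}{2}\sum_j (a_j^L - y_j)^2 $, for which $ \partial{\cal C}/\partial a_j^L = a_j^L - y_j $; the vectors `z_L`, `y`, and `a_prev` are hypothetical example values, not from the text.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical output-layer quantities for a 3-node output layer
z_L = np.array([0.5, -1.2, 0.3])   # pre-activations z^L
a_L = sigmoid(z_L)                 # activations a^L = sigma(z^L)
y = np.array([1.0, 0.0, 0.0])      # targets

# For the quadratic cost, dC/da^L = a^L - y
dC_da = a_L - y

# sigma'(z) = a(1 - a) for the sigmoid, so the Hadamard product gives
# delta^L = sigma'(z^L) ∘ dC/da^L  (element-wise multiplication in NumPy)
delta_L = a_L * (1.0 - a_L) * dC_da

# Weight gradient dC/dw_{ij}^L = delta_j^L * a_i^{L-1}, as an outer product
a_prev = np.array([0.2, 0.7])          # hypothetical activations a^{L-1}
grad_W = np.outer(a_prev, delta_L)     # shape (2, 3): entry (i, j) = a_i^{L-1} delta_j^L
```

Note that `*` on NumPy arrays is already the Hadamard product, so the vector equation for $ \boldsymbol{\delta}^L $ translates directly into one line of code.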