Simple neural network and the back propagation equations

Let us now increase our level of ambition and set up the equations for a neural network with two input nodes, one hidden layer with two hidden nodes, and one output layer with a single output node/neuron (see graph).

We need to define the following parameters and variables. In the input layer (layer $(0)$) we label the nodes $x_1$ and $x_2$, so that

$$
x_1 = a_1^{(0)} \wedge x_2 = a_2^{(0)}.
$$

The hidden layer (layer $(1)$) has nodes which yield the outputs $a_1^{(1)}$ and $a_2^{(1)}$, with weight $\boldsymbol{w}$ and bias $\boldsymbol{b}$ parameters

$$
w_{ij}^{(1)}=\left\{w_{11}^{(1)},w_{12}^{(1)},w_{21}^{(1)},w_{22}^{(1)}\right\} \wedge b^{(1)}=\left\{b_1^{(1)},b_2^{(1)}\right\}.
$$
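To make the bookkeeping concrete, here is a minimal NumPy sketch of this 2-2-1 architecture. It collects the $w_{ij}^{(1)}$ entries into a $2\times 2$ matrix and the $b_i^{(1)}$ into a vector, then evaluates the hidden-node outputs via the standard affine-plus-activation rule $a_i^{(1)} = f\bigl(\sum_j w_{ij}^{(1)} a_j^{(0)} + b_i^{(1)}\bigr)$. The sigmoid activation and the random/example numerical values are assumptions for illustration only, since the text has not yet fixed them.

```python
import numpy as np

def sigmoid(z):
    # Assumed activation function; the text has not yet specified a choice.
    return 1.0 / (1.0 + np.exp(-z))

# Input layer (layer 0): a^(0) = (x_1, x_2). Example values, chosen arbitrarily.
a0 = np.array([0.5, -0.2])

# Hidden layer (layer 1): weights w_ij^(1) (row i: hidden node, column j: input
# node) and biases b^(1). Random initial values stand in for real parameters.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 2))
b1 = rng.normal(size=2)

# Hidden-node outputs: a_i^(1) = f(sum_j w_ij^(1) a_j^(0) + b_i^(1)).
z1 = W1 @ a0 + b1
a1 = sigmoid(z1)
print("hidden activations a^(1):", a1)
```

Writing the parameters as a matrix and a vector in this way is what lets the forward pass for the whole layer be expressed as a single matrix-vector product, a convention the equations below will rely on as more layers are added.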