Simple neural network and the back propagation equations

Let us now raise our level of ambition and set up the equations for a neural network with two input nodes, one hidden layer with two hidden nodes, and one output layer with a single output node/neuron (see graph).

We need to define the following parameters and variables. Starting with the input layer (layer $(0)$), we label the nodes $x_0$ and $x_1$ and identify them with the activations

$$
x_0 = a_0^{(0)} \wedge x_1 = a_1^{(0)}.
$$

The hidden layer (layer $(1)$) has nodes which yield the outputs $a_0^{(1)}$ and $a_1^{(1)}$, with weight $\boldsymbol{w}$ and bias $\boldsymbol{b}$ parameters

$$
w_{ij}^{(1)}=\left\{w_{00}^{(1)},w_{01}^{(1)},w_{10}^{(1)},w_{11}^{(1)}\right\} \wedge b^{(1)}=\left\{b_0^{(1)},b_1^{(1)}\right\}.
$$
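To make the bookkeeping concrete, here is a minimal NumPy sketch of the forward pass through the hidden layer of this two-input, two-hidden-node network. The input values, the weight and bias numbers, and the sigmoid activation are all illustrative assumptions (no activation function has been fixed at this point in the notes); the sketch anticipates the standard feed-forward relation $a_i^{(1)} = f\big(\sum_j w_{ij}^{(1)} a_j^{(0)} + b_i^{(1)}\big)$.

```python
import numpy as np

def sigmoid(z):
    # Assumed activation function f; the notes have not yet specified one.
    return 1.0 / (1.0 + np.exp(-z))

# Input layer (layer (0)): x_0 = a_0^(0), x_1 = a_1^(0).
# The numerical values are arbitrary, for illustration only.
a0 = np.array([0.5, 0.8])

# Hidden layer (layer (1)): weights w_ij^(1) and biases b_i^(1),
# again with arbitrary illustrative values.
W1 = np.array([[0.1, 0.2],   # w_00^(1), w_01^(1)
               [0.3, 0.4]])  # w_10^(1), w_11^(1)
b1 = np.array([0.01, 0.02])  # b_0^(1), b_1^(1)

# Hidden-node outputs: a_i^(1) = f(sum_j w_ij^(1) a_j^(0) + b_i^(1)),
# computed for both nodes at once as a matrix-vector product.
a1 = sigmoid(W1 @ a0 + b1)
print(a1)
```

Collecting the four weights in a $2\times 2$ matrix and the two biases in a vector lets the whole hidden layer be evaluated as one matrix-vector product, which is the vectorized form in which such networks are usually written.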