Definitions

With our definition of the targets $y$, the outputs of the network $\tilde{y}$ and the inputs $x$, we now define the activation $z_j^l$ of node/neuron/unit $j$ of the $l$-th layer as a function of the bias, the weights connecting it to the previous layer $l-1$, and the forward passes/outputs $a^{l-1}$ from the previous layer as

$$
z_j^l = \sum_{i=1}^{M_{l-1}} w_{ij}^l a_i^{l-1} + b_j^l,
$$

where the $b_j^l$ are the biases of layer $l$. Here $M_{l-1}$ represents the total number of nodes/neurons/units of layer $l-1$. The figure in the whiteboard notes illustrates this equation. We can rewrite this in a more compact form as the matrix-vector products we discussed earlier,

$$
z^l = \left(W^l\right)^T a^{l-1} + b^l.
$$
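
As a quick numerical check, here is a minimal NumPy sketch (with illustrative layer sizes and random weights; all variable names are hypothetical) that computes the activations $z_j^l$ both through the explicit sum over the previous layer and through the compact matrix-vector form, confirming that the two agree:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: M_{l-1} nodes in the previous layer, M_l in the current one
M_prev, M_curr = 4, 3

a_prev = rng.standard_normal(M_prev)         # outputs a^{l-1} of the previous layer
W = rng.standard_normal((M_prev, M_curr))    # weights w_{ij}^l, shape (M_{l-1}, M_l)
b = rng.standard_normal(M_curr)              # biases b_j^l of layer l

# Element-wise form: z_j^l = sum_i w_{ij}^l a_i^{l-1} + b_j^l
z_loop = np.array(
    [sum(W[i, j] * a_prev[i] for i in range(M_prev)) + b[j] for j in range(M_curr)]
)

# Compact matrix-vector form: z^l = (W^l)^T a^{l-1} + b^l
z_vec = W.T @ a_prev + b

assert np.allclose(z_loop, z_vec)            # both forms give the same activations
```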