Inputs to the activation function

With the activation values $\boldsymbol{z}^l$ we can in turn define the output of layer $l$ as $\boldsymbol{a}^l = f(\boldsymbol{z}^l)$, where $f$ is our activation function. In the examples here we will use the sigmoid function discussed in our logistic regression lectures, and we will use the same activation function $f$ for all layers and their nodes. This means we have

$$
a_j^l = \sigma(z_j^l) = \frac{1}{1+\exp{(-z_j^l)}}.
$$
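
As a minimal sketch of this step (the array values and variable names below are illustrative, not from the original text), the sigmoid can be applied elementwise in NumPy to map the activation values $\boldsymbol{z}^l$ of a layer to its output $\boldsymbol{a}^l$:

```python
import numpy as np

def sigmoid(z):
    """Elementwise sigmoid activation: 1 / (1 + exp(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical activation values z^l for a layer with three nodes
z_l = np.array([-1.0, 0.0, 2.5])

# Layer output a^l = sigma(z^l), applied elementwise to each node
a_l = sigmoid(z_l)
print(a_l)  # -> [0.26894142 0.5        0.92414182]
```

Because NumPy broadcasts the operation over the whole array, the same function handles a single node, a full layer, or a batch of inputs without modification.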