Mathematical model

The output y is produced by applying the activation function f to a weighted sum of the inputs x_i:

y = f\left(\sum_{i=1}^n w_ix_i + b\right) = f(z),

where the pre-activation is z=\sum_{i=1}^n w_ix_i+b. In an FFNN composed of such neurons, the inputs x_i are the outputs of the neurons in the preceding layer. Furthermore, an MLP is fully connected, meaning that each neuron receives a weighted sum of the outputs of all neurons in the previous layer.
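As a minimal sketch of the formula above, the computation of a single neuron and of a fully-connected layer can be written in NumPy (the function names `neuron_output` and `layer_output` are illustrative, not from any particular library):

```python
import numpy as np

def neuron_output(x, w, b, f=np.tanh):
    """Single neuron: pre-activation z = sum_i w_i x_i + b, then y = f(z)."""
    z = np.dot(w, x) + b
    return f(z)

def layer_output(x, W, b, f=np.tanh):
    """Fully-connected layer: W holds one weight row per neuron,
    so every neuron sees a weighted sum of all inputs x."""
    return f(W @ x + b)

x = np.array([1.0, 2.0])            # outputs of the previous layer
w = np.array([0.5, -0.25])          # weights of one neuron
b = 0.1                             # bias of that neuron
y = neuron_output(x, w, b)          # equals tanh(0.5*1.0 - 0.25*2.0 + 0.1)
```

Stacking the weight vectors of all neurons in a layer into the matrix W turns the per-neuron sum into a single matrix-vector product, which is how MLP layers are implemented in practice.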