Another widely studied model is the so-called perceptron model, which is an example of a "hard classification" model. We will encounter this model again when we discuss neural networks. In a hard classifier, each datapoint is deterministically assigned to a category (i.e. \( y_i=0 \) or \( y_i=1 \)). In many cases, the coronary heart disease data being one such example, it is preferable to have a "soft" classifier that outputs the probability of a given category rather than a single value. For example, given \( x_i \), the classifier outputs the probability that \( x_i \) belongs to a category \( k \). Logistic regression is the most common example of such a soft classifier. In logistic regression, the probability that a data point \( x_i \) belongs to a category \( y_i \in \{0,1\} \) is given by the so-called Sigmoid (or logistic) function, which represents the likelihood of a given event,
$$ p(t) = \frac{1}{1+\exp(-t)}=\frac{\exp(t)}{1+\exp(t)}. $$Note that \( 1-p(t)= p(-t) \).
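To make the formula concrete, here is a minimal NumPy sketch of the Sigmoid \( p(t) \) together with a numerical check of the symmetry property \( 1-p(t)=p(-t) \). The function name `sigmoid` and the test values are illustrative choices, not taken from the text.

```python
import numpy as np

def sigmoid(t):
    """Return p(t) = 1 / (1 + exp(-t)), evaluated in a numerically stable way."""
    # For large |t| one of the two equivalent forms of p(t) avoids overflow:
    # use 1/(1+exp(-t)) for t >= 0 and exp(t)/(1+exp(t)) for t < 0.
    return np.where(t >= 0,
                    1.0 / (1.0 + np.exp(-np.abs(t))),
                    np.exp(-np.abs(t)) / (1.0 + np.exp(-np.abs(t))))

t = np.linspace(-5.0, 5.0, 11)
p = sigmoid(t)

# Verify the symmetry property 1 - p(t) = p(-t) stated above.
assert np.allclose(1.0 - p, sigmoid(-t))

# p(t) lies in (0, 1): values close to 1 favor category y=1,
# values close to 0 favor category y=0.
print(np.round(p, 3))
```

The output of the soft classifier is thus a probability, and a hard decision (if one is needed) can be recovered by thresholding, e.g. assigning \( y_i=1 \) whenever \( p(t)>0.5 \).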