diff --git a/_posts/2024-05-07-sigmoid-and-softmax.md b/_posts/2024-05-07-sigmoid-and-softmax.md
index e83532e..1adf56d 100644
--- a/_posts/2024-05-07-sigmoid-and-softmax.md
+++ b/_posts/2024-05-07-sigmoid-and-softmax.md
@@ -143,6 +143,7 @@ $$
 \end{align}
 $$
 
+## Discussion and Conclusion
 
 ![Sigmoid with entropy](https://github.com/sigmoidandsoftmax/sigmoidandsoftmax.github.io/blob/main/assets/img/sigmoid_and_softmax/sigmoid_neg_entropy.png?raw=true)
 
@@ -150,5 +151,5 @@ $$
 ![Softmax with entropy](https://github.com/sigmoidandsoftmax/sigmoidandsoftmax.github.io/blob/main/assets/img/sigmoid_and_softmax/softmax_neg_entropy.png?raw=true)
 
 Figure 1. Effect of the entropy term on the value of the objective function of sigmoid and softmax.
-## Discussion and Conclusion
+
 Sigmoid is the function that, when applied to a scalar, returns the probability maximizing the sum of its entropy and its product with the input scalar. Softmax is the function that, when applied to a vector, returns the probability vector minimizing its negative entropy minus its dot product with the input vector. Figure 1 shows the values of the objective function for different weights on the entropy term.
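The entropy-regularized characterization stated in the post's conclusion can be sanity-checked numerically. The sketch below is illustrative only (not part of the patch); the test inputs `x = 1.7` and `z = [0.5, -1.0, 2.0]` are arbitrary values chosen for the demonstration. It grid-searches the objective (probability times input, plus entropy) and confirms the maximizer agrees with sigmoid in the scalar case and softmax in the vector case.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

# Scalar case: argmax over p in (0, 1) of p*x + H(p),
# with binary entropy H(p) = -p log p - (1-p) log(1-p).
x = 1.7  # arbitrary test scalar
p = np.linspace(1e-6, 1 - 1e-6, 200001)
obj = p * x - p * np.log(p) - (1 - p) * np.log(1 - p)
p_star = p[np.argmax(obj)]
assert abs(p_star - sigmoid(x)) < 1e-4  # maximizer matches sigmoid(x)

# Vector case: argmax over the probability simplex of p.z + H(p)
# (equivalently, argmin of negative entropy minus the dot product).
# Coarse grid over the 2-simplex for a 3-dimensional input:
z = np.array([0.5, -1.0, 2.0])  # arbitrary test vector
g = np.linspace(1e-3, 1 - 1e-3, 999)
P1, P2 = np.meshgrid(g, g)
P3 = 1.0 - P1 - P2
ok = P3 > 1e-3  # keep only valid simplex points
P = np.stack([P1[ok], P2[ok], P3[ok]], axis=1)
vals = P @ z - (P * np.log(P)).sum(axis=1)  # p.z + H(p) per candidate
q_star = P[np.argmax(vals)]
assert np.allclose(q_star, softmax(z), atol=5e-3)  # matches softmax(z)
```

Up to grid resolution, both maximizers coincide with the closed-form sigmoid and softmax outputs, which is exactly the variational view the post's Figure 1 illustrates.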