Machine Learning
Mastering the Softmax Function: Understanding its Derivative with a Step-by-Step Example
This article focuses on obtaining the derivative of the softmax function by means of a simple example.
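For orientation, the result the article works toward can be stated compactly; the notation below ($\mathbf{z}$ for the input vector, $s_i$ for the $i$-th softmax output) is chosen here for illustration and need not match the article's:
\begin{equation}
s_i(\mathbf{z}) = \frac{e^{z_i}}{\sum_{k} e^{z_k}},
\qquad
\frac{\partial s_i}{\partial z_j} = s_i\,\bigl(\delta_{ij} - s_j\bigr),
\end{equation}
where $\delta_{ij}$ is the Kronecker delta, so the diagonal entries of the Jacobian are $s_i(1-s_i)$ and the off-diagonal entries are $-s_i s_j$.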
The Sigmoid and its Derivative
The sigmoid function, $\sigma(x)$, is also called the logistic function, or expit \textcite{wiki_logit}. It is the inverse of the logit function. Its definition is:
\begin{equation}
\sigma(x) = \frac{1}{1+e^{-x}}.
\end{equation}
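Differentiating this definition with the chain rule gives the derivative in a form worth remembering (a standard derivation, sketched here for completeness):
\begin{equation}
\sigma'(x) = \frac{e^{-x}}{\left(1+e^{-x}\right)^{2}}
= \frac{1}{1+e^{-x}}\cdot\frac{e^{-x}}{1+e^{-x}}
= \sigma(x)\,\bigl(1-\sigma(x)\bigr).
\end{equation}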
What is a Logit?
The term logit has different meanings in mathematics and in the TensorFlow library. In TensorFlow it means “Per-label activations, typically a linear output. These activation energies are interpreted as unnormalized log probabilities.”
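As a brief illustration of why such activations are described as unnormalized log probabilities (the numeric vector below is a made-up example, not taken from the TensorFlow documentation): passing logits $\mathbf{z}$ through the softmax gives
\begin{equation}
p_i = \frac{e^{z_i}}{\sum_k e^{z_k}}
\quad\Longrightarrow\quad
\log p_i = z_i - \log\sum_k e^{z_k},
\end{equation}
so the logits differ from the log probabilities only by a common normalizing constant. For example, $\mathbf{z} = (2.0,\,1.0,\,0.1)$ yields $p \approx (0.659,\,0.242,\,0.099)$, which is (up to rounding) what tf.nn.softmax returns for that input.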
Expectation Maximization
https://stats.stackexchange.com/questions/72774/numerical-example-to-understand-expectation-maximization
http://noiselab.ucsd.edu/ECE228/Murphy_Machine_Learning.pdf
https://bjlkeng.github.io/posts/the-expectation-maximization-algorithm/
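For quick reference alongside these links, the two EM steps for a $K$-component one-dimensional Gaussian mixture can be sketched as follows (a generic textbook statement with notation chosen here, not a summary of any particular link). Given data $x_1,\dots,x_N$ and current parameters $\{\pi_k,\mu_k,\sigma_k^2\}$, the E-step computes responsibilities and the M-step re-estimates the parameters:
\begin{equation}
\text{E-step:}\quad
\gamma_{ik} = \frac{\pi_k\,\mathcal{N}\!\left(x_i \mid \mu_k, \sigma_k^2\right)}{\sum_{j=1}^{K} \pi_j\,\mathcal{N}\!\left(x_i \mid \mu_j, \sigma_j^2\right)},
\end{equation}
\begin{equation}
\text{M-step:}\quad
\mu_k = \frac{\sum_i \gamma_{ik}\, x_i}{\sum_i \gamma_{ik}},
\qquad
\sigma_k^2 = \frac{\sum_i \gamma_{ik}\,(x_i-\mu_k)^2}{\sum_i \gamma_{ik}},
\qquad
\pi_k = \frac{1}{N}\sum_i \gamma_{ik}.
\end{equation}
Iterating the two steps never decreases the data log-likelihood.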
