Understanding the Directional Derivative and the Gradient
1 Introduction

Understanding how functions change in different directions is crucial in many fields, for example in the context of neural networks, where gradients are used to update weights during training.
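As a quick illustration (not taken from the article itself), here is a minimal NumPy sketch of a directional derivative: the gradient of a hypothetical function dotted with a unit direction vector, checked against a finite-difference estimate. The function `f` and the point `p` are illustrative choices.

```python
import numpy as np

# Hypothetical example function f(x, y) = x^2 + 3y (chosen for illustration).
def f(p):
    x, y = p
    return x**2 + 3 * y

def grad_f(p):
    # Analytic gradient of f: (df/dx, df/dy) = (2x, 3).
    x, y = p
    return np.array([2 * x, 3.0])

p = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])
u = v / np.linalg.norm(v)  # unit direction

# Directional derivative = gradient dotted with the unit direction.
d_analytic = grad_f(p) @ u

# Finite-difference check: (f(p + h*u) - f(p)) / h for small h.
h = 1e-6
d_numeric = (f(p + h * u) - f(p)) / h

print(d_analytic)  # 2*1*0.6 + 3*0.8 = 3.6
```

The two values agree to several decimal places, which is a handy sanity check whenever you derive a gradient by hand.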
Mastering the Softmax Function: Understanding its Derivative with a Step-by-Step Example
This article focuses on obtaining the derivative of the softmax function by means of a simple, step-by-step example.
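To preview the result the article works toward: the derivative of the softmax is the Jacobian $J_{ij} = s_i(\delta_{ij} - s_j)$. A small NumPy sketch (illustrative, not the article's own code) computes it and verifies one column against a finite difference:

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def softmax_jacobian(z):
    # J[i, j] = s_i * (delta_ij - s_j), the standard softmax derivative.
    s = softmax(z)
    return np.diag(s) - np.outer(s, s)

z = np.array([1.0, 2.0, 3.0])
J = softmax_jacobian(z)

# Finite-difference check: perturb z[0] and compare to column 0 of J.
h = 1e-6
z2 = z.copy(); z2[0] += h
num = (softmax(z2) - softmax(z)) / h
print(np.allclose(J[:, 0], num, atol=1e-4))  # True
```

Note that each column of the Jacobian sums to zero, reflecting the fact that the softmax outputs always sum to one.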
Maximum Likelihood Estimation
The method of maximum likelihood estimation allows one to estimate point values of the parameters of a distribution assumed to underlie some observed data. Let's look at an example to understand what this means.
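As a hedged sketch of the idea (the specific distribution here is my choice, not necessarily the article's): fit the rate of an Exponential distribution by numerically minimizing the negative log-likelihood, and compare the result to the known closed-form MLE, which is one over the sample mean.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=10_000)  # true rate lambda = 0.5

def nll(lam):
    # Negative log-likelihood of an Exponential(lambda) sample:
    # -sum(log(lam) - lam * x_i) = -(n*log(lam) - lam * sum(x_i))
    if lam <= 0:
        return np.inf
    return -(len(data) * np.log(lam) - lam * data.sum())

res = minimize_scalar(nll, bounds=(1e-6, 10), method="bounded")

# The analytic MLE for the exponential rate is 1 / sample mean.
print(res.x, 1 / data.mean())  # both close to the true rate 0.5
```

The numerical optimizer and the closed-form solution agree, which is exactly what maximum likelihood promises: both pick the parameter value under which the observed data is most probable.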
How to create a random variable with a Beta distribution from scratch, using only Uniform random variables
You can use software like scipy.stats.beta if you want to sample from a Beta distribution. But you can also create a Beta distribution yourself, from scratch. The only thing you need is a source of Uniform random variables.
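One standard construction (assuming this is the route the article takes, since there are several) uses order statistics: the k-th smallest of n independent Uniform(0, 1) draws follows a Beta(k, n - k + 1) distribution. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(42)

# k-th order statistic of n Uniform(0,1) draws ~ Beta(k, n - k + 1).
n, k = 5, 2  # yields Beta(2, 4)
u = rng.uniform(size=(100_000, n))
samples = np.sort(u, axis=1)[:, k - 1]

# Beta(a, b) has mean a / (a + b); here 2 / 6 = 1/3.
print(samples.mean())  # close to 0.333
```

The empirical mean matching the theoretical Beta mean is a quick check that the construction works; a histogram of `samples` against the Beta(2, 4) density would confirm it visually.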
Understanding the Probability Density Function of the Normal Distribution
A random variable $Z$ is said to have the standard normal distribution if its probability density function (pdf) is as follows:
\begin{equation}
f_Z(z) = \frac{1}{\sqrt{2\pi}} \exp\left(\frac{-z^2}{2}\right), \quad -\infty < z < \infty
\tag{1}\label{eq:eq1}
\end{equation}
This formula describes the familiar bell-shaped curve.
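A quick numerical sanity check of Eq. (1), added here as an illustration: the density should peak at $1/\sqrt{2\pi} \approx 0.3989$ at $z = 0$ and integrate to 1 over the real line.

```python
import numpy as np

# Standard normal pdf from Eq. (1).
def phi(z):
    return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

# Riemann-sum approximation of the integral over (-10, 10);
# the tails beyond +/-10 contribute a negligible amount.
z = np.linspace(-10, 10, 200_001)
dz = z[1] - z[0]
area = np.sum(phi(z)) * dz

print(round(phi(0), 4))  # 0.3989
print(round(area, 6))    # 1.0
```

That the total area is 1 is what makes $f_Z$ a valid density; the normalizing constant $1/\sqrt{2\pi}$ exists precisely to enforce it.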





