Let $X,Y,Z$ denote random variables and let $p_X(x)$ denote the probability density of the random variable $X$. Similarly, we write $p_{X,Y}(x,y)$ for the joint probability density of two random variables.

Let $f(x)$ be any function of $x$. Consider the random variable $Z$ defined by the equation

\begin{align} Z=f(X)\ . \end{align}

What is the probability density of $Z$, given the probability density of $X$? Pick a value $z$ and let $x_1,x_2,\ldots$ be the set of points that solve the equation $z=f(x)$. The probability density $p_Z(z)$ is then determined by the density of $X$ at the points $x_1,x_2,\ldots$: we must add up the different contributions, each with a weight that remains to be determined. How do we do that? We claim that the following holds:

\begin{align} p_Z(z) = \int dx\ \delta\big(z-f(x)\big)\ p_X(x)\ . \end{align}

The above formula satisfies all the required conditions: (i) given a value $z$, it gets contributions only from the set of points $x_1,x_2,\ldots$ that solve the equation $z=f(x)$; (ii) it is suitably normalized. A simple way to prove the above is to first consider the case when the random variables take values in a discrete set. Then, one deals with probabilities rather than densities and the Dirac delta function is replaced by the Kronecker delta. Of course, the integral gets replaced by a sum.
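As a sketch of this discrete-case argument (in Python, with a hypothetical five-point distribution for $X$ chosen purely for illustration), the Kronecker-delta sum $p_Z(z)=\sum_x \delta_{z,f(x)}\,p_X(x)$ reproduces the probabilities of $Z=f(X)$ by direct enumeration:

```python
# Discrete analogue of p_Z(z) = sum_x delta_{z, f(x)} p_X(x):
# the Dirac delta becomes a Kronecker delta and the integral a sum.

# A hypothetical discrete random variable X and its probabilities.
p_X = {-2: 0.1, -1: 0.2, 0: 0.4, 1: 0.2, 2: 0.1}

def f(x):
    return x * x  # Z = f(X) = X^2

def p_Z(z):
    # Kronecker-delta sum: only the points x with f(x) == z contribute.
    return sum(p for x, p in p_X.items() if f(x) == z)

print(p_Z(4))  # contributions from x = -2 and x = +2
print(p_Z(0))  # contribution from x = 0 only
```

Note that the two preimages $x=\pm 2$ of $z=4$ both contribute, exactly as in the continuous formula.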

Further simplification of the above formula can be carried out by using the following identity for a delta function of a composite argument, valid when all roots $x_i$ of $z=f(x)$ are simple (i.e., $f'(x_i)\neq 0$):

\begin{align} \delta\big(z-f(x)\big)= \sum_i \frac{\delta(x-x_i)}{|f'(x_i)|}\ . \end{align}

Using this we can see that:

\begin{align} p_Z(z) = \sum_i \frac1{|f'(x_i)|}\ p_X(x_i)\ . \end{align}

Note that the right-hand side depends on $z$ through the roots $x_i$, which are functions of $z$.
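To see the formula in action, here is a minimal numerical sketch, assuming $X$ is a standard Gaussian and $f(x)=x^2$ (choices made purely for illustration). The roots of $z=x^2$ are $x_{1,2}=\pm\sqrt{z}$ with $|f'(x_{1,2})|=2\sqrt{z}$, and the formula yields the chi-squared density with one degree of freedom:

```python
import math

def p_X(x):
    # Standard normal density (an assumed choice, for illustration).
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def p_Z(z):
    # Roots of z = x^2: x = +sqrt(z) and x = -sqrt(z), each weighted
    # by 1/|f'(x_i)| = 1/(2*sqrt(z)).
    r = math.sqrt(z)
    return (p_X(r) + p_X(-r)) / (2 * r)

def chi2_1_pdf(z):
    # Known chi-squared(1) density, for comparison.
    return math.exp(-z / 2) / math.sqrt(2 * math.pi * z)

for z in (0.25, 1.0, 3.0):
    assert abs(p_Z(z) - chi2_1_pdf(z)) < 1e-12
```

The agreement is exact here because $p_X(\sqrt{z})=p_X(-\sqrt{z})$, so the two root contributions combine into $p_X(\sqrt{z})/\sqrt{z}=e^{-z/2}/\sqrt{2\pi z}$.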

The formula also generalizes to functions of several variables, such as $z=f(x,y)$. The corresponding equation involving the random variables $X,Y,Z$ is

\begin{align} Z=f(X,Y)\ . \end{align}

One writes a formula similar to Eq. (2) above:

\begin{align} p_Z(z) =\int dy\ \int dx\ \delta\big(z-f(x,y)\big)\ p_{X,Y}(x,y)\ . \end{align}

When $X$ and $Y$ are independent random variables, the joint probability density factorizes into the product of the individual densities, and we can write

\begin{align} p_Z(z) =\int dy\ \int dx\ \delta\big(z-f(x,y)\big)\ p_{X}(x) p_{Y}(y)\ . \end{align}

The most common choice of function is the sum, i.e., $z=x+y$. In this case the delta function allows one of the integrals to be carried out, and the probability density of $Z$ is the convolution of the two densities:

\begin{align} p_Z(z) =\int dy\ \int dx\ \delta\big(z-x-y\big)\ p_{X}(x) p_{Y}(y)=\int dx\ p_X(x) p_Y(z-x)\ . \end{align}
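As a concrete check of the convolution formula, here is a numerical sketch assuming $X$ and $Y$ are independent exponential variables with unit rate (an illustrative choice): the convolution reproduces the known density $z\,e^{-z}$ of their sum.

```python
import math

def p_exp(x):
    # Exponential(1) density (assumed for illustration); zero for x < 0.
    return math.exp(-x) if x >= 0 else 0.0

def p_Z(z, n=20000):
    # Riemann-sum approximation of the convolution integral
    # p_Z(z) = int dx p_X(x) p_Y(z - x); the integrand vanishes
    # outside [0, z], so we only sample that interval.
    dx = z / n
    return sum(p_exp(i * dx) * p_exp(z - i * dx) for i in range(n)) * dx

z = 2.0
exact = z * math.exp(-z)  # Gamma(2,1) density: the known result for Exp + Exp
assert abs(p_Z(z) - exact) < 1e-6
```

The same numerical recipe applies to any pair of densities; only the integration range needs care when the supports are unbounded.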
Unless otherwise stated, the content of this page is licensed under Creative Commons Attribution-ShareAlike 3.0 License