Transformations of Random Variables
Transformations of Discrete and Continuous RVs
When a random variable with known density is transformed, the result is a random variable as well.
For a discrete random variable \(X\), the PMF of a function \(Y = g(X)\) is simply the table:
| \(g(X)\) | \(g(x_1)\) | \(g(x_2)\) | \(\ldots\) | \(g(x_n)\) | \(\ldots\) |
|---|---|---|---|---|---|
| Prob | \(p_1\) | \(p_2\) | \(\ldots\) | \(p_n\) | \(\ldots\) |
in which only realizations of \(X\) are transformed while the probabilities are kept unchanged.
Transformation of a continuous random variable is more complex. Let \(X\) and \(Y\) be two random variables related by a monotonic function \(g\) and its inverse \(h = g^{-1}\), so that \(Y = g(X)\) and \(X = h(Y)\).
Suppose that \(X\) has a PDF \(f_X(x)\). Then the density of the random variable \(Y = g(X)\) can be computed as:
\[
f_Y(y) = f_X\big(h(y)\big)\left|\frac{dh(y)}{dy}\right|,
\]
which follows from matching the probabilities of corresponding small intervals, \(f_Y(y)\,|dy| = f_X(x)\,|dx|\) with \(x = h(y)\), where we have assumed that \(dx\) and \(dy\) are small.
If \(g\) is not one-to-one, but has \(k\) one-to-one inverse branches, \(h_1, h_2, \ldots, h_k\), then:
\[
f_Y(y) = \sum_{i=1}^{k} f_X\big(h_i(y)\big)\left|\frac{dh_i(y)}{dy}\right|.
\]
An example of a function which is not one-to-one is \(g(x) = x^2\), for which inverse branches are \(h_1(y) = \sqrt{y}\) and \(h_2(y) = -\sqrt{y}\).
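The two-branch formula can be sanity-checked numerically. The sketch below takes \(X \sim \mathcal{N}(0,1)\) and \(Y = X^2\), and compares the analytic density against a Monte Carlo histogram; the sample size, seed, and bin range are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ N(0, 1); Y = X^2 is not one-to-one: branches h1(y) = +sqrt(y), h2(y) = -sqrt(y)
x = rng.standard_normal(1_000_000)
y = x**2

def f_X(v):
    # Standard normal density
    return np.exp(-v**2 / 2) / np.sqrt(2 * np.pi)

def f_Y(v):
    # Sum over both inverse branches; each has |dh/dy| = 1 / (2 sqrt(y))
    s = np.sqrt(v)
    return (f_X(s) + f_X(-s)) / (2 * s)

# Empirical density on [0.5, 3]: raw counts / (total samples * bin width)
bins = np.linspace(0.5, 3.0, 26)
counts, edges = np.histogram(y, bins=bins)
centers = (edges[:-1] + edges[1:]) / 2
emp = counts / (y.size * (edges[1] - edges[0]))
max_err = np.max(np.abs(emp - f_Y(centers)))
print(max_err)  # small: the histogram tracks the two-branch formula
```

With a million samples the histogram and the formula agree to within Monte Carlo noise across the whole range.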
Examples
Linear Mapping
Suppose \(X \sim \mathcal{N}(\bar{X}, \sigma^2_X)\) and \(Y = g(X) = aX + b\), where \(a, b \in \mathbb{R}\) and \(a \neq 0\).
The inverse mapping function and its Jacobian are:
\[
h(y) = \frac{y-b}{a}, \qquad \frac{dh(y)}{dy} = \frac{1}{a}.
\]
Then the PDF of \(Y\), \(f_Y(y)\), can be computed as:
\[
f_Y(y) = f_X\!\left(\frac{y-b}{a}\right)\frac{1}{|a|}
= \frac{1}{\sqrt{2\pi}\,|a|\,\sigma_X}\exp\!\left(-\frac{\big(y-(a\bar{X}+b)\big)^2}{2a^2\sigma_X^2}\right).
\]
In summary, the RV \(Y\) will be Gaussian with mean and variance:
\[
\bar{Y} = a\bar{X} + b, \qquad \sigma_Y^2 = a^2\sigma_X^2.
\]
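A quick Monte Carlo check of the linear-mapping result; the particular values of \(a\), \(b\), \(\bar{X}\), and \(\sigma_X\) below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# X ~ N(2, 3^2) and Y = aX + b (a, b are arbitrary illustrative values)
mu_x, sigma_x, a, b = 2.0, 3.0, -1.5, 4.0
x = rng.normal(mu_x, sigma_x, 1_000_000)
y = a * x + b

# Sample statistics should match a*mu_x + b and |a|*sigma_x
print(y.mean())  # close to a*mu_x + b = 1.0
print(y.std())   # close to |a|*sigma_x = 4.5
```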
Cubic Mapping
Suppose \(X \sim \mathcal{N}(0, \sigma^2_X)\) and \(Y = g(X) = X^3\). The inverse mapping function and its Jacobian are:
\[
h(y) = y^{1/3}, \qquad \frac{dh(y)}{dy} = \frac{1}{3}\,y^{-2/3}.
\]
Then the PDF of \(Y\), \(f_Y(y)\), can be computed as:
\[
f_Y(y) = f_X\big(y^{1/3}\big)\,\frac{1}{3}\,|y|^{-2/3}
= \frac{1}{3\sqrt{2\pi}\,\sigma_X\,|y|^{2/3}}\exp\!\left(-\frac{y^{2/3}}{2\sigma_X^2}\right).
\]
In summary, \(g\) converts a Gaussian RV to a non-Gaussian RV.
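The cubic-mapping density can be verified the same way, by comparing the formula against a histogram of \(X^3\) samples (a sketch; sample size, seed, and bin range are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(2)

# X ~ N(0, sigma^2), Y = X^3; inverse h(y) = y^(1/3), |dh/dy| = |y|^(-2/3) / 3
sigma = 1.0
x = rng.normal(0.0, sigma, 1_000_000)
y = x**3

def f_Y(v):
    h = np.cbrt(v)  # real cube root, valid for negative v as well
    f_X = np.exp(-h**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    return f_X / (3 * np.abs(v) ** (2 / 3))

# Empirical density on [0.5, 3]: raw counts / (total samples * bin width)
bins = np.linspace(0.5, 3.0, 26)
counts, edges = np.histogram(y, bins=bins)
centers = (edges[:-1] + edges[1:]) / 2
emp = counts / (y.size * (edges[1] - edges[0]))
max_err = np.max(np.abs(emp - f_Y(centers)))
print(max_err)  # small: the histogram tracks the derived density
```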
Square Root Mapping
Suppose \(X \sim \text{Exp}(\lambda)\) and \(Y = \sqrt{X}\). The inverse mapping function and its Jacobian are:
\[
h(y) = y^2, \qquad \frac{dh(y)}{dy} = 2y, \qquad y \ge 0.
\]
Then the PDF of \(Y\), \(f_Y(y)\), can be computed as:
\[
f_Y(y) = f_X\big(y^2\big)\,2y = 2\lambda y\, e^{-\lambda y^2}, \qquad y \ge 0,
\]
which is known as the Rayleigh distribution.
An alternative approach is to consider the CDF:
\[
F_Y(y) = \mathbb{P}(\sqrt{X} \le y) = \mathbb{P}(X \le y^2) = 1 - e^{-\lambda y^2}, \qquad y \ge 0.
\]
Then the PDF is:
\[
f_Y(y) = \frac{dF_Y(y)}{dy} = 2\lambda y\, e^{-\lambda y^2}, \qquad y \ge 0.
\]
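Both derivations give the same Rayleigh density, which can be confirmed numerically (a sketch; the rate \(\lambda = 2\), seed, and bin range are arbitrary choices; note that NumPy parameterizes the exponential by its scale \(1/\lambda\)).

```python
import numpy as np

rng = np.random.default_rng(3)

# X ~ Exp(lambda) with rate lam; NumPy's exponential takes scale = 1/lam
lam = 2.0
x = rng.exponential(1.0 / lam, 1_000_000)
y = np.sqrt(x)

def f_Y(v):
    # Rayleigh density derived above: 2 * lam * y * exp(-lam * y^2)
    return 2 * lam * v * np.exp(-lam * v**2)

# Empirical density: raw counts / (total samples * bin width)
bins = np.linspace(0.1, 1.5, 29)
counts, edges = np.histogram(y, bins=bins)
centers = (edges[:-1] + edges[1:]) / 2
emp = counts / (y.size * (edges[1] - edges[0]))
max_err = np.max(np.abs(emp - f_Y(centers)))
print(max_err)  # small: sqrt of an exponential is Rayleigh-distributed
```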
Mean and Variance
The distribution of a transformation can often be quite messy and lack a closed form. Sometimes only the mean and variance are needed.
If \(X\) is a random variable with \(\mathbb{E}X = \mu\) and \(\mathbb{V}\text{ar}\, X = \sigma^2\), then for a smooth function \(Y = g(X)\), the following approximations hold:
\[
\mathbb{E}Y \approx g(\mu) + \frac{g''(\mu)}{2}\,\sigma^2, \qquad
\mathbb{V}\text{ar}\, Y \approx \big(g'(\mu)\big)^2 \sigma^2,
\]
obtained by expanding \(g\) in a Taylor series about \(\mu\).
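The quality of this Taylor (delta-method) approximation can be illustrated with \(g(x) = x^2\), where the exact answers are known (\(\mathbb{E}X^2 = \mu^2 + \sigma^2\), \(\mathbb{V}\text{ar}\, X^2 = 4\mu^2\sigma^2 + 2\sigma^4\) for Gaussian \(X\)); the values of \(\mu\) and \(\sigma\) below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)

# X ~ N(3, 0.5^2), g(x) = x^2, so g'(mu) = 2*mu and g''(mu) = 2
mu, sigma = 3.0, 0.5
x = rng.normal(mu, sigma, 1_000_000)
y = x**2

mean_approx = mu**2 + 2 * sigma**2 / 2   # g(mu) + g''(mu)/2 * sigma^2 = 9.25
var_approx = (2 * mu) ** 2 * sigma**2    # g'(mu)^2 * sigma^2 = 9.0

# For this g the mean approximation is exact (E X^2 = mu^2 + sigma^2);
# the variance approximation misses the higher-order term 2*sigma^4 = 0.125
print(y.mean(), mean_approx)
print(y.var(), var_approx)
```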
If \(n\) independent random variables are transformed as \(Y = g(X_1, X_2, \ldots, X_n)\), then:
\[
\mathbb{E}Y \approx g(\mu_1, \ldots, \mu_n) + \frac{1}{2}\sum_{i=1}^{n} \frac{\partial^2 g}{\partial x_i^2}\bigg|_{\mu}\,\sigma_i^2, \qquad
\mathbb{V}\text{ar}\, Y \approx \sum_{i=1}^{n} \left(\frac{\partial g}{\partial x_i}\bigg|_{\mu}\right)^{\!2} \sigma_i^2,
\]
where \(\mathbb{E} X_i = \mu_i\), \(\mathbb{V}\text{ar}\, X_i = \sigma^2_i\), and the partial derivatives are evaluated at \(\mu = (\mu_1, \ldots, \mu_n)\).
Note that if \(X_1, \ldots, X_n\) are correlated, then an additional covariance term must be added in the variance computation:
\[
\mathbb{V}\text{ar}\, Y \approx \sum_{i=1}^{n} \left(\frac{\partial g}{\partial x_i}\bigg|_{\mu}\right)^{\!2} \sigma_i^2
+ 2\sum_{i<j} \frac{\partial g}{\partial x_i}\bigg|_{\mu}\, \frac{\partial g}{\partial x_j}\bigg|_{\mu}\, \mathbb{C}\text{ov}(X_i, X_j).
\]
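The correlated case can be illustrated with \(g(x_1, x_2) = x_1 x_2\) and a bivariate Gaussian pair; the means and covariance matrix below are arbitrary illustrative choices, and the approximation should be close but not exact (it omits higher-order moment terms).

```python
import numpy as np

rng = np.random.default_rng(5)

# Correlated pair (X1, X2) and g(x1, x2) = x1 * x2
mu = np.array([2.0, 3.0])
cov = np.array([[1.0, 0.6],
                [0.6, 0.5]])
x = rng.multivariate_normal(mu, cov, 1_000_000)
y = x[:, 0] * x[:, 1]

# Partials at the mean: dg/dx1 = mu2, dg/dx2 = mu1;
# variance approximation including the 2 * (dg/dx1)(dg/dx2) * Cov term
var_approx = (mu[1] ** 2 * cov[0, 0]
              + mu[0] ** 2 * cov[1, 1]
              + 2 * mu[0] * mu[1] * cov[0, 1])
print(y.var(), var_approx)  # close, but the approximation omits higher-order terms
```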