Probability distributions
1. The cumulative distribution function
A complete probabilistic description of a random variable X for a single observation (or a series of independent observations) is the probability that X is less than or equal to a real value x. One can write this as a function, known as the cumulative distribution function (CDF):
\[F(x) = P(\boldsymbol{X} \leq x).\]
A function must meet three requirements to be a CDF:
- Monotonicity: if \( x_1 < x_2 \), then \( F(x_1) \leq F(x_2) \); the function never decreases.
- As \( x \) approaches positive infinity (or the highest allowable value), \( F(x) \) approaches 1.
- As \( x \) approaches negative infinity (or the lowest allowable value), \( F(x) \) approaches 0.
Note: Sometimes people will refer to the CDF more simply as the distribution function.
# Example code
Note: Advanced content.
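A minimal sketch of these three requirements (in Python rather than the R referenced elsewhere on this page), assuming a standard normal distribution, whose CDF can be written with the error function from the standard library:

```python
import math

def norm_cdf(x):
    # CDF of the standard normal: F(x) = P(X <= x), via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

xs = [-8.0, -2.0, 0.0, 2.0, 8.0]
values = [norm_cdf(x) for x in xs]

# Requirement 1: monotonicity -- F never decreases as x increases
assert all(a <= b for a, b in zip(values, values[1:]))

# Requirements 2 and 3: F(x) approaches 1 in the right tail, 0 in the left tail
print(round(norm_cdf(8.0), 6))   # 1.0
print(round(norm_cdf(-8.0), 6))  # 0.0
print(norm_cdf(0.0))             # 0.5 by symmetry
```

The same checks would apply to any valid CDF; the normal distribution is just a convenient example.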
References:
- Luce, R. D. (1986). Response times: Their role in inferring elementary mental organization. Oxford University Press.
2. The probability density function
Assuming the cumulative distribution function is differentiable, we can compute the first derivative with respect to the value x to get the probability density function (PDF):
\[f(x) = \frac{d}{dx} F(x).\]
As such, one can also compute the cumulative distribution function by taking the integral of the probability density function:
\[F(x) = \int_{-\infty}^{x} f(t) \, dt.\]
Note: Sometimes people will refer to the PDF more simply as the density function.
When working with discrete random variables, it is often possible to compute the probability that the random variable equals an exact value, or \( P(\boldsymbol{X} = x) \), known as the probability mass function.
# Example R code
Note: Advanced content.
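A minimal sketch of both identities (in Python rather than R), again assuming a standard normal distribution: a central finite difference recovers f from F, a trapezoidal sum recovers F from f, and a fair six-sided die illustrates a probability mass function:

```python
import math
from fractions import Fraction

def norm_cdf(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    # standard normal PDF: f(x) = exp(-x^2 / 2) / sqrt(2 * pi)
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

x, h = 1.0, 1e-6
# f(x) = dF/dx: a central finite difference applied to the CDF
deriv = (norm_cdf(x + h) - norm_cdf(x - h)) / (2.0 * h)
assert abs(deriv - norm_pdf(x)) < 1e-6

# F(x) = integral of f(t) dt from -inf to x: trapezoidal rule, truncated at t = -8
step, lo = 1e-3, -8.0
ts = [lo + i * step for i in range(round((x - lo) / step) + 1)]
area = sum((norm_pdf(a) + norm_pdf(b)) * step / 2.0 for a, b in zip(ts, ts[1:]))
assert abs(area - norm_cdf(x)) < 1e-6

# For a discrete variable, the analogue is the probability mass function,
# e.g. P(X = k) = 1/6 for one roll of a fair six-sided die
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
print(sum(pmf.values()))  # 1 -- the mass must sum to one
```

The finite-difference and trapezoidal approximations stand in for the exact derivative and integral; any smooth CDF/PDF pair would behave the same way.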
3. The hazard function
The hazard function gives the instantaneous rate at which an event occurs at value x, given that the event has not yet occurred by x. It is the ratio of the density to the probability of surviving past x:
\[\lambda(x) = \frac{ f(x) }{ 1 - F(x) }.\]
# Example R code
Note: Advanced content.
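A minimal sketch (in Python rather than R), assuming an exponential distribution with an arbitrarily chosen rate: its hazard works out to a constant equal to the rate, which is one way to see the exponential's memorylessness.

```python
import math

RATE = 2.0  # assumed rate parameter of an exponential distribution

def exp_pdf(x):
    # f(x) = rate * exp(-rate * x) for x >= 0
    return RATE * math.exp(-RATE * x)

def exp_cdf(x):
    # F(x) = 1 - exp(-rate * x) for x >= 0
    return 1.0 - math.exp(-RATE * x)

def hazard(x):
    # lambda(x) = f(x) / (1 - F(x))
    return exp_pdf(x) / (1.0 - exp_cdf(x))

# The hazard of the exponential is constant: no matter how long the event
# has failed to occur, the instantaneous rate stays equal to RATE
for x in [0.1, 1.0, 5.0]:
    print(round(hazard(x), 6))  # 2.0 at every x
```

For most other distributions the hazard varies with x (e.g. it typically rises with x for a normal distribution), so plotting it is a useful diagnostic.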
References:
- Luce, R. D. (1986). Response times: Their role in inferring elementary mental organization. Oxford University Press.