Continuous Probability Distributions

Eric Adsetts
Aug 12, 2020

In probability, mathematicians use continuous distributions to model random variables that can take any value in a range, rather than a countable set of values. Two familiar examples of continuous random variables are height and weight.

The most well-known example of a continuous distribution is probably the normal distribution, which models many quantities that appear in real life. Much of its power stems from the central limit theorem, which states that the distribution of the means of sufficiently large random samples from a population is approximately normal, regardless of the shape of the population's own distribution. The normal distribution with a mean of zero and a standard deviation of one is referred to as the standard normal distribution and is used frequently in probability and statistics.
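
To see the central limit theorem in action, here is a short simulation sketch using Python with numpy (my choice of tools here, not something from the theorem itself): it draws many samples from a skewed population and checks that the sample means have roughly the mean and spread the theorem predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw 10,000 samples of size 50 from a skewed (exponential) population
# with mean 2.0, and compute each sample's mean.
sample_means = rng.exponential(scale=2.0, size=(10_000, 50)).mean(axis=1)

# The central limit theorem predicts the sample means are approximately
# normal with mean ~2.0 and standard deviation ~2.0 / sqrt(50) ~ 0.283.
print(sample_means.mean())  # close to 2.0
print(sample_means.std())   # close to 0.283
```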

Another relatively simple and extremely useful continuous distribution is the exponential distribution, which is closely related to the discrete Poisson distribution. The Poisson distribution models the number of occurrences of an event in a given time frame, while the exponential distribution models the amount of time between occurrences. For example, if you are waiting at a train station, the exponential distribution describes how long you can expect to wait for the next train. An important feature of the exponential distribution is that it is “memoryless”: if you have already waited 10 minutes for an event to occur, the distribution of your remaining waiting time is exactly the same as when you started waiting. The exponential distribution is a special case of a family of distributions known as the gamma distributions.
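
A quick way to see the memoryless property is to compare the conditional and unconditional tail probabilities, sketched here with scipy.stats (the 10-minute mean is just an illustrative assumption):

```python
from scipy import stats

# Exponential waiting time with a mean of 10 minutes (scale = mean in scipy).
wait = stats.expon(scale=10)

# Memorylessness: P(X > s + t | X > s) equals P(X > t).
s, t = 10, 5
p_conditional = wait.sf(s + t) / wait.sf(s)   # P(X > 15 given X > 10)
p_unconditional = wait.sf(t)                  # P(X > 5)
print(p_conditional)    # ~0.6065
print(p_unconditional)  # ~0.6065
```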

Another useful special case of the gamma family is the chi-squared distribution, which has a single parameter, k, representing the degrees of freedom. The chi-squared distribution is the distribution of the sum of the squares of k independent standard normal random variables. While that definition may sound abstract, the chi-squared distribution is extremely useful for hypothesis tests and for constructing confidence intervals: it is used to test the goodness of fit of a theoretical distribution as well as the independence of categorical data.
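
That definition can be checked directly by simulation; here is a short sketch with numpy and scipy.stats (k = 5 is an arbitrary choice). Summing the squares of k standard normal draws reproduces the chi-squared distribution's mean of k and variance of 2k.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
k = 5  # degrees of freedom

# Sum the squares of k independent standard normal draws, many times over.
samples = (rng.standard_normal(size=(100_000, k)) ** 2).sum(axis=1)

# The result should match a chi-squared distribution with k degrees of
# freedom, which has mean k and variance 2k.
print(samples.mean(), samples.var())              # roughly 5 and 10
print(stats.chi2(k).mean(), stats.chi2(k).var())  # exactly 5.0 and 10.0
```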

The F distribution is another useful distribution, best known from the analysis of variance (ANOVA). It is the distribution of the ratio of two independent chi-squared random variables, each divided by its degrees of freedom, so it appears whenever a statistic is built from such a ratio. ANOVA tests whether several population means are equal, and in the end the decision comes down to a test on the F statistic.
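
As a minimal sketch of how this looks in practice, scipy.stats.f_oneway runs a one-way ANOVA and reports the F statistic along with its p-value (the group data below is simulated purely for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Three groups drawn from populations with the same mean; the one-way
# ANOVA F test should usually fail to reject equality of the means.
group_a = rng.normal(loc=50, scale=5, size=30)
group_b = rng.normal(loc=50, scale=5, size=30)
group_c = rng.normal(loc=50, scale=5, size=30)

f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f_stat, p_value)
```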

Another distribution that is useful for hypothesis testing and confidence intervals is the t distribution. The t distribution closely resembles the standard normal distribution but has heavier tails, accounting for the extra uncertainty that comes from estimating the population standard deviation from a small sample. Like the chi-squared distribution, it takes a degrees-of-freedom parameter, and it converges to the standard normal distribution as the degrees of freedom approach infinity.
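
The convergence is easy to check numerically, for example with scipy.stats (the cutoff of 2 and the degrees-of-freedom values below are arbitrary illustrations): the tail probability P(T > 2) shrinks toward the standard normal value as the degrees of freedom grow.

```python
from scipy import stats

# Tail probability P(T > 2) for increasing degrees of freedom,
# compared with the standard normal tail P(Z > 2).
for df in (1, 5, 30, 1000):
    print(df, stats.t(df).sf(2.0))   # 0.148, 0.051, 0.027, 0.023, ...
print("normal", stats.norm.sf(2.0))  # ~0.0228
```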

One last distribution that comes up often in probability is the beta distribution, which models outcomes that fall between 0 and 1. For example, a candidate's share of the vote in an election can be modeled with a beta distribution, since a vote share can never exceed 100 percent or drop below 0 percent.
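
As a sketch of that example with scipy.stats (the shape parameters 60 and 40 are assumed purely for illustration, and can be read informally as counts of supporters and non-supporters in a small poll):

```python
from scipy import stats

# Model a candidate's vote share as Beta(60, 40): mean 0.60, and all
# probability mass confined to the interval (0, 1).
vote_share = stats.beta(60, 40)

print(vote_share.mean())          # expected share: 0.60
print(vote_share.interval(0.95))  # central 95% interval, strictly inside (0, 1)
print(vote_share.cdf(0.5))        # probability the share falls below 50%
```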

These are just some of the distributions used in statistics and probability.
