Probability distribution - histogram, mean, variance & standard deviation


Intro Lessons
  1. Discrete VS. Continuous
Example Lessons
  1. Probability Histogram, Mean, Variance and Standard Deviation
    The following table gives the probability distribution of a loaded (weighted) die:

    outcome        probability
    1              0.05
    2              0.10
    3              0.30
    4              0.33
    5              0.15
    6              0.07

    1. Using calculator commands, find the mean, variance, and standard deviation of the probability distribution.
    2. Based on the "range rule of thumb", determine which outcomes are considered "usual" and which "unusual".
Topic Notes


Probability distribution: histogram, mean, variance and standard deviation


For most of this statistics course we have dedicated our time to frequency distributions and their many different graphic representations; it is finally time to take a look at probability distributions: those which produce graphs where the vertical axis depicts the probability of a certain outcome during an experiment, not its frequency.

What is a probability distribution?


A probability distribution is a tool that allows us to understand the values that a random variable may produce. For that, we need to learn about probability and random variables.

Probability refers to the chance of an outcome occurring in a random process, where that particular outcome arises from the set of all possible outcomes of the experiment. In other words, when performing a statistical study, a data collection procedure or any other experiment defined as a random process, there is a variety of possible results, but there is no way for the experimenter to know what the outcome of each trial will be. Probability therefore allows us to estimate the likelihood of each possible outcome value, either from the results produced through multiple earlier trials or from the mathematically calculated chance of it occurring based on the proportion of that value in the set of possible values. In simple words, probability gives an idea of how likely an outcome is to result from an experiment.

The probability that a specific value of a random variable x occurs is mathematically defined as:

probability = P(x) = \frac{Number\;of\;successful\;outcomes}{Total\;number\;of\;possible\;outcomes}

P(x) = 1 means the value of x has a 100% chance of occurring

P(x) = 0.75 = \frac{3}{4} means the value of x has a 75% chance of occurring

P(x) = 0.5 = \frac{1}{2} means the value of x has a 50% chance of occurring

P(x) = 0.25 = \frac{1}{4} means the value of x has a 25% chance of occurring

P(x) = 0 means the value of x has NO chance of occurring
Equation 1: Defining probability
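To make equation 1 concrete, here is a minimal Python sketch using a hypothetical fair six-sided die (an illustrative example, not part of this lesson's data): the probability of an event is the number of successful outcomes divided by the total number of possible outcomes.

```python
# Probability of rolling an even number on a fair six-sided die,
# following P(x) = successful outcomes / total possible outcomes.
possible_outcomes = [1, 2, 3, 4, 5, 6]
successful_outcomes = [x for x in possible_outcomes if x % 2 == 0]  # 2, 4, 6

p_even = len(successful_outcomes) / len(possible_outcomes)
print(p_even)  # 0.5, i.e. a 50% chance of occurring
```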

When recording the possible outcomes of a random experiment, the values produced as results throughout the experiment are what we call a random variable. A random variable is formed by data points, each assuming a unique numerical value that is a random result of the experiment; therefore, there is no way to know for sure what the values of a random variable will be, but we can use its recorded outcomes and the probability of each to study the behaviour of a population.

Random variables are classified into two categories depending on the type of values they can contain: Discrete random variables and continuous random variables.

A discrete random variable is that which contains countable values. What do we mean by countable? Well, these are whole numbers, integers. Therefore, discrete random variables refer to variables that deal with items that can be counted as complete units, not fractions or any infinitesimally small parts of a unit interval.

On the other hand, a continuous random variable can have any possible value, as long as it belongs to the particular interval being studied. Simply said, a continuous random variable can assume any value within a specified interval, which means it can take values with decimal expressions or fractions; it is said to be continuous because it covers every single value within the interval, so no matter how small a scale you look at within the interval, the variable takes account of it.

For example, let us say a continuous random variable being studied may have values between 0 and 5; that means the variable can have an outcome equal to 3.2 or 1.23456 or 4.123 or even 0.0000000000000000001, because it is still between 0 and 5!
If this same range were used for a discrete random variable, the possible values of the variable would only be the whole numbers: 0, 1, 2, 3, 4, 5.

With this in mind, we can define a probability distribution as a list or graphic representation of all the possible values of a random variable and the probability of each of them occurring.
A probability distribution table is a table, just like the frequency distribution table, where the possible outcomes of a statistical experiment are listed; but instead of listing the frequency of occurrence of each outcome, it presents the probability of each of them occurring.
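As an illustration, here is a small Python sketch (an assumption of how you might mirror such a table in code) that stores the loaded-die distribution used later in this lesson as a dictionary and checks that the probabilities add up to 1.

```python
# A probability distribution "table" as a dictionary mapping outcome -> probability,
# using the loaded-die values from this lesson's example.
loaded_die = {1: 0.05, 2: 0.10, 3: 0.30, 4: 0.33, 5: 0.15, 6: 0.07}

# The probabilities of all possible outcomes must add up to 1.
assert abs(sum(loaded_die.values()) - 1.0) < 1e-9

print(loaded_die[4])  # probability of rolling a 4: 0.33
```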

Remember from our lesson on frequency distributions and histograms that a histogram is the graphic representation of a frequency distribution for one variable, resembling a bar graph where each bar or column represents the frequency of consecutive ranges from the class intervals in a frequency distribution table. Although this definition of a histogram is correct, histograms are so basic and simple to construct that they can be used in many other ways, not only for frequency distributions!

Therefore, we can construct histograms for probability distributions, and such a histogram can simply be called a probability distribution graph.

Figure 1: Example of a probability distribution histogram
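If you would like to reproduce a probability histogram like the one in figure 1 yourself, a minimal Python sketch (assuming matplotlib is installed) could look like the following; note that the bar heights are probabilities rather than frequencies.

```python
# Draw a probability histogram for the loaded-die distribution from this lesson.
import matplotlib.pyplot as plt

outcomes = [1, 2, 3, 4, 5, 6]
probabilities = [0.05, 0.10, 0.30, 0.33, 0.15, 0.07]

plt.bar(outcomes, probabilities, width=0.9, edgecolor="black")
plt.xlabel("outcome x")
plt.ylabel("P(x)")  # the vertical axis shows probability, not frequency
plt.title("Probability histogram of a loaded die")
plt.show()
```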

A continuous probability distribution is described by a probability density function. The area under the smooth curve is equal to 1, and the probability that the variable falls between any two points equals the total area under the curve between those two points and the x-axis.

Types of probability distribution


There are two major types of probability distribution: discrete and continuous.

• Discrete probability distributions

A discrete probability distribution is the probability distribution of a discrete random variable: it studies and portrays the behaviour of that variable by describing all of its possible outcome values and the probability of each of them occurring during the experiment.

Discrete probability distributions usually have integers as outcome values, and the sum of the probabilities of all of these outcomes is equal to 1, meaning that, with all of the possible outcomes listed, there is a 100% probability of obtaining one of them; in other words, one of them has to occur in the experiment. This also means that the probability of each outcome can be expressed as a specific value from 0 to 1 (as shown in equation 1).

The most common types of discrete probability distributions are:
  • The binomial distribution
  • The Poisson distribution
Each of them will be explained throughout the next lessons of this course.

• Continuous probability distributions

A continuous probability distribution is the probability distribution of a continuous random variable; therefore, it describes a continuous interval of infinitely many possible outcomes and the probability of an outcome falling within any piece of that interval. This may sound confusing, but think of it this way: given that a continuous random variable can take any value within an interval, there is an infinite amount of possible outcome values; this produces a probability of zero for each individual value (as the amount of possible outcomes increases, the probability of any single one of them being the occurring outcome goes to zero). Therefore, the probability must be measured over interval pieces.

If we were to graph a continuous probability distribution, we would see a smooth probability curve due to the nature of the infinite data points being graphed; the probability for any interval piece is then the area under the curve over that piece. See figure 2 for an example of a continuous probability distribution graph.
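As a quick illustration of "probability as area under the curve", the following Python sketch (assuming SciPy is available) uses the standard normal distribution to compute the probability of landing in an interval, and shows that the probability of any single exact value is zero.

```python
# Probability over an interval for a continuous distribution = area under its density curve.
from scipy.stats import norm

# P(a <= X <= b) for a standard normal variable: area under the curve from a to b.
a, b = -1.0, 1.0
prob = norm.cdf(b) - norm.cdf(a)
print(round(prob, 4))  # approximately 0.6827

# The probability of any single exact value is zero for a continuous variable:
print(norm.cdf(1.0) - norm.cdf(1.0))  # 0.0
```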

The most common types of continuous probability distributions are:
  • The normal distribution
  • The uniform probability distribution
  • The exponential probability distribution

As mentioned before, in this and the next few lessons we will be focused on studying discrete probability distributions such as the binomial, Poisson, etc. Meanwhile, take a look below at the difference between the graphic representation of a discrete probability distribution (left, a binomial probability distribution) and a continuous probability distribution (right, a standard normal probability distribution):

Figure 2: Examples of the two types of probability distributions and their graphic representations

To summarize the probability distributions in a table, take a look at the next figure:

Figure 3: Types of probability distributions


How to find the mean of a probability distribution?

For the case of discrete probability distributions, the probability of each outcome is given by the probability mass function p(x); therefore, in order to calculate the mean of a probability distribution we use the following formula:

mean: \mu = \sum \left[ x \cdot p(x) \right]
Equation 2: Mean of a probability distribution

Where x represents the value of the random variable (the outcome), and p(x) is the probability mass function, or pmf, which defines the probability of each outcome value of x occurring.
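As a quick check of equation 2, here is a short Python sketch that applies the formula to the loaded-die table used in this lesson's example.

```python
# Mean of a discrete probability distribution: mu = sum of x * p(x).
loaded_die = {1: 0.05, 2: 0.10, 3: 0.30, 4: 0.33, 5: 0.15, 6: 0.07}

mean = sum(x * p for x, p in loaded_die.items())
print(round(mean, 2))  # 3.64
```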

How to find the standard deviation of a probability distribution?


We compute the standard deviation for a probability distribution similarly to the way we compute the standard deviation for a population, except that after squaring x - \mu, we multiply by the probability mass function p(x) and we do not need to divide by n - 1.

variance: \sigma^2 = \sum \left[ (x - \mu)^2 \cdot p(x) \right] = \sum \left[ x^2 \cdot p(x) \right] - \mu^2

standard\;deviation: \sigma = \sqrt{\sigma^2} = \sqrt{\sum \left[ (x - \mu)^2 \cdot p(x) \right]} = \sqrt{\sum \left[ x^2 \cdot p(x) \right] - \mu^2}
Equation 3: Variance and Standard deviation of the probability distribution

Notice that the standard deviation of a probability distribution is obtained by simply taking the square root of its variance.
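Similarly, equation 3 can be verified in a few lines of Python for the same loaded-die table; both equivalent formulas for the variance give the same result.

```python
# Variance and standard deviation of a discrete probability distribution.
import math

loaded_die = {1: 0.05, 2: 0.10, 3: 0.30, 4: 0.33, 5: 0.15, 6: 0.07}
mu = sum(x * p for x, p in loaded_die.items())                       # 3.64

variance_definition = sum((x - mu) ** 2 * p for x, p in loaded_die.items())
variance_shortcut = sum(x ** 2 * p for x, p in loaded_die.items()) - mu ** 2
sigma = math.sqrt(variance_definition)

print(round(variance_definition, 4))  # 1.4504
print(round(variance_shortcut, 4))    # 1.4504 (same value, shortcut formula)
print(round(sigma, 2))                # 1.2
```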

In order to illustrate the process of calculating the mean, variance and standard deviation of a probability distribution let us take a look at the next example:

Example 1

The following table gives the probability distribution of a loaded (weighted) die:

Figure 4: Probability distribution table


a) Find the mean, variance, and standard deviation of the probability distribution.

Using equation 2 we can compute the mean:

\mu = \sum \left[ x \cdot p(x) \right] = (1)(0.05)+(2)(0.1)+(3)(0.3)+(4)(0.33)+(5)(0.15)+(6)(0.07)

\mu = 0.05+0.2+0.9+1.32+0.75+0.42 = 3.64
Equation 4: Mean of the probability distribution

Then we use equation 3 to compute the variance and standard deviation:

\sigma^2 = \sum \left[ x^2 \cdot p(x) \right] - \mu^2

\sigma^2 = (1)^2(0.05)+(2)^2(0.1)+(3)^2(0.3)+(4)^2(0.33)+(5)^2(0.15)+(6)^2(0.07)-(3.64)^2

\sigma^2 = 0.05+0.4+2.7+5.28+3.75+2.52-13.2496

\sigma^2 = 14.7-13.2496 = 1.4504 \approx 1.45 \quad (variance)

\sigma = \sqrt{\sigma^2} = \sqrt{1.4504} \approx 1.2 \quad (standard\;deviation)
Equation 5: Variance and standard deviation of the probability distribution


b) Based on the range rule of thumb, determine the outcomes that are considered usual and unusual.

In statistics, the range rule of thumb delimits a specific interval within which outcome values are classified as usual and outside of which they are classified as unusual. In order to find these range limits, we use the mean and standard deviation:

Maximum\;usual\;value = \mu + 2\sigma

Minimum\;usual\;value = \mu - 2\sigma
Equation 6: Range rule of thumb limits

If we use these rules to find the range of usual and unusual values we find that:

Maximum\;usual\;value = \mu + 2\sigma = 3.64 + 2(1.2) = 6.04

Minimum\;usual\;value = \mu - 2\sigma = 3.64 - 2(1.2) = 1.24
Equation 7: Range rule of thumb limits

Therefore, we find that the value of 1 is an unusual value, while the rest of them (2, 3, 4, 5 and 6) are all within the range of usual values.
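If you prefer to automate this check, here is a short Python sketch of the range rule of thumb applied to the loaded die; it computes the limits and labels each outcome as usual or unusual.

```python
# Range rule of thumb: outcomes outside [mu - 2*sigma, mu + 2*sigma] are "unusual".
import math

loaded_die = {1: 0.05, 2: 0.10, 3: 0.30, 4: 0.33, 5: 0.15, 6: 0.07}
mu = sum(x * p for x, p in loaded_die.items())
sigma = math.sqrt(sum(x ** 2 * p for x, p in loaded_die.items()) - mu ** 2)

low, high = mu - 2 * sigma, mu + 2 * sigma
print(round(low, 2), round(high, 2))  # about 1.23 and 6.05 (1.24 and 6.04 if sigma is rounded to 1.2)

for x in loaded_die:
    label = "usual" if low <= x <= high else "unusual"
    print(x, label)  # 1 is unusual; 2 through 6 are usual
```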

***

To finalize this lesson, we recommend that you take a look at this lesson on probability and probability distributions, which goes from basic concepts to extended examples on the matter. Also, this handbook on probability distributions covers everything in detail, and we think it could be very useful for your independent studies or while working on homework and other assignments.

This is it for our lesson today; we will continue with discrete probability distributions throughout this fourth section of our statistics course, so see you in the next one!
For a probability distribution:
• mean: \mu = \sum [x \cdot p(x)]
• variance: \sigma^2 = \sum [(x-\mu)^2 \cdot p(x)] = \sum [x^2 \cdot p(x)] - \mu^2
• standard\;deviation: \sigma = \sqrt{\sigma^2} = \sqrt{\sum [(x-\mu)^2 \cdot p(x)]} = \sqrt{\sum [x^2 \cdot p(x)] - \mu^2}

Range Rule of Thumb (Usual VS. Unusual):
• maximum usual value = \mu + 2\sigma
• minimum usual value = \mu - 2\sigma