Properties of expectation

Intro Lessons
  1. What is the expected value and variance for random variables?
  2. Properties of Expectation and Variance
Example Lessons
  1. Verifying Expectation and Variance
    If a 6-sided die is rolled, what is the expected value shown on the die?
    1. A certain car breaks down every 50 hours of driving time. If the car is driven for a total of 175 hours;
      1. What is the expected number of breakdowns?
      2. What is the variance of the breakdowns?
    2. Clara is trying to make the perfect teapot out of pottery. Each time she attempts to make the perfect teapot she will use a lump of clay and she will succeed with a probability of 0.20. Once she makes the perfect teapot she will stop potting.
      1. What is the expected number of lumps of clay Clara will use to make this perfect teapot?
      2. What is the variance on the number of lumps of clay Clara will use?
    3. Properties of Expectation and Variance
      If a 6-sided die is rolled, what is the expected value shown on the die? What would be the expected value if 10 dice were rolled?
      1. Suppose we have two independent random variables, one with parameters E[X] = 4 and Var(X) = 3, and the other with parameters E[Y] = 9 and Var(Y) = 6.
        1. What is E[X + Y + 2]?
        2. What is E[3X + 2Y - 5]?
        3. What is Var(3X + 2)?
        4. What is Var(2(X + Y + 1))?
      Topic Notes

      Properties of expectation


      To finish off the important topics in probability and statistics that deal with discrete probability distributions, we have to learn about expectation.

      Although expectation may seem like a new concept, we will soon realize we have been studying and working with it for a few lessons now, which is why this topic closes the chapter of our statistics course. So, to get to our topic, let us learn the definition of expected value and how to calculate it.

      How to calculate the expected value


      Remember from our lesson on the mean and standard deviation of a binomial distribution that the mean of a binomial probability distribution is equal to the expected number of successes in n trials. In the same spirit, we define the expectation of X, usually written E[X], as the mean of the discrete random variable X.

      \mu = E[X]
      Equation 1: Expected value of X

      Where X is a random variable whose range of possible outcomes depends on each case. Equation 1 is not the only expected value formula, however. Going back to how we defined expectation in relation to the number of successes in a set of trials, the expected value can also be related to the mean of a binomial distribution, which is the number of trials in the experiment multiplied by the probability of success in each trial:

      \mu = np = E[X]
      Equation 2: Mean of a binomial distribution

      Where:
      n = number of trials
      x = number of successes in n trials
      p = probability of success in each trial

      The mean is the best estimate of how many times you can expect a certain outcome to occur in a given number of trials. In other words, the mean of a binomial distribution gives us the expected number of successes.

      Since we are focused on the expected values of a distribution, it is also important to learn the formula for the variance:

      Var(X) = \sigma^{2} = np(1-p)
      Equation 3: Variance of a binomial distribution

      Variance is important when studying expectations because, when constructing graphical representations of the probability distribution in question, it tells us how far each data point lies from the expected value. This gives an idea of how realistic the results are, whether there is any bias or error, and how far the results are from the true, experimentally obtained values. Therefore, the variance equation allows us to calculate how dispersed the outcomes are around the expected value.

      The expected value and variance can also be written for a geometric distribution, where X counts the number of trials up to and including the first success:

      E[X] = \frac{1}{p}
      Var(X) = \frac{1-p}{p^{2}}
      Equation 4: Expected value and variance for a geometric distribution
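      To make equations 2, 3 and 4 concrete, here is a minimal Python sketch (the function names are our own, not from any library) that evaluates the binomial and geometric formulas for the two examples solved later in this lesson:

```python
# Sketch: mean and variance formulas for the binomial and geometric distributions.

def binomial_mean_var(n, p):
    """E[X] = np and Var(X) = np(1 - p) for a binomial distribution."""
    return n * p, n * p * (1 - p)

def geometric_mean_var(p):
    """E[X] = 1/p and Var(X) = (1 - p)/p^2 for a geometric distribution
    (number of trials up to and including the first success)."""
    return 1 / p, (1 - p) / p ** 2

print(binomial_mean_var(175, 1 / 50))  # approximately (3.5, 3.43) -- the car example below
print(geometric_mean_var(0.2))         # approximately (5.0, 20.0) -- the teapot example below
```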

      Properties of expectation

      Now that we have learned what expectation is, and how it relates to some discrete probability distributions, it is important to know the mathematical properties and identities its calculation follows. Therefore, we have listed the properties of expectation below:

      1. E[X + a] = E[X] + a
      2. E[bX] = bE[X]
      3. E[X + Y] = E[X] + E[Y]

      Where in full generality we have that:

      E[X_{1} + X_{2} + ... + X_{n}] = E[X_{1}] + E[X_{2}] + ... + E[X_{n}]
      Equation 5: Expected value of a sum of random variables

      Using the same approach, we list the properties of variance (a quick numerical check of both sets of properties follows after the list):

      1. Var[X + a] = Var[X]
      2. Var[bX] = b^{2}Var[X]
      3. Var[X + Y] = Var[X] + Var[Y], when X and Y are independent
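      As a quick sanity check of these properties, the following sketch (a simulation of our own, using two independent dice as an arbitrary example) compares both sides of each identity numerically; each pair of printed values should agree up to simulation noise:

```python
import random

# Numerically check the properties of expectation and variance
# using two independent simulated dice X and Y.
N = 200_000
x = [random.randint(1, 6) for _ in range(N)]
y = [random.randint(1, 6) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

a, b = 2, 3
print(mean([t + a for t in x]), mean(x) + a)                  # E[X + a]  vs  E[X] + a
print(mean([b * t for t in x]), b * mean(x))                  # E[bX]     vs  bE[X]
print(mean([s + t for s, t in zip(x, y)]), mean(x) + mean(y)) # E[X + Y]  vs  E[X] + E[Y]
print(var([t + a for t in x]), var(x))                        # Var(X+a)  vs  Var(X)
print(var([b * t for t in x]), b ** 2 * var(x))               # Var(bX)   vs  b^2 Var(X)
print(var([s + t for s, t in zip(x, y)]), var(x) + var(y))    # Var(X+Y)  vs  Var(X) + Var(Y)
```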

      Expected value examples


      Example 1

      To verify expectation and variance we look at the following situation: if a 6-sided die is rolled, what is the expected value shown on the die?

      For a die we have 6 possible outcomes: 1, 2, 3, 4, 5, and 6. Each of those outcomes has a probability of \frac{1}{6} of occurring, since all of them have the same chance of happening (in a fair die, of course!).
      Using the definition of mathematical expectation shown above, we find that:

      \mu = E[X] = \frac{1+2+3+4+5+6}{6} = 3.5
      Equation 6: Expected value of a rolled die

      This simple problem is a great example of why the term expected value shouldn't be taken literally. The result of 3.5 is not a value that a die can actually show, because all of the possible outcomes of a die are whole numbers! The expected value is really a weighted average: if you rolled the die a large number of times and averaged the results, and repeated this over and over, you would keep obtaining values close to 3.5. Therefore, the expected value is not necessarily an exact value that can be produced by an experiment, but an average of what can occur; it is the mean of the random variable X.
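      If you want to see this long-run-average interpretation in action, the short simulation below (a sketch of our own, not part of the original exercise) averages many simulated rolls:

```python
import random

# Simulate many rolls of a fair die; the average approaches 3.5,
# even though no single roll can ever show 3.5.
rolls = [random.randint(1, 6) for _ in range(100_000)]
print(sum(rolls) / len(rolls))  # typically very close to 3.5
```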

      Example 2

      A certain car breaks down every 50 hours of driving time. If the car is driven for a total of 175 hours:
      • What is the expected number of breakdowns?
      Since the car breaks down every 50 hours of driving time, the probability of a breakdown in any given hour of driving is 1/50. Given that each hour has two clearly distinguishable possible outcomes, either the car breaks down or it doesn't, let us use the binomial distribution to work through this problem. We therefore have the following information:

      E[X] = np

      Var(X) = np(1-p)

      Where p = \frac{1}{50} and n = 175, therefore:

      E[X] = np = \left(\frac{1}{50}\right)(175) = 3.5
      Equation 7: Expected number of car breakdowns in 175 hours

      Therefore, during the 175 hours that the car is being driven, one can expect about 3.5 breakdowns. Good luck to the driver, eh?!
      Remember, this figure of 3.5 really means there is a good chance the car will break down 3 or 4 times, since there is no such thing as half a breakdown (is there? We don't think so).
      • What is the variance of the breakdowns?
      Now let us take a look at the variance of this result. Using the binomial variance definition we have that:

      Since p = \frac{1}{50} and n = 175, we have that:

      Var(X) = np(1-p) = 175\left(\frac{1}{50}\right)\left(1-\frac{1}{50}\right) = \frac{175}{50}\left(\frac{49}{50}\right) = \frac{8575}{2500} = 3.43
      Equation 8: Variance for the car breaking down an average of 3.5 times

      A variance of 3.43 (a standard deviation of \sqrt{3.43} \approx 1.85) means that, around the expectation of 3 to 4 car breakdowns during the 175-hour driving time, the actual number of breakdowns will typically vary by about 2 either way, so there is a chance (even if slight) that you will have only one breakdown, or even none (which is more improbable still).
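      Under this binomial model (an independent 1/50 chance of a breakdown in each driving hour, as assumed above), the numbers can be reproduced in a couple of lines:

```python
# Expected breakdowns and their spread under the binomial model above.
n, p = 175, 1 / 50
expected = n * p              # 3.5 breakdowns
variance = n * p * (1 - p)    # 3.43
std_dev = variance ** 0.5     # about 1.85, the typical spread around 3.5
print(expected, variance, std_dev)
```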

      Example 3

      Clara is trying to make the perfect teapot out of pottery. Each time she attempts to make the perfect teapot she will use a lump of clay and she will succeed with a probability of 0.20. Once she makes the perfect teapot she will stop potting.

      • What is the expected number of lumps of clay Clara will use to make this perfect teapot?
      In other words, what is the expected number of attempts Clara will need to achieve the perfect teapot (since each attempt uses one lump of clay)?
      With the information given in the problem, and using the expected value formula from the geometric definition, we have that:

      Having p = 0.2, we solve for the expected value:

      E[X] = \frac{1}{p} = \frac{1}{0.2} = 5
      Equation 9: Expected number of lumps used before Clara achieves the perfect teapot

      So, we should expect Clara to use about 5 lumps of clay before she manages to make the perfect teapot.

      • What is the variance on the number of lumps of clay Clara will use?
      Using variance as defined in equation 4 we have that:

      Var(X) = \frac{1-p}{p^{2}} = \frac{1-0.2}{(0.2)^{2}} = \frac{0.8}{0.04} = 20
      Equation 10: Variance, the dispersion in how many lumps Clara will use for the perfect teapot

      So, although Clara is expected to use 5 lumps of clay for the perfect teapot, the variance of that number is as large as 20! Let us put this in perspective: such a large variance means the actual count can spread quite far from 5, although the chance of her wasting a great many lumps is small. In practice, most outcomes fall within about one standard deviation of the mean, and if you calculate the square root of the variance you will see the standard deviation is about 4.5 (\sqrt{20} \approx 4.47); therefore, Clara is most probably going to use somewhere between 1 and 10 lumps of clay to create her perfect teapot.
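      Treating each attempt as an independent trial with success probability 0.2 (the geometric model used above), a short simulation of our own confirms both numbers:

```python
import random

# Geometric model: count lumps of clay used until the first perfect teapot.
p = 0.2

def lumps_until_success():
    count = 1
    while random.random() >= p:   # failure with probability 1 - p
        count += 1
    return count

samples = [lumps_until_success() for _ in range(100_000)]
m = sum(samples) / len(samples)
v = sum((s - m) ** 2 for s in samples) / len(samples)
print(m, v)   # close to E[X] = 1/p = 5 and Var(X) = (1 - p)/p^2 = 20
```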

      Example 4

      Properties of Expectation and Variance
      If a 6-sided die is rolled, what is the expected value shown on the die? What would be the expected combined value of 10 dice rolled at the same time?

      We know from example one that the expected value of rolling a die is equal to 3.5. Then, what would be the expected value if we roll 10 dice at the same time? Well, we use the expectation properties we learnt above in this lesson, and calculate:

      We know that: E[X_{1} + X_{2} + ... + X_{10}] = E[X_{1}] + E[X_{2}] + ... + E[X_{10}]

      where E[X_{1}] = 3.5 is the expected value for the first die, and is the same for all the other dice:

      E[X_{1}] = E[X_{2}] = ... = E[X_{10}]

      therefore: E[X_{1} + X_{2} + ... + X_{10}] = E[X_{1}] + E[X_{2}] + ... + E[X_{10}] = 3.5(10) = 35
      Equation 11: Expected combined value of 10 dice rolled at the same time
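      Linearity of expectation is easy to confirm with a simulation (again, a sketch of our own): the average total of 10 dice settles near 35.

```python
import random

# Average total of 10 dice over many trials approaches 10 x 3.5 = 35.
trials = 50_000
totals = [sum(random.randint(1, 6) for _ in range(10)) for _ in range(trials)]
print(sum(totals) / trials)   # typically close to 35
```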


      Example 5

      Suppose we have two independent random variables, one with parameters E[X] = 4 and Var(X) = 3, and the other with parameters E[Y] = 9 and Var(Y) = 6.

      • What is E[X + Y + 2]?
      Using the properties for the expectation of a random variable as learnt above we can solve:

      property 1: E[X + a] = E[X] + a

      therefore: E[X + Y + 2] = E[X + Y] + 2

      property 3: E[X + Y] = E[X] + E[Y]

      therefore: E[X + Y] + 2 = E[X] + E[Y] + 2

      since: E[X] = 4 and E[Y] = 9

      E[X] + E[Y] + 2 = 4 + 9 + 2 = 15
      Equation 12: Solving for E[X + Y + 2]

      • What is E[3X + 2Y - 5]?
      Once more, following the properties of expectation we work through the next steps:

      property 1: E[X + a] = E[X] + a

      therefore: E[3X + 2Y - 5] = E[3X + 2Y] - 5

      property 3: E[X + Y] = E[X] + E[Y]

      therefore: E[3X + 2Y] - 5 = E[3X] + E[2Y] - 5

      property 2: E[bX] = bE[X]

      therefore: E[3X] + E[2Y] - 5 = 3E[X] + 2E[Y] - 5

      since: E[X] = 4 and E[Y] = 9

      3E[X] + 2E[Y] - 5 = 3(4) + 2(9) - 5 = 12 + 18 - 5 = 25
      Equation 13: Solving for E[3X + 2Y - 5]

      • What is Var(3X + 2)?
      For this part and the next, we use the properties of variance to solve them in a similar fashion to the first two parts of this problem:

      property 1: Var[X + a] = Var[X]

      therefore: Var(3X + 2) = Var(3X)

      property 2: Var[bX] = b^{2}Var[X]

      therefore: Var(3X) = 9Var(X)

      Where Var(X) = 3, therefore: 9Var(X) = 27
      Equation 14: Solving for Var(3X + 2)

      • What is Var(2(X + Y + 1))?
      Following the same approach as above we obtain:

      property 2: Var[bX] = b^{2}Var[X]

      therefore: Var(2(X + Y + 1)) = 2^{2}Var(X + Y + 1) = 4Var(X + Y + 1)

      property 1: Var[X + a] = Var[X]

      therefore: 4Var(X + Y + 1) = 4Var(X + Y)

      property 3: Var[X + Y] = Var[X] + Var[Y], because X and Y are independent

      therefore: 4Var(X + Y) = 4(Var(X) + Var(Y))

      Where Var(X) = 3 and Var(Y) = 6,

      therefore: 4(Var(X) + Var(Y)) = 4(3 + 6) = 4(9) = 36
      Equation 15: Solving for Var(2(X + Y + 1))
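      The answers above depend only on the means and variances of X and Y, not on their actual distributions. As a hedged numerical check, the sketch below arbitrarily assumes X and Y are independent normal random variables with the given parameters and estimates the four quantities by simulation:

```python
import random

# Monte Carlo check of this example, assuming (arbitrarily) that X and Y are
# independent normals with E[X] = 4, Var(X) = 3, E[Y] = 9, Var(Y) = 6.
N = 200_000
xs = [random.gauss(4, 3 ** 0.5) for _ in range(N)]
ys = [random.gauss(9, 6 ** 0.5) for _ in range(N)]

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((t - m) ** 2 for t in v) / len(v)

print(mean([x + y + 2 for x, y in zip(xs, ys)]))          # about 15
print(mean([3 * x + 2 * y - 5 for x, y in zip(xs, ys)]))  # about 25
print(var([3 * x + 2 for x in xs]))                       # about 27
print(var([2 * (x + y + 1) for x, y in zip(xs, ys)]))     # about 36
```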

      ***

      To finish off this lesson, we leave you with a couple of link recommendations that can be useful throughout your independent studies: this lesson on expectation and variance is complete with examples and concepts, and this printer-friendly page on the properties of expectation provides some extra videos with examples.

      This is it for our lesson today; see you in the next one!

      We can write out the mean as an expected value:
      \mu = E[X]

      And likewise for variance:
      \sigma^{2} = Var(X)

      n: number of trials
      x: number of successes in n trials
      p: probability of success in each trial

      Binomial:
      E[X] = np
      Var(X) = np(1-p)

      Geometric:
      E[X] = \frac{1}{p}
      Var(X) = \frac{1-p}{p^{2}}

      Properties of Expectation:

      \cdot E[X + a] = E[X] + a
      \cdot E[bX] = bE[X]
      \cdot E[X + Y] = E[X] + E[Y]

      Or in full generality:
      \cdot E[X_{1} + X_{2} + \cdots + X_{n}] = E[X_{1}] + E[X_{2}] + \cdots + E[X_{n}]

      Properties of Variance:
      \cdot Var[X + a] = Var[X]
      \cdot Var[bX] = b^{2}Var[X]
      \cdot Var[X + Y] = Var[X] + Var[Y] if X and Y are independent