Expected Value

In Probability Theory, the expected value (also called expectation, expectancy, mathematical expectation, mean, average, or first moment) is a generalization of the weighted average.

Expected Value (Definition)

Let $X$ be a discrete Random Variable with p.m.f. $p(x)$; then the expected value is

$$E[X] = \sum_{x} x \, p(x)$$

For a continuous Random Variable with p.d.f. $f(x)$, we have

$$E[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx$$
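A quick numeric check of both formulas, as a minimal sketch (the fair-die p.m.f. and the Uniform(0, 1) density are my own illustrative choices, not from the notes above):

```python
from scipy.integrate import quad

# Discrete case: E[X] = sum of x * p(x).
# Assumed example: a fair six-sided die, p(x) = 1/6 for x in 1..6.
pmf = {x: 1 / 6 for x in range(1, 7)}
print(sum(x * p for x, p in pmf.items()))  # 3.5

# Continuous case: E[X] = integral of x * f(x) dx.
# Assumed example: Uniform(0, 1), where f(x) = 1 on [0, 1].
ev, _abs_err = quad(lambda x: x * 1.0, 0.0, 1.0)
print(ev)  # 0.5
```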

Informally, the expected value is the arithmetic mean of a large number of independently selected outcomes of a random variable.
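This "mean of many independent outcomes" reading is easy to see by simulation; a minimal sketch (the fair die is again my own example):

```python
import random

random.seed(0)  # reproducibility

# Average many independent fair-die rolls; by the law of large
# numbers the sample mean approaches E[X] = 3.5 as n grows.
n = 100_000
samples = [random.randint(1, 6) for _ in range(n)]
print(sum(samples) / n)  # ~3.5
```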

When we change the term $x$ to a function $g(x)$, the sum becomes $E[g(X)] = \sum_{x} g(x)\,p(x)$ (notice that $p(x)$ doesn't change); see property 2 below.

Properties

  1. For any constants $a$ and $b$, $E[aX + b] = aE[X] + b$ (linearity)
  2. $E[g(X)] = \sum_{x} g(x)\,p(x)$, where $p(x)$ is the p.m.f. of $X$
    • What this means: for example, if you have a p.m.f. $f(x)$ where
      • f(0) = 0.2
      • f(1) = 0.5
      • f(2) = 0.3
    • To find $E[g(X)]$, you have $E[g(X)] = g(0)(0.2) + g(1)(0.5) + g(2)(0.3)$ (notice how the p.m.f. doesn’t change)
    • Another example, $E[X^2]$ with this same p.m.f., is worked in the sketch after this list
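As referenced above, here is a minimal sketch checking property 1 (linearity) and property 2 with the p.m.f. from the example; the constants $a = 3$, $b = 2$ and the choice $g(x) = x^2$ are my own, just to make things concrete:

```python
# p.m.f. from the example above
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

ev = sum(x * p for x, p in pmf.items())  # E[X] = 1.1

# Property 1 (linearity): E[aX + b] = a*E[X] + b
a, b = 3, 2
lhs = sum((a * x + b) * p for x, p in pmf.items())
print(lhs, a * ev + b)  # both 5.3 (up to float rounding)

# Property 2: E[g(X)] = sum of g(x) * p(x) (the p.m.f. is unchanged).
# With g(x) = x^2 this gives E[X^2].
print(sum(x ** 2 * p for x, p in pmf.items()))  # 1.7
```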

Other Properties

  1. $E[c] = c$; $E[cX] = cE[X]$, where $c$ is a constant
  2. $\mathrm{Var}(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2$ for any r.v. $X$
    1. To get to this answer, expand the squared term to $X^2 - 2XE[X] + (E[X])^2$, take expectations term by term (you can treat $E[X]$ as a constant), and realize that $E[2XE[X]] = 2(E[X])^2$, so it simplifies to $E[X^2] - 2(E[X])^2 + (E[X])^2 = E[X^2] - (E[X])^2$
    2. I was still a little confused; the proof is here: https://www.probabilitycourse.com/chapter3/3_2_4_variance.php#:~:text=The%20variance%20of%20a%20random,%E2%88%92%CE%BCX)2%5D.
  3. If $X$ and $Y$ are independent then $E[XY] = E[X]E[Y]$ (both properties 2 and 3 are checked numerically in the sketch after this list)
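A minimal numeric check of properties 2 and 3, reusing the p.m.f. from earlier for $X$; the independent fair coin $Y \in \{0, 1\}$ is my own assumed example:

```python
pmf_x = {0: 0.2, 1: 0.5, 2: 0.3}
pmf_y = {0: 0.5, 1: 0.5}  # assumed fair coin, independent of X

ex = sum(x * p for x, p in pmf_x.items())        # E[X]   = 1.1
ex2 = sum(x ** 2 * p for x, p in pmf_x.items())  # E[X^2] = 1.7

# Property 2: Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2
var_direct = sum((x - ex) ** 2 * p for x, p in pmf_x.items())
print(var_direct, ex2 - ex ** 2)  # both ~0.49

# Property 3: independence implies E[XY] = E[X] * E[Y].
# Under independence the joint p.m.f. factors as p(x, y) = p(x) * p(y).
ey = sum(y * p for y, p in pmf_y.items())
exy = sum(x * y * px * py
          for x, px in pmf_x.items()
          for y, py in pmf_y.items())
print(exy, ex * ey)  # both 0.55
```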

Random

Wow, another Serendipity moment: tying the Reinforcement Learning stuff (expected return) I was seeing to Competitive Programming, as well as to the probability theory I will be learning in my upcoming Statistics class.