Once we have identified a statistical model for our data, the next goal is to estimate the parameters of that model. This process is called estimation.

Estimation = “What is our best guess for these unknown parameters?”

We use Estimators to do estimation.


  • The Problem: $X_1, X_2, \dots, X_n$ is a sequence of i.i.d. r.v. with pmf $p(x; \theta^*)$, where $\theta^*$ are the unknown true parameters
  • Data: the observed values $x_1, x_2, \dots, x_n$
  • The Goal: to construct $\hat{\theta}$, which is an estimate of $\theta^*$ (the true parameters)
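As a concrete sketch of this setup (the Bernoulli model and the numbers here are illustrative assumptions, not from the notes), suppose the $X_i$ are i.i.d. Bernoulli($\theta^*$); one natural estimator $\hat{\theta}$ is the sample mean:

```python
import random

random.seed(0)

theta_star = 0.7  # the unknown true parameter (we pretend not to know it)
n = 1000          # sample size

# Data: n i.i.d. Bernoulli(theta_star) observations x_1, ..., x_n
x = [1 if random.random() < theta_star else 0 for _ in range(n)]

# Estimator: a function of the data only, here the sample mean
theta_hat = sum(x) / n
print(theta_hat)  # should land close to theta_star = 0.7
```

The key point: `theta_hat` is computed from the data alone, without knowing `theta_star`; a good estimator tends to be close to the true parameter.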

Naive Approach

We can test various candidate parameters. Suppose a coin is tossed $n = 100$ times, and let $X$ be the number of heads, so $X \sim \text{Bin}(100, p)$. We do not know whether the coin is fair ($p = 0.5$) or biased towards heads ($p = 0.7$); this is the parameter in the Binomial Distribution that we are trying to guess.

We run the experiment and we observe 60 heads. We can then “test” these two parameters of the Binomial Distribution, where we find that:

$$P(X = 60 \mid p = 0.5) = \binom{100}{60} (0.5)^{60} (0.5)^{40} \approx 0.0108$$

$$P(X = 60 \mid p = 0.7) = \binom{100}{60} (0.7)^{60} (0.3)^{40} \approx 0.0085$$

Therefore, you conclude that $p = 0.5$ is more likely!
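This comparison can be checked numerically (a quick sketch assuming the two candidate values are $p = 0.5$ and $p = 0.7$; `math.comb` supplies the binomial coefficient):

```python
from math import comb

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Bin(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

p_fair = binom_pmf(60, 100, 0.5)    # ≈ 0.0108
p_biased = binom_pmf(60, 100, 0.7)  # ≈ 0.0085
print(p_fair, p_biased)
```

Even though 60 heads looks closer to a 0.7 coin than a fair one, the fair coin assigns slightly higher probability to this exact outcome.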

Now, instead of testing only these two parameters, what if I asked you: out of all possible parameters (which in this case range from $0$ to $1$), which one is the most likely?

We can quantify the likelihood of a parameter with the Likelihood Function.

In this example, the likelihood function of a particular parameter $p$ given our observations would be

$$L(p) = P(X = 60 \mid p) = \binom{100}{60} p^{60} (1 - p)^{40}.$$

We want to find the parameter $p$ that maximizes $L(p)$.
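A brute-force sketch of this maximization, evaluating the likelihood on a grid of candidate parameters in $[0, 1]$ (assuming the 60-heads-in-100-tosses data from above):

```python
from math import comb

def likelihood(p, k=60, n=100):
    """L(p) = C(n, k) p^k (1-p)^(n-k): probability of the observed data under p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Evaluate L on a fine grid of candidate parameters in [0, 1]
grid = [i / 1000 for i in range(1001)]
p_hat = max(grid, key=likelihood)
print(p_hat)  # → 0.6, the sample proportion k/n
```

The grid search lands exactly on $\hat{p} = 60/100 = 0.6$, which matches the intuition that the observed proportion of heads is the most plausible value of $p$.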

There exist multiple methods for Estimation, such as Maximum Likelihood Estimation (MLE) and the Method of Moments.

In STAT206, we focus on MLE.