Bayesian Inference

Bayesian inference is a framework for updating the probability of an unknown quantity as we gather more data.

The basic setup is the following:

  1. We start with some guess about the distribution of θ (the unknown quantity we want to estimate), which we call the prior distribution
  2. Then, we observe data and update our guessed distribution using Bayes’ Rule (stated just below)

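Concretely, the update in step 2 is Bayes’ Rule: for the parameter θ and observed data X,

    P(θ | X) = P(X | θ) · P(θ) / P(X)

where P(θ) is the prior, P(X | θ) is the likelihood, and P(X) is a normalizing constant (the evidence).
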
Recommendation by soham: https://www.probabilitycourse.com/chapter9/9_1_0_bayesian_inference.php

The core of Bayesian Inference is to combine two different distributions (the likelihood and the prior) into one “smarter” distribution (the posterior).
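
In shorthand, dropping the normalizing constant P(X):

    posterior ∝ likelihood × prior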

Bayesian Inference has three steps:

  1. Calculate the Prior: Choose a PDF to model your parameter θ, aka the prior distribution P(θ). This is your best guess about θ before seeing the data X.
  2. Calculate the Likelihood: Choose a PDF for P(X | θ). Basically, you are modeling what the data X will look like given the parameter θ.
  3. Calculate the Posterior: Calculate the posterior distribution P(θ | X) and pick the θ that has the highest P(θ | X). This is the maximum a posteriori (MAP) estimate; see the sketch after this list.

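Here is a minimal sketch of these three steps in Python, using a hypothetical coin-bias example: a Beta prior over the coin’s heads-probability θ and a Binomial likelihood for the observed flips. The prior parameters and flip counts are made up for illustration.

```python
import numpy as np
from scipy import stats

# Hypothetical data: 10 coin flips, 7 heads (made up for illustration).
heads, flips = 7, 10

# Grid of candidate values for theta, the coin's heads-probability.
theta = np.linspace(0.001, 0.999, 999)

# Step 1 - Prior P(theta): Beta(2, 2), a mild belief that the coin is fair.
prior = stats.beta.pdf(theta, a=2, b=2)

# Step 2 - Likelihood P(X | theta): a Binomial model of the observed flips.
likelihood = stats.binom.pmf(heads, flips, theta)

# Step 3 - Posterior P(theta | X): prior times likelihood, normalized
# over the grid so it integrates to 1.
unnormalized = prior * likelihood
posterior = unnormalized / np.trapz(unnormalized, theta)

# Pick the theta with the highest posterior density (the MAP estimate).
theta_map = theta[np.argmax(posterior)]
print(f"MAP estimate of theta: {theta_map:.3f}")  # ~0.667 for this setup
```

With the Beta(2, 2) prior and 7 heads out of 10 flips, the posterior is Beta(9, 5), whose mode is 8/12 ≈ 0.667, which is what the grid search recovers.
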
Many modern Machine Learning techniques build on Bayesian Inference.