Statistical inference is the process of deducing properties of an underlying probability distribution from observed data.
Bayes' theorem relates the posterior distribution of a parameter to the prior and the likelihood:

$$P(\theta \mid x) = \frac{P(x \mid \theta)\,P(\theta)}{P(x)}.$$
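Bayes' theorem can be checked with a small numeric sketch. The numbers below are hypothetical (a diagnostic test with a 1% base rate, 95% sensitivity, and a 5% false-positive rate); they are chosen only to show how the prior, likelihood, and evidence combine into a posterior.

```python
# Bayes' theorem for a binary hypothesis: P(H|D) = P(D|H) P(H) / P(D),
# illustrated with hypothetical numbers for a diagnostic test.
p_h = 0.01              # prior: 1% of the population has the condition
p_d_given_h = 0.95      # sensitivity: P(positive | condition)
p_d_given_not_h = 0.05  # false-positive rate: P(positive | no condition)

# Evidence P(D) via the law of total probability
p_d = p_d_given_h * p_h + p_d_given_not_h * (1 - p_h)

# Posterior probability of the condition given a positive result
p_h_given_d = p_d_given_h * p_h / p_d
print(round(p_h_given_d, 3))
```

Despite the high sensitivity, the low base rate keeps the posterior modest, which is exactly the prior-times-likelihood interplay the theorem captures.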
In statistics, the likelihood function has a very precise definition:
$$L(\theta \mid x) = P(x \mid \theta).$$

The concept of likelihood plays a fundamental role in both Bayesian and frequentist statistics.
1. Choose a probability distribution from which to generate i.i.d. samples.
2. Choose a sample size \(n\) and draw one sample of size \(n\) from your chosen distribution.
3. Use the slider to trace out the likelihood function for the given sample.
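The steps above can be sketched in code. As an assumed example, the distribution is a normal with unknown mean \(\mu\) and known unit variance; sweeping a grid of candidate \(\mu\) values plays the role of the slider, and the grid point maximizing the log-likelihood lands at the maximum-likelihood estimate (the sample mean for this model).

```python
import numpy as np

rng = np.random.default_rng(0)

# Steps 1-2: draw n i.i.d. points from a chosen distribution
# (here, as an assumption, a normal with unknown mean and sigma = 1)
true_mu, n = 2.0, 50
x = rng.normal(true_mu, 1.0, size=n)

# Step 3: evaluate the log-likelihood l(mu | x) = sum_i log P(x_i | mu)
# over a grid of candidate mu values (the role of the slider)
grid = np.linspace(0.0, 4.0, 401)
log_lik = np.array([-0.5 * np.sum((x - mu) ** 2) for mu in grid])  # up to a constant

mle = grid[np.argmax(log_lik)]  # for this model, the maximizer is the sample mean
print(mle, x.mean())
```

The log-likelihood is used instead of the likelihood itself purely for numerical convenience; both peak at the same \(\mu\).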
At the core of Bayesian statistics is the idea that prior beliefs should be updated as new data is acquired. Use the blue slider to choose the true bias of the coin (which would be unknown in practice). The green sliders control the shape of the initial prior. As we acquire data in the form of coin tosses, we update the distribution on \(p\), the coin's probability of heads. This updated distribution (the posterior) then serves as the prior for future coin tosses.
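This updating loop can be sketched with a Beta prior on \(p\), which is conjugate to the Bernoulli coin toss: each heads increments the first shape parameter and each tails the second, so the posterior after every toss is again a Beta and serves directly as the next prior. The specific numbers (true bias 0.7, a Beta(2, 2) prior, 200 tosses) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

p_true = 0.7     # true bias of the coin (the blue slider; unknown in practice)
a, b = 2.0, 2.0  # Beta(a, b) prior on p (the shape the green sliders control)

# Each toss updates the Beta prior in closed form:
# heads -> a += 1, tails -> b += 1 (Beta is conjugate to the Bernoulli)
for _ in range(200):
    heads = rng.random() < p_true
    if heads:
        a += 1
    else:
        b += 1

posterior_mean = a / (a + b)  # drifts toward p_true as tosses accumulate
print(round(posterior_mean, 3))
```

With enough tosses the posterior concentrates around the true bias regardless of the initial prior shape, which is the behavior the sliders let you explore interactively.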