
Bernoulli Trials

Consider $ n$ independent Bernoulli trials with success probability $ \theta$. Having observed the data $ \mathbf{x} = (x_1,\ldots,x_n)$, we wish to estimate the parameter $ \theta$. In the maximum likelihood method we construct the likelihood function

$\displaystyle L(\theta, \mathbf{x})
= \theta^{k} (1-\theta)^{n-k}$

where $ k = \sum_{i=1}^n x_i$ is the number of successes. Maximizing over $ \theta$ yields the MLE

$\displaystyle \theta^* = \dfrac{k}{n}$
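The MLE is easy to verify by simulation. The following sketch (with an illustrative true value of $ \theta$ and sample size, not taken from the text) simulates the trials and computes $ k/n$:

```r
# Simulate n Bernoulli trials with a chosen true theta, then compute the MLE k/n.
# theta_true and n are illustrative values.
set.seed(1)
n <- 100
theta_true <- 0.3
x <- rbinom(n, size = 1, prob = theta_true)  # n Bernoulli(theta) trials
k <- sum(x)                                  # number of successes
theta_mle <- k / n                           # maximum likelihood estimate
theta_mle
```

With a large $ n$, theta_mle should fall close to theta_true.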

Beta distribution. The pdf

$\displaystyle f(x; \alpha, \beta) = \frac{\Gamma(\alpha+\beta)}{\Gamma(\alpha)\Gamma(\beta)}
x^{\alpha-1} (1 - x)^{\beta-1},
\quad 0 \le x \le 1,$

is known as the beta distribution with parameters $ \alpha$ and $ \beta$. Its mean and variance are $ \dfrac{\alpha}{\alpha+\beta}$ and $ \dfrac{\alpha\beta}{(\alpha+\beta)^2 (\alpha+\beta+1)}$, respectively, and the mode (at which the density is maximized) is $ \dfrac{\alpha-1}{\alpha+\beta-2}$ when $ \alpha > 1$ and $ \beta > 1$. The function
> rbeta(n,alpha,beta)
generates $ n$ independent random samples from the beta distribution.
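The moment formulas above can be checked against rbeta() samples. In this sketch the parameter values $ \alpha = 2$ and $ \beta = 5$ are arbitrary illustrative choices:

```r
# Compare the sample mean and variance of rbeta() draws with the theoretical
# values alpha/(alpha+beta) and alpha*beta/((alpha+beta)^2*(alpha+beta+1)).
set.seed(1)
alpha <- 2
beta  <- 5
x <- rbeta(100000, alpha, beta)

theo_mean <- alpha / (alpha + beta)
theo_var  <- alpha * beta / ((alpha + beta)^2 * (alpha + beta + 1))

c(sample_mean = mean(x), theoretical_mean = theo_mean)
c(sample_var  = var(x),  theoretical_var  = theo_var)
```

With 100000 draws the sample moments should agree with the theoretical values to two or three decimal places.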

Bayes estimate. With the uninformative prior $ f(\theta; 1, 1) \equiv 1$, the posterior density is given by $ f(\theta; k+1, n-k+1)$, and the posterior expected value of $ \theta$ is calculated as

$\displaystyle \bar{\theta} = \dfrac{k+1}{n+2}$
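The two estimates are straightforward to compare side by side. In this sketch the values of $ n$ and $ k$ are illustrative:

```r
# Posterior mean (k+1)/(n+2) under the flat Beta(1,1) prior, compared
# with the MLE k/n.  n and k are illustrative values.
n <- 10
k <- 7
theta_mle   <- k / n              # maximum likelihood estimate
theta_bayes <- (k + 1) / (n + 2)  # posterior mean with Beta(1,1) prior
c(MLE = theta_mle, Bayes = theta_bayes)
```

Note that the Bayes estimate is pulled slightly toward $ 1/2$ relative to the MLE, and the two agree as $ n$ grows.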

Explore it. Download bernoulli.r. The function bernoulli() generates data and compares the estimates from the two methods. See how they differ in a particular outcome, and repeat the experiment with the same sample size. Then increase the size, and observe how the two estimates become similar.

> source("bernoulli.r")
> bernoulli()
> bernoulli(size=20)
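For readers without access to bernoulli.r, here is a minimal hypothetical sketch of what such a function might do; the actual code in bernoulli.r may differ in its defaults, output format, and plotting:

```r
# Hypothetical sketch of a bernoulli() function: simulate `size` Bernoulli
# trials with success probability `theta`, then report the MLE k/n and the
# Bayes estimate (k+1)/(n+2) side by side.  Not the actual bernoulli.r code.
bernoulli <- function(size = 10, theta = 0.5) {
  x <- rbinom(n = size, size = 1, prob = theta)  # simulated trials
  k <- sum(x)                                    # number of successes
  c(mle = k / size, bayes = (k + 1) / (size + 2))
}

bernoulli()
bernoulli(size = 20)
```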

© TTU Mathematics