
Dirichlet Distribution

Let $ \alpha_1, \ldots, \alpha_k > 0$ be parameters. Then we can define the pdf

$\displaystyle \pi(\theta; \alpha_1, \ldots, \alpha_k)
= \dfrac{\Gamma(\alpha_1 + \cdots + \alpha_k)}{\Gamma(\alpha_1) \cdots \Gamma(\alpha_k)}\,
\theta_1^{\alpha_1 -1} \cdots \theta_k^{\alpha_k -1}
$

over the simplex

$\displaystyle \mathcal{Q} = \{\theta\in [0,1]^k: \theta_1+\cdots+\theta_k = 1\} .
$

This is called the Dirichlet distribution.
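As a quick sketch using only the Python standard library, the density above can be evaluated directly from its definition (the function name `dirichlet_pdf` is our own choice, not part of any library):

```python
from math import gamma, prod

def dirichlet_pdf(theta, alpha):
    """Evaluate the Dirichlet density at a point theta on the simplex."""
    assert abs(sum(theta) - 1.0) < 1e-9, "theta must lie on the simplex"
    # Normalizing constant: Gamma(alpha_1 + ... + alpha_k) / (Gamma(alpha_1) ... Gamma(alpha_k))
    const = gamma(sum(alpha)) / prod(gamma(a) for a in alpha)
    return const * prod(t ** (a - 1) for t, a in zip(theta, alpha))

# With all alpha_j = 1 the density is constant over the simplex;
# for k = 3 the normalizing constant is Gamma(3) = 2, so the pdf equals 2 everywhere.
```

For instance, `dirichlet_pdf([0.2, 0.3, 0.5], [1.0, 1.0, 1.0])` returns `2.0`, illustrating that the case $\alpha_1 = \cdots = \alpha_k = 1$ is the uniform distribution on the simplex.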

Multinomial distribution. Suppose that $ \theta_j$ represents the probability of the $ j$-th outcome for each $ j=1,\ldots,k$, and that $ n$ outcomes are drawn independently according to the probabilities $ \theta_1,\ldots,\theta_k$. Then for each $ j$ we can count the number $ m_j$ of times the $ j$-th outcome occurs, and the chance of observing the counts $ (m_1,\ldots,m_k)$ is given by the multinomial distribution

$\displaystyle f(m_1,\ldots,m_k; \theta)
= \dfrac{n!}{m_1! \cdots m_k!}
\theta_1^{m_1} \cdots \theta_k^{m_k}
$

over

$\displaystyle \mathcal{Z}^\dagger = \{(m_1,\ldots,m_k): m_j \ge 0, \; m_1+\cdots+m_k = n\} .
$

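Continuing the stdlib-only sketch, the multinomial pmf can be computed from the formula above (again, `multinomial_pmf` is a hypothetical helper name):

```python
from math import factorial, prod

def multinomial_pmf(m, theta):
    """Probability of the counts m = (m_1,...,m_k) in n = sum(m) independent
    trials with outcome probabilities theta."""
    n = sum(m)
    # Multinomial coefficient n! / (m_1! ... m_k!)
    coef = factorial(n) / prod(factorial(mj) for mj in m)
    return coef * prod(t ** mj for t, mj in zip(theta, m))

# With k = 2 this reduces to the binomial distribution:
# multinomial_pmf([2, 1], [0.5, 0.5]) gives 3 * (1/2)^3 = 0.375.
```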
Conjugate family of Dirichlet distribution. Let $ \pi(\theta; \alpha_1, \ldots, \alpha_k)$ be the prior density for $ \theta$ with hyperparameter $ (\alpha_1, \ldots, \alpha_k)$. Given the data from the multinomial distribution with parameter $ \theta$, we can obtain the posterior density

$\displaystyle \pi(\theta\:\vert\: m_1,\ldots,m_k)
\propto \theta_1^{\alpha_1 + m_1 - 1} \cdots \theta_k^{\alpha_k + m_k - 1}
$

which is the Dirichlet distribution $ \pi(\theta; \alpha_1+m_1, \ldots, \alpha_k+m_k)$.
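Because of conjugacy, the Bayesian update amounts to adding the observed counts to the hyperparameters; a minimal sketch (with the hypothetical name `dirichlet_posterior`):

```python
def dirichlet_posterior(alpha, m):
    """Conjugate update: a Dirichlet(alpha) prior combined with multinomial
    counts m yields a Dirichlet(alpha_1 + m_1, ..., alpha_k + m_k) posterior."""
    return [a + mj for a, mj in zip(alpha, m)]

# Starting from the uniform prior alpha = (1, 1, 1) and observing
# counts (3, 2, 5) gives the posterior Dirichlet(4, 3, 6).
```

This closed-form update is what makes the Dirichlet prior convenient in Gibbs samplers and other MCMC schemes for multinomial data.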


© TTU Mathematics