# Bayesian Models

Bayesian statistics uses the concept of a prior belief about the parameter of interest $\theta$. The uncertainty about $\theta$ then changes according to the data $y$. In the Bayesian framework, $\theta$ is interpreted as a random variable, and the prior belief is given in the form of a probability density $\pi(\theta)$. In a Bayesian model we investigate the posterior density $\pi(\theta \mid y)$.
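As a concrete illustration (added here, not part of the original notes), the prior-to-posterior update can be sketched in Python for the textbook conjugate case: a Beta prior on a coin's bias $\theta$ with binomial data, where the posterior is available in closed form.

```python
# Hypothetical example: Beta(a, b) prior on a coin's bias theta.
# After observing k heads in n flips, the posterior is the conjugate
# update Beta(a + k, b + n - k).

def posterior_params(a, b, k, n):
    """Parameters of the Beta posterior after k successes in n trials,
    starting from a Beta(a, b) prior."""
    return a + k, b + (n - k)

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution."""
    return a / (a + b)

# Flat prior Beta(1, 1); observe 7 heads in 10 flips.
a_post, b_post = posterior_params(1, 1, 7, 10)
print(a_post, b_post)             # posterior is Beta(8, 4)
print(beta_mean(a_post, b_post))  # posterior mean 8/12, about 0.667
```

The posterior mean sits between the prior mean (1/2) and the sample proportion (7/10), which is the usual behavior of a Bayesian update.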

Conditional probability. Let $A$ and $B$ be two events where $P(B) > 0$. Then the conditional probability of $A$ given $B$ can be defined as

$$
P(A \mid B) = \frac{P(A \cap B)}{P(B)}.
$$

The idea of "conditioning" is that if we have known that $B$ has occurred, the sample space should have become $B$. It is often the case that one can use $P(A \mid B)$ to find

$$
P(A \cap B) = P(A \mid B)\,P(B).
$$
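The definition can be checked by direct counting; the example below (an illustration added to these notes, with events chosen arbitrarily) enumerates two fair dice and computes $P(A \mid B)$ for $A$ = "the sum is 8" and $B$ = "the first die shows at least 4".

```python
from itertools import product

# Sample space: all ordered outcomes of two fair dice, equally likely.
omega = list(product(range(1, 7), repeat=2))

A = {w for w in omega if w[0] + w[1] == 8}  # event: the sum is 8
B = {w for w in omega if w[0] >= 4}         # event: first die is at least 4

p_B = len(B) / len(omega)            # P(B) = 18/36 = 1/2
p_A_and_B = len(A & B) / len(omega)  # P(A and B) = 3/36
p_A_given_B = p_A_and_B / p_B        # P(A | B) = P(A and B) / P(B) = 1/6

print(p_A_given_B)
```

Conditioning shrinks the sample space from the 36 outcomes of `omega` to the 18 outcomes of `B`, and $P(A \mid B)$ is just the proportion of those 18 outcomes that also lie in $A$.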
Law of total probability. Let $A$ and $B$ be two events. Then we can write the probability $P(A)$ as

$$
P(A) = P(A \mid B)\,P(B) + P(A \mid B^c)\,P(B^c).
$$

In general, suppose that we have a sequence of mutually disjoint events $B_1, \ldots, B_n$ satisfying $\bigcup_{i=1}^{n} B_i = \Omega$, where "mutual disjointness" means that $B_i \cap B_j = \emptyset$ for all $i \neq j$. (The events $B_1, \ldots, B_n$ are called a "partition" of $\Omega$.) Then for any event $A$ we have

$$
P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i).
$$
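The partition formula amounts to a weighted sum; the sketch below (an added illustration with made-up probabilities) computes $P(A)$ by conditioning on a three-event partition.

```python
# Hypothetical setup: a partition B_1, B_2, B_3 with given probabilities,
# and arbitrarily chosen conditional probabilities P(A | B_i).
p_B = [0.5, 0.3, 0.2]          # P(B_i); these must sum to 1 for a partition
p_A_given_B = [0.1, 0.4, 0.9]  # P(A | B_i)

# Law of total probability: P(A) = sum_i P(A | B_i) P(B_i)
p_A = sum(pa * pb for pa, pb in zip(p_A_given_B, p_B))
print(p_A)  # 0.05 + 0.12 + 0.18 = 0.35 (up to floating-point rounding)
```

Each term weights a conditional probability by how likely its piece of the partition is, so $P(A)$ is an average of the $P(A \mid B_i)$ values.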
Bayes rule. Let $A$ and $B_1, \ldots, B_n$ be events such that the $B_i$'s are mutually disjoint, $\bigcup_{i=1}^{n} B_i = \Omega$, and $P(B_i) > 0$ for all $i$. Then

$$
P(B_j \mid A) = \frac{P(A \mid B_j)\,P(B_j)}{\sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)}
$$

is called Bayes rule.
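A classic application of Bayes rule is a diagnostic test; the numbers below are invented for illustration (1% prevalence, 95% sensitivity, 90% specificity) and the partition has just two events, disease and no disease.

```python
# Partition: B_1 = "has disease", B_2 = "no disease" (the complement).
p_disease = 0.01            # P(B_1), the prior (prevalence)
p_pos_given_disease = 0.95  # P(A | B_1), sensitivity
p_pos_given_healthy = 0.10  # P(A | B_2) = 1 - specificity

# Denominator via the law of total probability: P(A) for A = "test positive".
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes rule: P(B_1 | A) = P(A | B_1) P(B_1) / P(A)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # about 0.0876
```

Even with a fairly accurate test, the posterior probability of disease given a positive result stays below 9% here, because the prior $P(B_1)$ is so small.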

Concept of Independence. Intuitively we would like to say that $A$ and $B$ are independent if knowing about one event gives us no information about the other. That is, $P(A \mid B) = P(A)$ and $P(B \mid A) = P(B)$. We say $A$ and $B$ are independent if

$$
P(A \cap B) = P(A)\,P(B).
$$

This definition is symmetric in $A$ and $B$, and allows $P(A)$ and/or $P(B)$ to be $0$. Furthermore, a collection of events $A_1, \ldots, A_n$ is said to be mutually independent if it satisfies

$$
P\Bigl(\bigcap_{i \in I} A_i\Bigr) = \prod_{i \in I} P(A_i)
$$

for any subcollection $I \subseteq \{1, \ldots, n\}$.
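The product definition can be tested by enumeration; in the added example below (events chosen for illustration), "first die is even" and "second die is even" are independent for two fair dice, while "first die is at least 4" and "the sum is 8" are not.

```python
from itertools import product

omega = list(product(range(1, 7), repeat=2))  # two fair dice

def prob(event):
    """Probability of an event under the uniform distribution on omega."""
    return len(event) / len(omega)

def independent(A, B):
    """Check the product definition P(A and B) == P(A) P(B)."""
    return abs(prob(A & B) - prob(A) * prob(B)) < 1e-12

E1 = {w for w in omega if w[0] % 2 == 0}     # first die is even
E2 = {w for w in omega if w[1] % 2 == 0}     # second die is even
S8 = {w for w in omega if w[0] + w[1] == 8}  # the sum is 8
B4 = {w for w in omega if w[0] >= 4}         # first die is at least 4

print(independent(E1, E2))  # True: the two dice do not affect each other
print(independent(B4, S8))  # False: P(B4 and S8) = 1/12, but P(B4)P(S8) = 5/72
```

The failing pair matches the intuition above: knowing the first die is at least 4 changes how likely a sum of 8 is, so the events carry information about each other.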