Wednesday, June 08, 2005

Bayes Theorem

Taken from : http://www.niedermayer.ca/papers/bayesian/bayes.html

Mathematically it is expressed as:

P(H|E,c) = [P(H|c) * P(E|H,c)] / P(E|c)

where we can update our belief in hypothesis H given the additional evidence E and the background context c.

The left-hand term, P(H|E,c), is known as the "posterior probability," or the probability of H after considering the effect of E given c.

The term P(H|c) is called the "prior probability" of H given c alone.

The term P(E|H,c) is called the "likelihood" and gives the probability of the evidence assuming that the hypothesis H and the background information c are true.

Finally, the last term, P(E|c), is independent of H and can be regarded as a normalizing or scaling factor.

It is important to note that all of these probabilities are conditional. They specify the degree of belief in some proposition or propositions based on the assumption that some other propositions are true. As such, the theory has no meaning without prior resolution of the probability of these antecedent propositions.
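The update rule above can be sketched numerically. The following is a minimal illustration, not from the source paper: the function names and the disease-test numbers (base rate, sensitivity, false-positive rate) are made up, and the background context c is left implicit. The normalizing term P(E|c) is expanded via the law of total probability.

```python
def posterior(prior_h, likelihood_e_given_h, likelihood_e_given_not_h):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E),
    where P(E) = P(E|H)*P(H) + P(E|~H)*P(~H) is the normalizing factor."""
    p_e = (likelihood_e_given_h * prior_h
           + likelihood_e_given_not_h * (1 - prior_h))
    return likelihood_e_given_h * prior_h / p_e

# Hypothetical example: H = "patient has the disease",
# E = "diagnostic test is positive" (all numbers invented).
p = posterior(prior_h=0.01,                  # P(H|c): 1% base rate
              likelihood_e_given_h=0.95,     # P(E|H,c): sensitivity
              likelihood_e_given_not_h=0.05) # P(E|~H,c): false positives
print(round(p, 3))  # posterior belief after a positive test
```

Note how the posterior (about 0.16) stays far below the test's 0.95 sensitivity: the low prior dominates, which is exactly the kind of belief revision the theorem formalizes.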
