Probability axioms


Showing revision 9
The probability P of an event E (denoted P(E)) is defined with respect to a "universe" or sample space S of all possible elementary events, in such a way that P must satisfy these axioms:

  1. For any event E, 0 <= P(E) <= 1. That is, the probability of an event is represented by a real number between 0 and 1.
  2. P(S) = 1. That is, the probability that some event in the entire sample space will occur is 1, i.e. certainty. More specifically, there are no possible outcomes outside the sample space. This is often overlooked in mistaken probability calculations: if you cannot precisely define the whole sample space, then the probability of any subset of it cannot be defined either.
  3. Any sequence of mutually disjoint events E1, E2, ... satisfies P(E1 ∪ E2 ∪ ...) = ∑ P(Ei). That is, the probability of an event which is the union of pairwise disjoint subsets is the sum of the probabilities of those subsets. This is called σ-additivity (countable additivity). If there is any overlap among the subsets (that is, if they are not pairwise disjoint), this relation need not hold.
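As a sketch, the three axioms can be checked numerically on a small finite sample space. The choice of a fair six-sided die here is an illustrative assumption; the article fixes no particular example.

```python
from fractions import Fraction
from itertools import combinations

# Assumed example: a fair six-sided die with the uniform measure.
S = frozenset(range(1, 7))

def P(E):
    """Probability of an event E (a subset of S) under the uniform measure."""
    return Fraction(len(E), len(S))

# Axiom 1: 0 <= P(E) <= 1 for every event E (every subset of S).
events = [frozenset(c) for r in range(len(S) + 1) for c in combinations(S, r)]
assert all(0 <= P(E) <= 1 for E in events)

# Axiom 2: P(S) = 1.
assert P(S) == 1

# Axiom 3 (additivity, finite case): for disjoint events,
# P(E1 ∪ E2) = P(E1) + P(E2).
E1, E2 = frozenset({1, 2}), frozenset({3, 4, 5})
assert E1.isdisjoint(E2)
assert P(E1 | E2) == P(E1) + P(E2)
```

Note that additivity is stated only for disjoint events: with overlapping events such as {1, 2} and {2, 3}, P of the union is 1/2, not the sum 2/3.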

These axioms are known as the Kolmogorov axioms, after Andrey Kolmogorov, who developed them.

In the event that the sample space is finite or countably infinite, a probability function can also be defined by its values on the elementary events {e1}, {e2}, ..., where S = {e1, e2, ...}. These values must be nonnegative and sum to 1, and the probability of any event is then the sum of the values of the elementary events it contains.
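A minimal sketch of this construction, assuming a biased coin as the example (the specific outcome names and weights are ours, not the article's):

```python
# Values assigned to the elementary events; nonnegative and summing to 1.
p = {"heads": 0.6, "tails": 0.4}

def P(E):
    """P(E) = sum of the elementary probabilities of the outcomes in E."""
    return sum(p[e] for e in E)

# Axiom 2 is recovered: the whole sample space has probability 1.
assert abs(P(p.keys()) - 1.0) < 1e-12
# An elementary event's probability is just its assigned value.
assert abs(P({"heads"}) - 0.6) < 1e-12
```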

Alternatively, a probability is a measure on a σ-algebra of events, such that the measure of the whole sample space equals 1.


From these axioms one can deduce other useful rules for calculating probabilities. For example: P(∅) = 0; the probability of the complement of an event is P(S \ E) = 1 − P(E); and for any two events A and B, P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
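These derived rules can be verified numerically on a small example. A fair die under the uniform measure is an assumed example space here; the rules themselves follow from the axioms for any probability function.

```python
from fractions import Fraction

# Assumed example: a fair six-sided die with the uniform measure.
S = frozenset(range(1, 7))

def P(E):
    """Probability of an event E (a subset of S) under the uniform measure."""
    return Fraction(len(E), len(S))

A, B = frozenset({1, 2, 3}), frozenset({3, 4})

assert P(frozenset()) == 0                 # P(empty set) = 0
assert P(S - A) == 1 - P(A)                # complement rule
assert P(A | B) == P(A) + P(B) - P(A & B)  # inclusion-exclusion for two events
```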

See also frequency probability -- personal probability -- eclectic probability -- statistical regularity




Edited September 13, 2001 11:34 am by Iwnbap (diff)