These axioms are known as the Kolmogorov Axioms, after Andrey Kolmogorov who developed them.
When the sample space is finite or countably infinite, a probability function can also be defined by its values on the elementary events {e1}, {e2}, ... where S = {e1, e2, ...}.
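As a minimal sketch of this construction, assuming a fair six-sided die as the sample space (the names `elementary` and `P` are illustrative, not from the original):

```python
from fractions import Fraction

# Sketch: a probability function on a countable sample space, defined
# entirely by its values on the elementary events {e1}, {e2}, ...
# Here each elementary event of a fair die gets probability 1/6.
elementary = {e: Fraction(1, 6) for e in range(1, 7)}

def P(event):
    """Probability of an event (a subset of the sample space),
    obtained by summing the elementary probabilities."""
    return sum(elementary[e] for e in event)

# The measure of the whole sample space equals 1, as the axioms require.
assert P(set(elementary)) == 1
```

Any event's probability then follows by addition, e.g. `P({1, 2, 3})` gives 1/2.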
Alternatively, a probability is a measure on a set of events, such that the measure of the whole set equals 1. This property is important, since it gives rise to the natural concept of conditional probability. Every set A with non-zero probability defines another probability on the space: P(B|A) = P(B intersection A)/P(A). This is usually read as "probability of B given A". B and A are said to be independent if the conditional probability of B given A is the same as the probability of B.
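The definitions of conditional probability and independence can be sketched concretely, again assuming a fair die (the helper names `P` and `P_given` are illustrative):

```python
from fractions import Fraction

# Uniform measure on a fair six-sided die.
elementary = {e: Fraction(1, 6) for e in range(1, 7)}

def P(event):
    """Probability of an event as a sum of elementary probabilities."""
    return sum(elementary[e] for e in event)

def P_given(B, A):
    """P(B|A) = P(B intersection A) / P(A), defined when P(A) > 0."""
    return P(B & A) / P(A)

A = {2, 4, 6}      # the roll is even
B = {1, 2, 3, 4}   # the roll is at most 4

# P(B|A) = P({2,4}) / P({2,4,6}) = (2/6) / (3/6) = 2/3, and P(B) = 2/3,
# so here B is independent of A: conditioning on A leaves P(B) unchanged.
assert P_given(B, A) == P(B)
```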
See also frequency probability -- personal probability -- eclectic probability -- statistical regularity