Independence (probability theory)

Stated in terms of log probability, two events are independent if and only if the log probability of the joint event is the sum of the log probabilities of the individual events:

$\log P(A \cap B) = \log P(A) + \log P(B)$

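A quick numerical check of the log-additivity condition, using a hypothetical sample space of two fair dice with illustrative events A ("first die is even") and B ("second die shows a 6"), which are independent by construction:

```python
import math

# Sample space of two fair six-sided dice (hypothetical illustration).
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def prob(event):
    """Probability of an event, given as a predicate on outcomes."""
    return sum(1 for w in omega if event(w)) / len(omega)

A = lambda w: w[0] % 2 == 0   # first die shows an even number
B = lambda w: w[1] == 6       # second die shows a 6

p_joint = prob(lambda w: A(w) and B(w))
# Independence in log form: log P(A ∩ B) == log P(A) + log P(B)
print(math.isclose(math.log(p_joint), math.log(prob(A)) + math.log(prob(B))))  # True
```

Here P(A) = 1/2, P(B) = 1/6, and P(A ∩ B) = 1/12, so the logs add exactly.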
Equivalently, stated in terms of odds, two events are independent if and only if the odds of one event given the other event are the same as the odds of that event given that the other event does not occur:

$O(A \mid B) = O(A \mid \neg B)$

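The odds form of the condition can be checked on the same kind of toy example. Below, `odds(A, C)` computes P(A | C) / P(¬A | C) over a hypothetical two-dice sample space; for the independent events chosen, the odds of A are the same whether we condition on B or on its complement:

```python
import math

omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]  # two fair dice

def prob(event):
    return sum(1 for w in omega if event(w)) / len(omega)

def odds(event, given):
    """Conditional odds: P(event | given) / P(not event | given)."""
    p = prob(lambda w: event(w) and given(w)) / prob(given)
    return p / (1 - p)

A = lambda w: w[0] % 2 == 0   # first die even
B = lambda w: w[1] == 6       # second die shows a 6
not_B = lambda w: not B(w)

# Independence: conditioning on B or on not-B leaves the odds of A unchanged.
print(math.isclose(odds(A, B), odds(A, not_B)))  # True
```

Since P(A | B) = P(A | ¬B) = 1/2, both conditional odds equal 1.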
A finite family of σ-algebras $(\mathcal{A}_i)_{i \in I}$ on a probability space is said to be independent if and only if

$P\left(\bigcap_{i \in I} A_i\right) = \prod_{i \in I} P(A_i)$ for every choice of events $A_i \in \mathcal{A}_i$,

and an infinite family of σ-algebras is said to be independent if all its finite subfamilies are independent.
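The definition can be made concrete on a tiny sample space. This sketch (names and setup are illustrative assumptions, not from the source) takes two coin flips, builds the σ-algebras generated by the first and second flip, and verifies the product rule for every pair of events drawn from them:

```python
from itertools import product

# Sample space: two fair coin flips (hypothetical illustration).
omega = frozenset(product("HT", "HT"))

def sigma_from(partition):
    """σ-algebra generated by a finite partition: all unions of its blocks."""
    blocks = list(partition)
    algebra = set()
    for mask in range(2 ** len(blocks)):
        members = (blocks[i] for i in range(len(blocks)) if mask >> i & 1)
        algebra.add(frozenset().union(*members))
    return algebra

first  = [frozenset(w for w in omega if w[0] == s) for s in "HT"]
second = [frozenset(w for w in omega if w[1] == s) for s in "HT"]
A1, A2 = sigma_from(first), sigma_from(second)

P = lambda e: len(e) / len(omega)
# Independence of σ-algebras: P(E1 ∩ E2) = P(E1) P(E2)
# for every E1 in A1 and E2 in A2.
independent = all(abs(P(e1 & e2) - P(e1) * P(e2)) < 1e-12
                  for e1 in A1 for e2 in A2)
print(independent)  # True
```

Each σ-algebra here has four events (∅, one outcome of its flip, the other outcome, and Ω), so the check covers all 16 pairs.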

The converse does not hold: two random variables with a covariance of 0 may still fail to be independent. See uncorrelated.
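The standard counterexample: take X uniform on {−1, 0, 1} and Y = X². The covariance is zero, yet Y is a deterministic function of X, so the two are dependent:

```python
from fractions import Fraction as F

# X uniform on {-1, 0, 1}; Y = X**2. Cov(X, Y) = 0, but X and Y
# are dependent (Y is determined by X).
xs = [-1, 0, 1]
p = F(1, 3)

e_x  = sum(p * x        for x in xs)   # E[X]  = 0
e_y  = sum(p * x ** 2   for x in xs)   # E[Y]  = 2/3
e_xy = sum(p * x ** 3   for x in xs)   # E[XY] = E[X^3] = 0
cov = e_xy - e_x * e_y
print(cov)  # 0

# Dependence: P(X=0, Y=0) = 1/3, but P(X=0) * P(Y=0) = 1/9.
p_joint = p        # the only outcome with X = 0 also has Y = 0
p_prod  = p * p    # P(X=0) * P(Y=0), since P(Y=0) = 1/3
print(p_joint != p_prod)  # True
```

Exact rational arithmetic via `fractions` makes the zero covariance an identity rather than a floating-point coincidence.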

In particular, the characteristic function of their sum is the product of their marginal characteristic functions:

$\varphi_{X+Y}(t) = \varphi_X(t)\,\varphi_Y(t)$

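A minimal sketch of this identity for discrete distributions, assuming two independent fair dice: the characteristic function of the sum, computed from the convolved distribution, matches the product of the two marginal characteristic functions at an arbitrary point t:

```python
import cmath

def char_fn(pmf, t):
    """Characteristic function E[exp(i t X)] of a discrete distribution."""
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

die = {k: 1 / 6 for k in range(1, 7)}   # one fair die

# Distribution of the sum of two independent dice (convolution).
two = {}
for a in die:
    for b in die:
        two[a + b] = two.get(a + b, 0) + die[a] * die[b]

t = 0.7  # arbitrary evaluation point
lhs = char_fn(two, t)
rhs = char_fn(die, t) * char_fn(die, t)
print(cmath.isclose(lhs, rhs))  # True
```

The same check at several values of t would distinguish genuine equality of functions from a coincidence at one point.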
though the reverse implication is not true. Random variables that satisfy the latter condition are called subindependent.

Independence can be seen as a special kind of conditional independence, since probability can be seen as a kind of conditional probability given no conditioning event.