
thus conditioning can magnify probabilities by a factor of at most 1/P(E).
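This bound is immediate from the definition of conditional probability; as a one-line sketch, for any events F and E with P(E) > 0:

```latex
\Pr(F \mid E) = \frac{\Pr(F \cap E)}{\Pr(E)} \le \frac{\Pr(F)}{\Pr(E)},
```

since F ∩ E ⊆ F implies Pr(F ∩ E) ≤ Pr(F).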

In particular:

(i) If F occurs unconditionally surely, it occurs surely conditioning on E also.

(ii) If F occurs unconditionally almost surely, it occurs almost surely conditioning on E also.

(iii) If F occurs unconditionally with overwhelming probability, it occurs with overwhelming probability conditioning on E also, provided that P(E) ≥ cn^{−C} for some c, C > 0 independent of n.

(iv) If F occurs unconditionally with high probability, it occurs with high probability conditioning on E also, provided that P(E) ≥ cn^{−a} for some c > 0 and some sufficiently small a > 0 independent of n.

(v) If F occurs unconditionally asymptotically almost surely, it occurs asymptotically almost surely conditioning on E also, provided that P(E) ≥ c for some c > 0 independent of n.
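As a quick numerical illustration of the magnification bound (a sketch not taken from the text; the sample space is an arbitrary choice): on a small finite probability space one can verify directly that conditioning increases the probability of an event F by a factor of at most 1/P(E).

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, each outcome with probability 1/4.
omega = list(product("HT", repeat=2))
p = {w: Fraction(1, 4) for w in omega}

E = {w for w in omega if w[0] == "H"}        # first flip is heads
F = {w for w in omega if w.count("H") == 2}  # both flips are heads

P = lambda A: sum(p[w] for w in A)
P_F_given_E = P(F & E) / P(E)

# Conditioning magnifies P(F) by a factor of at most 1/P(E):
assert P_F_given_E <= P(F) / P(E)
print(P(F), P_F_given_E)  # prints 1/4 1/2
```

Here P(F) = 1/4 unconditionally but P(F|E) = 1/2, exactly the factor 1/P(E) = 2 allowed by the bound.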

Conditioning can distort the probability of events and the distribution of random variables. Most obviously, conditioning on E elevates the probability of E to 1, and sends the probability of the complementary event E̅ to zero. In a similar spirit, if X is a random variable uniformly distributed on some finite set S, and S′ is a non-empty subset of S, then conditioning to the event X ∈ S′ alters the distribution of X to now become the uniform distribution on S′ rather than S (and conditioning to the complementary event produces the uniform distribution on S\S′).
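A small sketch of this renormalisation (the sets S and S′, modelled here as a fair die and its even faces, are illustrative choices, not from the text): conditioning a uniform variable on membership in a subset yields the uniform distribution on that subset.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # X uniform on S, e.g. a fair die
Sprime = {2, 4, 6}       # a non-empty subset S' of S

# Unconditional distribution of X: uniform on S.
dist = {x: Fraction(1, len(S)) for x in S}

# Condition on the event X ∈ S': restrict and renormalise the mass.
p_event = sum(dist[x] for x in Sprime)
cond = {x: dist[x] / p_event for x in Sprime}

# The result is exactly the uniform distribution on S'.
assert cond == {x: Fraction(1, len(Sprime)) for x in Sprime}

# Conditioning on the complementary event gives uniform on S \ S'.
rest = S - Sprime
cond_c = {x: dist[x] / sum(dist[y] for y in rest) for x in rest}
assert cond_c == {x: Fraction(1, len(rest)) for x in rest}
```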

However, events and random variables that are independent of the event E being conditioned upon are essentially unaffected by conditioning. Indeed, if F is an event independent of E, then (F|E) occurs with the same probability as F; and if X is a random variable independent of E (or equivalently, independent of the indicator I(E)), then (X|E) has the same distribution as X.
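A sketch illustrating the independence claim (the specific variables, a first coin flip X and an event E on the second flip, are illustrative choices): when X is independent of E, the distribution of (X|E) coincides with that of X.

```python
from fractions import Fraction
from itertools import product

# Two independent fair coin flips; X is the first flip, E depends only
# on the second flip, so X is independent of E (and of I(E)).
omega = list(product("HT", repeat=2))
p = {w: Fraction(1, 4) for w in omega}

E = {w for w in omega if w[1] == "H"}  # event on the second flip only

def dist_of_X(weights):
    """Distribution of X = first flip under the given (renormalised) weights."""
    total = sum(weights.values())
    d = {"H": Fraction(0), "T": Fraction(0)}
    for w, q in weights.items():
        d[w[0]] += q / total
    return d

unconditional = dist_of_X(p)
conditional = dist_of_X({w: p[w] for w in E})  # distribution of (X|E)

assert conditional == unconditional  # independence: same distribution
```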

Remark 1.1.19. One can view conditioning to an event E and its complement E̅ as the probabilistic analogue of the law of the excluded middle. In deterministic logic, given a statement P, one can divide into two separate cases, depending on whether P is true or false; and any other statement Q is unconditionally true if and only if it is conditionally true in both of these two cases. Similarly, in probability theory, given an event E, one can condition into two separate sample spaces, depending on whether E is conditioned to be true or false; and the unconditional statistics of any random variable or event are then a weighted average of the conditional statistics on the two