1.1. A review of probability theory 21

(ii) Show that for any k ≥ 2, (X1,...,Xn) is k-wise independent if and only if V is not contained in any hyperplane which is definable using at most k of the coordinate variables.

(iii) Show that (X1,...,Xn) is jointly independent if and only if V = F^n.

Informally, we thus see that imposing constraints between k variables at a time can destroy k-wise independence, while leaving lower-order independence unaffected.
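As a concrete instance of this phenomenon (an illustrative Python check, not part of the text): over F_2, imposing the single three-variable constraint x1 + x2 + x3 = 0 yields a triple that is pairwise independent but not jointly independent, as direct enumeration confirms.

```python
from itertools import product

# V = {x in F_2^3 : x1 + x2 + x3 = 0 (mod 2)}; draw (X1,X2,X3) uniformly from V.
V = [x for x in product((0, 1), repeat=3) if sum(x) % 2 == 0]

def prob(event):
    # Probability of an event under the uniform distribution on V.
    return sum(1 for x in V if event(x)) / len(V)

# Pairwise independence: P(Xi=a, Xj=b) = P(Xi=a) P(Xj=b) for all pairs i < j.
pairwise = all(
    prob(lambda x: x[i] == a and x[j] == b)
    == prob(lambda x: x[i] == a) * prob(lambda x: x[j] == b)
    for i in range(3) for j in range(i + 1, 3)
    for a in (0, 1) for b in (0, 1)
)

# Joint independence fails: P(X1=X2=X3=1) = 0, but each marginal is 1/2.
joint = prob(lambda x: x == (1, 1, 1)) == (1 / 2) ** 3

print(pairwise, joint)  # True False
```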

Exercise 1.1.13. Let V ⊂ F_2^3 be the subspace of triples (x1,x2,x3) ∈ F_2^3 with x1 + x2 = 0, and let (X1,X2,X3) be drawn uniformly at random from V. Then X3 is independent of (X1,X2) (and in particular, is independent of X1 and X2 separately), but X1, X2 are not independent of each other.
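The claims of this exercise can be verified by brute force (an illustrative Python sketch, not part of the exercise):

```python
from itertools import product

# V = {x in F_2^3 : x1 + x2 = 0}; (X1,X2,X3) uniform on V.
V = [x for x in product((0, 1), repeat=3) if (x[0] + x[1]) % 2 == 0]

def prob(event):
    # Probability of an event under the uniform distribution on V.
    return sum(1 for x in V if event(x)) / len(V)

# X3 is independent of the pair (X1, X2): joint = product of marginals.
indep_pair = all(
    prob(lambda x: (x[0], x[1]) == (a, b) and x[2] == c)
    == prob(lambda x: (x[0], x[1]) == (a, b)) * prob(lambda x: x[2] == c)
    for a in (0, 1) for b in (0, 1) for c in (0, 1)
)

# X1, X2 are not independent: P(X1=0, X2=1) = 0, yet P(X1=0) P(X2=1) = 1/4.
dependent_12 = (
    prob(lambda x: x[0] == 0 and x[1] == 1)
    != prob(lambda x: x[0] == 0) * prob(lambda x: x[1] == 1)
)

print(indep_pair, dependent_12)  # True True
```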

Exercise 1.1.14. We say that one random variable Y (with values in RY) is determined by another random variable X (with values in RX) if there exists a (deterministic) function f : RX → RY such that Y = f(X) is surely true (i.e., Y(ω) = f(X(ω)) for all ω ∈ Ω). Show that if (Xα)α∈A is a family of jointly independent random variables, and (Yβ)β∈B is a family such that each Yβ is determined by some subfamily (Xα)α∈Aβ of the (Xα)α∈A, with the Aβ disjoint as β varies, then the (Yβ)β∈B are jointly independent also.
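For instance (a hypothetical small case, sketched in Python): take four jointly independent fair bits X1,...,X4, with disjoint index sets A_1 = {1,2} and A_2 = {3,4}, and let Y1 = X1 ⊕ X2 and Y2 = X3 ∧ X4. Enumeration over the product space confirms that Y1 and Y2 are independent.

```python
from itertools import product

# Four jointly independent fair bits: uniform distribution on {0,1}^4.
omega = list(product((0, 1), repeat=4))

def prob(event):
    return sum(1 for w in omega if event(w)) / len(omega)

def y1(w):
    return w[0] ^ w[1]  # Y1 determined by the subfamily (X1, X2)

def y2(w):
    return w[2] & w[3]  # Y2 determined by the disjoint subfamily (X3, X4)

# Independence of Y1 and Y2: joint law = product of marginal laws.
independent = all(
    prob(lambda w: y1(w) == a and y2(w) == b)
    == prob(lambda w: y1(w) == a) * prob(lambda w: y2(w) == b)
    for a in (0, 1) for b in (0, 1)
)
print(independent)  # True
```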

Exercise 1.1.15 (Determinism vs. independence). Let X, Y be random variables. Show that Y is deterministic if and only if it is simultaneously determined by X, and independent of X.

Exercise 1.1.16. Show that a complex random variable X is a complex Gaussian random variable (i.e., its distribution is a complex normal distribution) if and only if its real and imaginary parts Re(X), Im(X) are independent real Gaussian random variables with the same variance. In particular, the variance of Re(X) and Im(X) will be half of the variance of X.
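The variance-splitting claim can be sanity-checked by Monte Carlo (an illustrative Python sketch, assuming a standard complex Gaussian with total variance 1):

```python
import random
import statistics

# Sample X = Re(X) + i Im(X) with Re(X), Im(X) independent N(0, 1/2), so
# that Var(X) = E|X - EX|^2 = Var(Re X) + Var(Im X) = 1, i.e. each part
# carries half the variance of X.  (Assumed unit total variance.)
random.seed(0)
n = 200_000
sigma = 0.5 ** 0.5  # standard deviation of each part: sqrt(1/2)
re = [random.gauss(0.0, sigma) for _ in range(n)]
im = [random.gauss(0.0, sigma) for _ in range(n)]

var_re = statistics.pvariance(re)
var_im = statistics.pvariance(im)
var_x = var_re + var_im  # variance of the complex variable X itself

print(var_re, var_im, var_x)  # each part near 0.5, total near 1.0
```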

One key advantage of working with jointly independent random variables and events is that one can compute various probabilistic quantities quite easily. We give some key examples below.

Exercise 1.1.17. If E1,...,Ek are jointly independent events, show that

(1.28) P(⋀_{i=1}^{k} Ei) = ∏_{i=1}^{k} P(Ei)

and

(1.29) P(⋁_{i=1}^{k} Ei) = 1 − ∏_{i=1}^{k} (1 − P(Ei)).
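Both identities are easy to check numerically (an illustrative Python sketch with made-up probabilities): sum the product measure over all 2^k patterns of which events occur, and compare against the closed forms.

```python
from itertools import product
from math import prod, isclose

# Hypothetical probabilities for k = 3 jointly independent events E1, E2, E3.
p = [0.3, 0.5, 0.2]

def prob(pred):
    # Sum the product measure over all 2^k occurrence patterns.
    total = 0.0
    for pattern in product((False, True), repeat=len(p)):
        weight = prod(pi if occurs else 1 - pi
                      for pi, occurs in zip(p, pattern))
        if pred(pattern):
            total += weight
    return total

all_occur = prob(all)  # P(E1 ∧ E2 ∧ E3)
any_occur = prob(any)  # P(E1 ∨ E2 ∨ E3)

assert isclose(all_occur, prod(p))                       # identity (1.28)
assert isclose(any_occur, 1 - prod(1 - pi for pi in p))  # identity (1.29)
print(all_occur, any_occur)
```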