1.1. Simple random walk
compute E[|S_n|]. It turns out to be much easier to compute E[S_n^2],
\[
E[S_n^2] = E\Big[\Big(\sum_{j=1}^n X_j\Big)\Big(\sum_{k=1}^n X_k\Big)\Big]
= \sum_{j=1}^n \sum_{k=1}^n E[X_j X_k]
= n + \sum_{j \neq k} E[X_j X_k].
\]
This calculation uses an important property of average values:
E[X + Y ] = E[X] + E[Y ].
The fact that the average of a sum is the sum of the averages, even when
the random variables are dependent, is easy to prove but can
be surprising. For example, if one looks at the rolls of n regular 6-sided dice,
the expected value of the sum is (7/2) n whether one takes one die and uses
that number n times or rolls n different dice and adds the values. In the first
case the sum takes on the six possible values n, 2n, . . . , 6n with probability
1/6 each while in the second case the probability distribution for the sum is
hard to write down explicitly.
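As a sanity check (not part of the text), the dice example can be verified by a short Monte Carlo simulation; the helper names below are invented for illustration. Both schemes have very different distributions, yet both sample means land near 7n/2.

```python
import random

def mean_sum_one_die(n, trials=100_000):
    # Scheme 1: roll one die and use its value n times, so the sum is n * die.
    return sum(n * random.randint(1, 6) for _ in range(trials)) / trials

def mean_sum_n_dice(n, trials=100_000):
    # Scheme 2: roll n independent dice and add the values.
    return sum(sum(random.randint(1, 6) for _ in range(n))
               for _ in range(trials)) / trials

n = 10
# Both estimates should be close to 7n/2 = 35, by linearity of expectation.
print(mean_sum_one_die(n))
print(mean_sum_n_dice(n))
```

Linearity of expectation needs no independence, which is exactly why the two schemes agree in mean despite the first sum taking only the six values n, 2n, ..., 6n.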
If j ≠ k, there are four possibilities for the pair (X_j, X_k); for two
of them X_j X_k = 1 and for two of them X_j X_k = −1. Therefore,
E[X_j X_k] = 0 for j ≠ k and
\[
\mathrm{Var}[S_n] = E[S_n^2] = n.
\]
Here Var denotes the variance of a random variable, defined by
\[
\mathrm{Var}[X] = E\big[(X - E[X])^2\big] = E[X^2] - (E[X])^2
\]
(a simple calculation establishes the second equality). Our calculation
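For completeness, the simple calculation is just expanding the square and applying linearity of expectation:
\[
E\big[(X - E[X])^2\big]
= E\big[X^2 - 2X\,E[X] + (E[X])^2\big]
= E[X^2] - 2(E[X])^2 + (E[X])^2
= E[X^2] - (E[X])^2.
\]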
illustrates an important fact about variances of sums: if X1,...,Xn
are independent, then
\[
\mathrm{Var}[X_1 + \cdots + X_n] = \mathrm{Var}[X_1] + \cdots + \mathrm{Var}[X_n].
\]
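The conclusion Var[S_n] = n can also be checked numerically. The sketch below (function names are invented for illustration) estimates the mean and variance of the walk's endpoint S_n from independent samples; they should be near 0 and n respectively, up to sampling error.

```python
import random

def walk_endpoint(n):
    # S_n = X_1 + ... + X_n, where each step X_j is +1 or -1 with probability 1/2.
    return sum(random.choice((-1, 1)) for _ in range(n))

def sample_mean_var(n, trials=100_000):
    # Estimate E[S_n] and Var[S_n] from independent simulated walks.
    samples = [walk_endpoint(n) for _ in range(trials)]
    mean = sum(samples) / trials
    var = sum((s - mean) ** 2 for s in samples) / trials
    return mean, var

n = 25
mean, var = sample_mean_var(n)
# Expect mean near E[S_n] = 0 and var near Var[S_n] = n = 25.
print(mean, var)
```

Note that each step X_j is itself a walk of length one with Var[X_j] = 1, so the variance-of-sums identity for the n independent steps gives Var[S_n] = n directly.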