From $\frac{\partial F}{\partial x} = tF$ we obtain
(1.4) $\qquad H_n'(x) = H_{n-1}(x),$
for any $n \ge 1$. Also, from $\frac{\partial F}{\partial t} = (x - t)F$ we get the following recursive formula
(1.5) $\qquad (n+1)H_{n+1}(x) = xH_n(x) - H_{n-1}(x),$
for any $n \ge 1$. The first Hermite polynomials are $H_0(x) = 1$, $H_1(x) = x$ and $H_2(x) = \frac{1}{2}(x^2 - 1)$. From the expansion of $F(0, t) = \exp(-\frac{t^2}{2})$ in powers of $t$, we get $H_n(0) = 0$ if $n$ is odd and $H_{2k}(0) = \frac{(-1)^k}{2^k k!}$ for all $k \ge 1$.
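For instance, applying the recursion (1.5) with $n = 2$ and $n = 3$ to the polynomials above gives the next two terms,
$$H_3(x) = \frac{1}{6}(x^3 - 3x), \qquad H_4(x) = \frac{1}{24}(x^4 - 6x^2 + 3),$$
which are consistent with (1.4), since $H_4'(x) = \frac{1}{6}(x^3 - 3x) = H_3(x)$, and with the values at the origin, since $H_3(0) = 0$ and $H_4(0) = \frac{3}{24} = \frac{(-1)^2}{2^2\,2!}$.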
The relationship between Hermite polynomials and Gaussian random variables
is explained by the following result.
Lemma 1.3. Let $X$, $Y$ be two random variables with joint Gaussian distribution such that $E(X) = E(Y) = 0$ and $E(X^2) = E(Y^2) = 1$. Then for all $n, m \ge 0$ we have
$$E(H_n(X)H_m(Y)) = \begin{cases} 0 & \text{if } n \ne m, \\[2pt] \dfrac{1}{n!}\,(E(XY))^n & \text{if } n = m. \end{cases}$$
Proof. For all $s, t \in \mathbb{R}$ we have
$$E\left[\exp\!\left(sX - \frac{s^2}{2}\right)\exp\!\left(tY - \frac{t^2}{2}\right)\right] = \exp\bigl(st\,E(XY)\bigr),$$
since $sX + tY$ is a centered Gaussian variable with variance $s^2 + 2st\,E(XY) + t^2$, and hence $E(\exp(sX + tY)) = \exp\bigl(\frac{1}{2}(s^2 + 2st\,E(XY) + t^2)\bigr)$. Expanding both sides of this equality in power series of the variables $s$ and $t$ using (1.3), and identifying the coefficients of $s^n t^m$, yields
$$E(H_n(X)H_m(Y)) = \begin{cases} 0 & \text{if } n \ne m, \\[2pt] \dfrac{1}{n!}\,(E(XY))^n & \text{if } n = m. \end{cases}$$
This completes the proof.
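As an illustration of the lemma, take $n = m = 2$. Then $E(H_2(X)H_2(Y)) = \frac{1}{4}E\bigl((X^2 - 1)(Y^2 - 1)\bigr) = \frac{1}{4}\bigl(E(X^2Y^2) - 1\bigr)$, and since $E(X^2Y^2) = 1 + 2(E(XY))^2$ for centered jointly Gaussian variables with unit variance, this equals $\frac{1}{2!}(E(XY))^2$, as the lemma asserts.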
Suppose that $H$ is infinite-dimensional and let $\{e_i, i \ge 1\}$ be an orthonormal basis of $H$. We will denote by $\Lambda$ the set of all sequences $a = (a_1, a_2, \dots)$, $a_i \in \mathbb{N}$, such that all the terms, except a finite number of them, vanish. For $a \in \Lambda$ we set $a! = \prod_{i=1}^{\infty} a_i!$ and $|a| = \sum_{i=1}^{\infty} a_i$. For any multiindex $a \in \Lambda$ we define
(1.6) $\qquad \Phi_a = \sqrt{a!}\,\prod_{i=1}^{\infty} H_{a_i}(W(e_i)).$
The above product is well defined because $H_0(x) = 1$ and $a_i \ne 0$ only for a finite number of indices.
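For instance, if $a = (2, 1, 0, 0, \dots)$ then $a! = 2$, $|a| = 3$ and
$$\Phi_a = \sqrt{2}\,H_2(W(e_1))\,H_1(W(e_2)) = \frac{1}{\sqrt{2}}\bigl(W(e_1)^2 - 1\bigr)W(e_2),$$
all the remaining factors in the product being equal to $H_0(W(e_i)) = 1$.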
Proposition 1.4. The family of random variables $\{\Phi_a, a \in \Lambda\}$ is a complete orthonormal system in $L^2(\Omega, \mathcal{F}, P)$.
Proof. For any $a, b \in \Lambda$ we have, using the independence of the random variables $\{W(e_i), i \ge 1\}$ and Lemma 1.3,
$$E\left(\prod_{i=1}^{\infty} H_{a_i}(W(e_i))\,H_{b_i}(W(e_i))\right) = \prod_{i=1}^{\infty} E\bigl(H_{a_i}(W(e_i))\,H_{b_i}(W(e_i))\bigr) = \begin{cases} \dfrac{1}{a!} & \text{if } a = b, \\[2pt] 0 & \text{if } a \ne b. \end{cases}$$
Taking into account the normalizing factor $\sqrt{a!}$ in (1.6), this implies that the random variables $\{\Phi_a, a \in \Lambda\}$ are orthonormal. On the other hand, suppose that $F$ is an element in $L^2(\Omega, \mathcal{F}, P)$ such that $E(F\Phi_a) = 0$ for all