Therefore, for any value of $\theta$, the variance of $\bar{X}$ achieves the Cramér–Rao lower bound $1/I_n(\theta) = \sigma^2/n$.
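As an illustrative numerical sketch (not part of the text), one can simulate Gaussian samples and compare the empirical variance of the sample mean against the bound $\sigma^2/n$; the parameter values below are arbitrary choices:

```python
import numpy as np

# Illustrative Monte Carlo check: for X_1, ..., X_n ~ N(theta, sigma^2),
# the variance of the sample mean X_bar equals the Cramer-Rao bound sigma^2/n.
# theta, sigma, n and the number of replications are arbitrary choices.
rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.5, 50, 100_000

means = rng.normal(theta, sigma, size=(reps, n)).mean(axis=1)

empirical_var = means.var()
cr_bound = sigma ** 2 / n
print(empirical_var, cr_bound)  # the two numbers should be close
```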
The concept of the Fisher efficiency seems to be nice and powerful. Indeed, besides being unbiased, an efficient estimator has the minimum possible variance uniformly in $\theta \in \Theta$. Another feature is that it applies to any sample size $n$. Unfortunately, this concept is extremely restrictive. It works only in a limited number of models. The main pitfalls of the Fisher efficiency are discussed in the next chapter.
Exercises
Exercise 1.1. Show that the Fisher information can be computed by the formula
$$
I_n(\theta) = -\, n \, \mathbb{E}_\theta \!\left[ \frac{\partial^2 \ln p(X, \theta)}{\partial \theta^2} \right].
$$
Hint: Make use of the representation (show!)
$$
\left( \frac{\partial \ln p(x, \theta)}{\partial \theta} \right)^{\!2} p(x, \theta)
= \frac{\partial^2 p(x, \theta)}{\partial \theta^2}
- \frac{\partial^2 \ln p(x, \theta)}{\partial \theta^2} \, p(x, \theta).
$$
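For reference, this representation follows from differentiating the identity $\partial p / \partial \theta = p \, \partial \ln p / \partial \theta$ once more in $\theta$:
$$
\frac{\partial^2 p}{\partial \theta^2}
= \frac{\partial}{\partial \theta}\left( p \, \frac{\partial \ln p}{\partial \theta} \right)
= p \left( \frac{\partial \ln p}{\partial \theta} \right)^{\!2}
+ p \, \frac{\partial^2 \ln p}{\partial \theta^2},
$$
and rearranging the terms gives the displayed identity.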
Exercise 1.2. Let $X_1, \dots, X_n$ be independent observations with the $\mathcal{N}(\mu, \theta)$ distribution, where $\mu$ has a known value (refer to Example 1.1(b)). Prove that
$$
\theta_n^* = \frac{1}{n} \sum_{i=1}^{n} (X_i - \mu)^2
$$
is an efficient estimator of $\theta$. Hint: Use Exercise 1.1 to show that $I_n(\theta) = n/(2\theta^2)$.
When computing the variance of $\theta_n^*$, first notice that the variable $\sum_{i=1}^{n} (X_i - \mu)^2 / \theta$ has a chi-squared distribution with $n$ degrees of freedom, and, thus, its variance equals $2n$.
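The claim of Exercise 1.2 can also be checked numerically. The following sketch (with arbitrary values of $\mu$, $\theta$, and $n$) simulates $\theta_n^*$ and compares its mean with $\theta$ and its variance with $1/I_n(\theta) = 2\theta^2/n$:

```python
import numpy as np

# Monte Carlo sketch: theta_n^* = (1/n) sum (X_i - mu)^2 for X_i ~ N(mu, theta).
# Its mean should be near theta (unbiasedness) and its variance near the
# Cramer-Rao bound 2 * theta^2 / n. Parameter values are arbitrary.
rng = np.random.default_rng(1)
mu, theta, n, reps = 1.0, 0.5, 40, 100_000

samples = rng.normal(mu, np.sqrt(theta), size=(reps, n))
theta_star = ((samples - mu) ** 2).mean(axis=1)

print(theta_star.mean())   # close to theta
print(theta_star.var())    # close to 2 * theta**2 / n
```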
Exercise 1.3. Suppose that independent observations $X_1, \dots, X_n$ have a Bernoulli distribution with the probability mass function
$$
p(x, \theta) = \theta^x (1 - \theta)^{1 - x}, \quad x \in \{0, 1\}, \ 0 \le \theta \le 1.
$$
Show that the Fisher information is of the form
$$
I_n(\theta) = \frac{n}{\theta (1 - \theta)},
$$
and verify that the estimator $\theta_n^* = \bar{X}$ is efficient.
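The Fisher information formula can be checked by a short computation via Exercise 1.1: for the Bernoulli mass function, $\partial^2 \ln p / \partial \theta^2 = -x/\theta^2 - (1-x)/(1-\theta)^2$, and averaging over $x \in \{0, 1\}$ recovers $n/(\theta(1-\theta))$. A minimal sketch, with arbitrary example values of $\theta$ and $n$:

```python
# Check I_n(theta) = n / (theta * (1 - theta)) for the Bernoulli model by
# evaluating -n * E_theta[ d^2/dtheta^2 ln p(X, theta) ] over the outcomes 0, 1.
# The values theta = 0.3 and n = 25 are arbitrary.
theta, n = 0.3, 25

# -d^2/dtheta^2 ln p(x, theta) = x/theta^2 + (1 - x)/(1 - theta)^2,
# weighted by the probability p(x, theta) = theta^x * (1 - theta)^(1 - x)
info_per_obs = sum(
    (x / theta**2 + (1 - x) / (1 - theta) ** 2) * theta**x * (1 - theta) ** (1 - x)
    for x in (0, 1)
)
print(n * info_per_obs)           # Fisher information via Exercise 1.1
print(n / (theta * (1 - theta)))  # closed form; the two values agree
```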