1. The Fisher Efficiency

In the regular case, differentiation and integration are interchangeable; hence, differentiating in $\theta$, we get the equation
\begin{align*}
1 + b_n'(\theta)
&= \int_{\mathbb{R}^n} \hat{\theta}_n(x_1, \dots, x_n)\,
   \frac{\partial p(x_1, \dots, x_n, \theta)}{\partial \theta}\, dx_1 \dots dx_n \\
&= \int_{\mathbb{R}^n} \hat{\theta}_n(x_1, \dots, x_n)\,
   \frac{\partial p(x_1, \dots, x_n, \theta)/\partial \theta}{p(x_1, \dots, x_n, \theta)}\,
   p(x_1, \dots, x_n, \theta)\, dx_1 \dots dx_n \\
&= \mathbb{E}_\theta\big[\, \hat{\theta}_n\, L_n'(\theta) \,\big]
 = \mathrm{Cov}_\theta\big( \hat{\theta}_n,\, L_n'(\theta) \big),
\end{align*}
where we use the fact that $\mathbb{E}_\theta\big[ L_n'(\theta) \big] = 0$. The correlation coefficient $\rho_n$ of $\hat{\theta}_n$ and $L_n'(\theta)$ does not exceed $1$ in absolute value, so that
\[
1 \,\ge\, \rho_n^2
= \frac{\big( \mathrm{Cov}_\theta( \hat{\theta}_n,\, L_n'(\theta) ) \big)^2}
       {\mathrm{Var}_\theta\big[ \hat{\theta}_n \big]\,
        \mathrm{Var}_\theta\big[ L_n'(\theta) \big]}
= \frac{\big( 1 + b_n'(\theta) \big)^2}
       {\mathrm{Var}_\theta\big[ \hat{\theta}_n \big]\, I_n(\theta)}.
\]
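The identity $\mathrm{Cov}_\theta(\hat{\theta}_n, L_n'(\theta)) = 1 + b_n'(\theta)$ and the fact that $\mathbb{E}_\theta[L_n'(\theta)] = 0$ can be illustrated by simulation. The following is a minimal sketch, not part of the text: it assumes the Gaussian model $N(\theta, \sigma^2)$ with the unbiased estimator $\hat{\theta}_n = \bar{X}_n$ (so $1 + b_n'(\theta) = 1$) and hypothetical parameter values.

```python
import numpy as np

# Gaussian model N(theta, sigma^2); theta-hat_n is the sample mean (unbiased,
# so 1 + b_n'(theta) = 1).  The parameter values below are assumptions.
rng = np.random.default_rng(0)
theta, sigma, n, reps = 2.0, 1.5, 50, 200_000

X = rng.normal(theta, sigma, size=(reps, n))
theta_hat = X.mean(axis=1)                    # the estimator theta-hat_n
score = (X - theta).sum(axis=1) / sigma**2    # the score L_n'(theta)

print(score.mean())   # E_theta[L_n'(theta)] = 0, so this is near 0
cov = np.mean((theta_hat - theta_hat.mean()) * score)
print(cov)            # Cov_theta(theta-hat_n, L_n'(theta)) = 1 + b_n'(theta) = 1
rho2 = cov**2 / (theta_hat.var() * score.var())
print(rho2)           # rho_n^2 <= 1; equality holds for this estimator
```

Here the score is a linear function of $\bar{X}_n$, so $\rho_n^2 = 1$, matching the efficiency of $\bar{X}_n$ established in Example 1.10.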

1.4. Efficiency of Estimators

An immediate consequence of Theorem 1.8 is the formula for unbiased estimators.

Corollary 1.9. For an unbiased estimator $\hat{\theta}_n$, the Cramér-Rao inequality (1.1) takes the form
\[
(1.2) \qquad \mathrm{Var}_\theta\big[ \hat{\theta}_n \big] \,\ge\, \frac{1}{I_n(\theta)}\,, \quad \theta \in \Theta.
\]

An unbiased estimator $\theta_n^* = \theta_n^*(X_1, \dots, X_n)$ in a regular statistical experiment is called Fisher efficient (or, simply, efficient) if, for any $\theta \in \Theta$, the variance of $\theta_n^*$ reaches the Cramér-Rao lower bound, that is, the equality in (1.2) holds:
\[
\mathrm{Var}_\theta\big[ \theta_n^* \big] = \frac{1}{I_n(\theta)}\,, \quad \theta \in \Theta.
\]
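To see that this equality is a genuine restriction, it helps to contrast an efficient estimator with an inefficient one. The following simulation sketch is an illustration under assumptions: it reuses the Gaussian model of Example 1.1(a) with hypothetical parameter values, and compares the sample median, which is also unbiased for $\theta$ by symmetry, against the sample mean. The bound $1/I_n(\theta) = \sigma^2/n$ used below is the one computed in Example 1.10.

```python
import numpy as np

# Gaussian model N(theta, sigma^2) with sigma^2 known; parameter values are
# assumptions.  The sample median is unbiased by symmetry, but its variance
# stays strictly above the Cramer-Rao bound, while the sample mean attains it.
rng = np.random.default_rng(2)
theta, sigma, n, reps = 2.0, 1.5, 51, 100_000

X = rng.normal(theta, sigma, size=(reps, n))
cr_bound = sigma**2 / n                    # 1/I_n(theta), see Example 1.10

var_mean = X.mean(axis=1).var()            # close to cr_bound
var_median = np.median(X, axis=1).var()    # roughly (pi/2) * cr_bound

print(var_mean / cr_bound, var_median / cr_bound)
```

The asymptotic variance of the median in this model is $\pi\sigma^2/(2n)$, about $57\%$ above the lower bound, so the median is unbiased but not Fisher efficient.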

Example 1.10. Suppose, as in Example 1.1(a), the observations $X_1, \dots, X_n$ are independent $N(\theta, \sigma^2)$ where $\sigma^2$ is assumed known. We show that the sample mean $\bar{X}_n = (X_1 + \dots + X_n)/n$ is an efficient estimator of $\theta$. Indeed, $\bar{X}_n$ is unbiased and $\mathrm{Var}_\theta\big[ \bar{X}_n \big] = \sigma^2/n$. On the other hand,
\[
\ln p(X, \theta) = -\frac{1}{2}\ln\big( 2\pi\sigma^2 \big) - \frac{(X - \theta)^2}{2\sigma^2}
\]
and
\[
l'(X, \theta) = \frac{\partial \ln p(X, \theta)}{\partial \theta} = \frac{X - \theta}{\sigma^2}.
\]
Thus, the Fisher information for the statistical experiment is
\[
I_n(\theta) = n\, \mathbb{E}_\theta\big[ \big( l'(X, \theta) \big)^2 \big]
= \frac{n}{\sigma^4}\, \mathbb{E}_\theta\big[ (X - \theta)^2 \big]
= \frac{n\sigma^2}{\sigma^4} = \frac{n}{\sigma^2}.
\]
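As a numerical sanity check of this computation, a short Monte Carlo sketch (the parameter values are assumptions for illustration) confirms that the empirical variance of $\bar{X}_n$ matches $1/I_n(\theta) = \sigma^2/n$:

```python
import numpy as np

# Monte Carlo check of Example 1.10: in the N(theta, sigma^2) model with
# known sigma^2, Var_theta[Xbar_n] = sigma^2/n = 1/I_n(theta) exactly.
# The values of theta, sigma, and n below are assumptions.
rng = np.random.default_rng(1)
theta, sigma, n, reps = 2.0, 1.5, 50, 200_000

X = rng.normal(theta, sigma, size=(reps, n))
xbar = X.mean(axis=1)

info_n = n / sigma**2        # I_n(theta) = n / sigma^2
var_xbar = xbar.var()        # empirical Var_theta[Xbar_n]

print(var_xbar, 1 / info_n)  # both close to sigma^2/n = 0.045
```

The agreement of the two printed numbers is exactly the Fisher efficiency of the sample mean: its variance attains the Cramér-Rao lower bound for every $\theta$.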