In the regular case, the differentiation and integration are interchangeable, hence, differentiating in $\theta$, we get the equation

\[
1 + b_n'(\theta) \,=\, \int_{\mathbb{R}^n} \hat{\theta}_n(x_1,\dots,x_n)\, \frac{\partial p(x_1,\dots,x_n,\theta)}{\partial \theta}\; dx_1 \dots dx_n
\]
\[
=\, \int_{\mathbb{R}^n} \hat{\theta}_n(x_1,\dots,x_n)\, \frac{\partial p(x_1,\dots,x_n,\theta)/\partial \theta}{p(x_1,\dots,x_n,\theta)}\; p(x_1,\dots,x_n,\theta)\; dx_1 \dots dx_n
\]
\[
=\, \mathbb{E}_\theta\big[\,\hat{\theta}_n\, L_n'(\theta)\,\big] \,=\, \mathrm{Cov}_\theta\big(\hat{\theta}_n,\, L_n'(\theta)\big),
\]

where we use the fact that $\mathbb{E}_\theta[L_n'(\theta)] = 0$. The correlation coefficient $\rho_n$ of $\hat{\theta}_n$ and $L_n'(\theta)$ does not exceed 1 in absolute value, so that

\[
1 \,\geq\, \rho_n^2 \,=\, \frac{\big(\mathrm{Cov}_\theta(\hat{\theta}_n,\, L_n'(\theta))\big)^2}{\mathrm{Var}_\theta[\hat{\theta}_n]\, \mathrm{Var}_\theta[L_n'(\theta)]} \,=\, \frac{\big(1 + b_n'(\theta)\big)^2}{\mathrm{Var}_\theta[\hat{\theta}_n]\, I_n(\theta)}.
\]

1.4. Efficiency of Estimators

An immediate consequence of Theorem 1.8 is the formula for unbiased estimators.

Corollary 1.9. For an unbiased estimator $\hat{\theta}_n$, the Cramér-Rao inequality (1.1) takes the form

\[
\mathrm{Var}_\theta\big[\hat{\theta}_n\big] \,\geq\, \frac{1}{I_n(\theta)}, \qquad \theta \in \Theta. \tag{1.2}
\]

An unbiased estimator $\theta_n^* = \theta_n^*(X_1,\dots,X_n)$ in a regular statistical experiment is called Fisher efficient (or, simply, efficient) if, for any $\theta \in \Theta$, the variance of $\theta_n^*$ reaches the Cramér-Rao lower bound, that is, the equality in (1.2) holds:

\[
\mathrm{Var}_\theta\big[\theta_n^*\big] \,=\, \frac{1}{I_n(\theta)}, \qquad \theta \in \Theta.
\]

Example 1.10. Suppose, as in Example 1.1(a), the observations $X_1,\dots,X_n$ are independent $\mathcal{N}(\theta,\sigma^2)$ where $\sigma^2$ is assumed known. We show that the sample mean $\bar{X}_n = (X_1 + \dots + X_n)/n$ is an efficient estimator of $\theta$. Indeed, $\bar{X}_n$ is unbiased and $\mathrm{Var}_\theta[\bar{X}_n] = \sigma^2/n$. On the other hand,

\[
\ln p(X,\theta) \,=\, -\frac{1}{2}\,\ln(2\pi\sigma^2) \,-\, \frac{(X-\theta)^2}{2\sigma^2}
\]

and

\[
l'(X,\theta) \,=\, \frac{\partial \ln p(X,\theta)}{\partial \theta} \,=\, \frac{X-\theta}{\sigma^2}.
\]

Thus, the Fisher information for the statistical experiment is

\[
I_n(\theta) \,=\, n\, \mathbb{E}_\theta\big[\big(l'(X,\theta)\big)^2\big] \,=\, \frac{n}{\sigma^4}\, \mathbb{E}_\theta\big[(X-\theta)^2\big] \,=\, \frac{n\sigma^2}{\sigma^4} \,=\, \frac{n}{\sigma^2}.
\]
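The efficiency claim of Example 1.10 is easy to check numerically. The following Python sketch (not part of the original text; the values $\theta = 2$, $\sigma = 1.5$, $n = 50$ and the replication count are arbitrary illustration choices) estimates $\mathrm{Var}_\theta[\bar{X}_n]$ by Monte Carlo and compares it with the Cramér-Rao bound $1/I_n(\theta) = \sigma^2/n$.

```python
import numpy as np

# Monte Carlo check of Example 1.10: for X_1, ..., X_n i.i.d. N(theta, sigma^2),
# the sample mean is unbiased and its variance equals the Cramer-Rao bound
# 1 / I_n(theta) = sigma^2 / n.  Parameter values below are illustrative only.
rng = np.random.default_rng(0)

theta, sigma, n = 2.0, 1.5, 50      # true parameter, known std. deviation, sample size
n_rep = 200_000                     # number of simulated samples

samples = rng.normal(theta, sigma, size=(n_rep, n))
means = samples.mean(axis=1)        # sample mean of each simulated sample

print("empirical variance of the sample mean:", means.var())
print("Cramer-Rao bound sigma^2 / n:         ", sigma**2 / n)
```

For large `n_rep` the two printed values agree to within Monte Carlo error, consistent with the equality $\mathrm{Var}_\theta[\bar{X}_n] = \sigma^2/n = 1/I_n(\theta)$ derived above.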