1.1. Statistical Experiment

If the function $L_n$ is differentiable at its attainable maximum, then $\theta_n^*$ is a solution of the equation
$$\frac{\partial L_n(\theta)}{\partial \theta} = 0.$$
Note that if the maximum is not unique, this equation has multiple solutions.

The function
$$b_n(\theta) = b_n(\theta, \hat{\theta}_n) = \mathbb{E}_\theta\big[\hat{\theta}_n\big] - \theta = \mathbb{E}_\theta\big[\hat{\theta}_n(X_1, \dots, X_n)\big] - \theta$$
is called the bias of $\hat{\theta}_n$. An estimator $\hat{\theta}_n(X_1, \dots, X_n)$ is called an unbiased estimator of $\theta$ if its bias equals zero or, equivalently, $\mathbb{E}_\theta\big[\hat{\theta}_n\big] = \theta$ for all $\theta \in \Theta$.

Example 1.3. Assume that the underlying distribution of the random sample $X_1, \dots, X_n$ is Poisson with mean $\theta$. The probability mass function is given by
$$p(x, \theta) = \frac{\theta^x}{x!}\, e^{-\theta}, \quad \theta > 0, \ x \in \{0, 1, 2, \dots\}.$$
Then the log-likelihood function has the form
$$L_n(\theta) = \sum_{i=1}^n X_i \ln\theta \,-\, n\theta \,-\, \sum_{i=1}^n \ln(X_i!).$$
Setting the derivative equal to zero yields the solution $\theta_n^* = \bar{X}_n$, where $\bar{X}_n = (X_1 + \cdots + X_n)/n$ denotes the sample mean. In this example, the MLE is unbiased since $\mathbb{E}_\theta\big[\theta_n^*\big] = \mathbb{E}_\theta\big[\bar{X}_n\big] = \mathbb{E}_\theta[X_1] = \theta$.

Nonetheless, we should not take the unbiasedness of the MLE for granted. Even for common densities, its expected value may not exist. Consider the next example.

Example 1.4. For the exponential distribution with the density
$$p(x, \theta) = \theta \exp(-\theta x), \quad x \ge 0, \ \theta > 0,$$
the MLE $\theta_n^* = 1/\bar{X}_n$ has the expected value $\mathbb{E}_\theta\big[\theta_n^*\big] = n\theta/(n-1)$ (see Exercise 1.6). In particular, for $n = 1$, the expectation does not exist since
$$\int_0^\infty x^{-1}\, \theta \exp(-\theta x)\, dx = \infty.$$
In this example, however, an unbiased estimator can be found for $n > 1$. Indeed, the estimator $(n-1)\theta_n^*/n$ is unbiased.

As the next example shows, an unbiased estimator may not exist at all.
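As a quick numerical check of Example 1.3, the sketch below evaluates the Poisson log-likelihood and its derivative (the score) on a small made-up sample; the data values are purely illustrative and not from the text. The score vanishes at the sample mean, and the log-likelihood is larger there than at nearby trial values, consistent with $\theta_n^* = \bar{X}_n$.

```python
import math

# Illustrative Poisson observations (hypothetical values).
data = [3, 1, 4, 1, 5]
n = len(data)

def log_likelihood(theta):
    # L_n(theta) = sum_i X_i ln(theta) - n*theta - sum_i ln(X_i!)
    # lgamma(x + 1) = ln(x!) for nonnegative integers x.
    return (sum(data) * math.log(theta) - n * theta
            - sum(math.lgamma(x + 1) for x in data))

def score(theta):
    # dL_n/dtheta = sum_i X_i / theta - n
    return sum(data) / theta - n

mle = sum(data) / n  # sample mean, the claimed MLE
print(score(mle))                     # approximately 0 at the sample mean
print(all(log_likelihood(mle) >= log_likelihood(t)
          for t in [0.5, 1.0, 2.0, 3.5, 5.0]))
```

By concavity of $L_n$ in $\theta$, the unique stationary point is the global maximizer, so checking a few trial values suffices here.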
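The bias factor $n/(n-1)$ in Example 1.4 can be seen empirically. The Monte Carlo sketch below (with arbitrarily chosen $\theta = 2$, $n = 5$, and replication count) averages the MLE $1/\bar{X}_n$ over many simulated samples and compares it with the debiased estimator $(n-1)\theta_n^*/n$.

```python
import random

random.seed(0)
theta, n, reps = 2.0, 5, 200_000  # illustrative parameter choices

total_mle = 0.0
total_unbiased = 0.0
for _ in range(reps):
    # expovariate(theta) draws from density theta * exp(-theta * x), x >= 0.
    xbar = sum(random.expovariate(theta) for _ in range(n)) / n
    mle = 1.0 / xbar                     # theta*_n = 1 / X_bar
    total_mle += mle
    total_unbiased += (n - 1) * mle / n  # bias-corrected estimator

print(total_mle / reps)       # close to n*theta/(n-1) = 2.5, not theta = 2
print(total_unbiased / reps)  # close to theta = 2
```

The simulated mean of the MLE sits near $n\theta/(n-1) = 2.5$ rather than $\theta = 2$, while the corrected estimator recovers $\theta$, matching the claim that $(n-1)\theta_n^*/n$ is unbiased for $n > 1$.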