1.1. Statistical Experiment
If the function $L_n$ is differentiable at its attainable maximum, then $\theta_n^*$ is a solution of the equation
$$\frac{\partial L_n(\theta)}{\partial \theta} \,=\, 0.$$
Note that if the maximum is not unique, this equation has multiple solutions.
The function
$$b_n(\theta) \,=\, b_n\big(\theta, \hat{\theta}_n\big) \,=\, \mathbb{E}_\theta\big[\,\hat{\theta}_n\,\big] - \theta \,=\, \mathbb{E}_\theta\big[\,\hat{\theta}_n(X_1,\dots,X_n)\,\big] - \theta$$
is called the bias of $\hat{\theta}_n$. An estimator $\hat{\theta}_n(X_1,\dots,X_n)$ is called an unbiased estimator of $\theta$ if its bias equals zero, or equivalently, $\mathbb{E}_\theta\big[\,\hat{\theta}_n\,\big] = \theta$ for all $\theta \in \Theta$.
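The bias of an estimator can be approximated numerically by simulation. The following is a minimal sketch, assuming `numpy`; the Poisson family and the sample-mean estimator are illustrative choices, not part of the definition above.

```python
import numpy as np

def empirical_bias(estimator, theta, n, reps=100_000, seed=None):
    """Monte Carlo approximation of b_n(theta) = E_theta[estimator] - theta
    for samples of size n drawn from Poisson(theta) (illustrative choice)."""
    rng = np.random.default_rng(seed)
    samples = rng.poisson(theta, size=(reps, n))
    # Apply the estimator to each simulated sample of size n.
    estimates = np.apply_along_axis(estimator, 1, samples)
    return estimates.mean() - theta

# The sample mean is unbiased for the Poisson mean, so the
# empirical bias should be close to zero.
bias = empirical_bias(np.mean, theta=3.0, n=10, seed=0)
```

With 100,000 replications the Monte Carlo error is small, so `bias` lands near zero for any unbiased estimator plugged in.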
Example 1.3. Assume that the underlying distribution of the random sample $X_1,\dots,X_n$ is Poisson with mean $\theta$. The probability mass function is given by
$$p(x, \theta) \,=\, \frac{\theta^x}{x!}\, e^{-\theta}, \quad \theta > 0, \ \ x \in \{0, 1, 2, \dots\}.$$
Then the log-likelihood function has the form
$$L_n(\theta) \,=\, \sum_{i=1}^{n} X_i \ln \theta \,-\, n\theta \,-\, \sum_{i=1}^{n} \ln (X_i!).$$
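Differentiating $L_n$ term by term makes the likelihood equation explicit:

```latex
\frac{\partial L_n(\theta)}{\partial \theta}
  \,=\, \frac{1}{\theta}\sum_{i=1}^{n} X_i \,-\, n \,=\, 0
  \quad\Longrightarrow\quad
  \theta \,=\, \frac{1}{n}\sum_{i=1}^{n} X_i .
```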
Setting the derivative equal to zero yields the solution $\theta_n^* = \bar{X}_n$, where
$$\bar{X}_n \,=\, (X_1 + \cdots + X_n)/n$$
denotes the sample mean. In this example, the MLE is unbiased since
$$\mathbb{E}_\theta\big[\,\theta_n^*\,\big] \,=\, \mathbb{E}_\theta\big[\,\bar{X}_n\,\big] \,=\, \mathbb{E}_\theta\big[\,X_1\,\big] \,=\, \theta.$$
Nonetheless, we should not take the unbiasedness of the MLE for granted. Even
for common densities, its expected value may not exist. Consider the next
example.
Example 1.4. For the exponential distribution with the density
$$p(x, \theta) \,=\, \theta \exp(-\theta x), \quad x \ge 0, \ \ \theta > 0,$$
the MLE $\theta_n^* = 1/\bar{X}_n$ has the expected value $\mathbb{E}_\theta\big[\,\theta_n^*\,\big] = n\theta/(n-1)$ (see
Exercise 1.6). In particular, for $n = 1$, the expectation does not exist since
$$\int_0^\infty x^{-1}\,\theta \exp(-\theta x)\, dx \,=\, \infty.$$
In this example, however, an unbiased estimator may be found for $n > 1$.
Indeed, the estimator $(n-1)\,\theta_n^*/n$ is unbiased. As the next example shows,
an unbiased estimator may not exist at all.
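A quick simulation illustrates the behavior in Example 1.4; this is a sketch with hypothetical parameter values ($\theta = 2$, $n = 5$), assuming `numpy` (note that `numpy` parameterizes the exponential by the scale $1/\theta$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.0, 5, 200_000

# Exponential(theta) samples; numpy's scale parameter is 1/theta.
samples = rng.exponential(scale=1.0 / theta, size=(reps, n))

mle = 1.0 / samples.mean(axis=1)       # theta_n^* = 1 / X-bar_n
corrected = (n - 1) * mle / n          # the unbiased estimator (n-1) theta_n^* / n

print(mle.mean())        # close to n * theta / (n - 1) = 2.5
print(corrected.mean())  # close to theta = 2.0
```

The empirical mean of `mle` sits near $n\theta/(n-1)$ rather than $\theta$, while the corrected estimator averages close to $\theta$ itself.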