1. The Fisher Efficiency
Example 1.2. Suppose $n$ independent observations $X_1, \dots, X_n$ come from a distribution with density
$$p(x, \theta) = p_0(x - \theta), \quad -\infty < x, \, \theta < \infty,$$
where $p_0$ is a fixed probability density function. Here $\theta$ determines the shift of the distribution, and therefore is termed the location parameter. The location parameter model can be written as $X_i = \theta + \varepsilon_i$, $i = 1, \dots, n$, where $\varepsilon_1, \dots, \varepsilon_n$ are independent random variables with a given density $p_0$, and $\theta \in \Theta = \mathbb{R}$.
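To make the model concrete, here is a minimal simulation sketch; the standard normal choice of $p_0$, the sample size, and the true shift are illustrative assumptions, not part of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

n, theta = 100, 2.5            # assumed sample size and true location
eps = rng.standard_normal(n)   # eps_i drawn from p_0, here assumed standard normal
X = theta + eps                # location model: X_i = theta + eps_i
```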
The independence of the observations implies that the joint density of the $X_i$'s equals
$$p(x_1, \dots, x_n, \theta) = \prod_{i=1}^{n} p(x_i, \theta).$$
We denote the respective expectation by $\mathbb{E}_\theta[\,\cdot\,]$ and variance by $\operatorname{Var}_\theta[\,\cdot\,]$.
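As an illustration, the joint density can be evaluated directly as this product; the normal $p_0$ below is again an assumed choice for the sketch.

```python
import numpy as np
from scipy.stats import norm

def joint_density(x, theta):
    """p(x_1, ..., x_n, theta) = prod_i p_0(x_i - theta), with p_0 assumed standard normal."""
    return np.prod(norm.pdf(np.asarray(x) - theta))
```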
In a statistical experiment, all observations are obtained under the same value of an unknown parameter $\theta$. The goal of parametric statistical estimation is to assess the true value of $\theta$ from the observations $X_1, \dots, X_n$.
An arbitrary function of the observations, denoted by
$$\hat{\theta} = \hat{\theta}_n = \hat{\theta}_n(X_1, \dots, X_n),$$
is called an estimator (or a point estimator) of $\theta$.
A random variable
$$l(X_i, \theta) = \ln p(X_i, \theta)$$
is referred to as a log-likelihood function related to the observation $X_i$.
The joint log-likelihood function of a sample of size $n$ (or, simply, the log-likelihood function) is the sum
$$L_n(\theta) = L_n(\theta \,|\, X_1, \dots, X_n) = \sum_{i=1}^{n} l(X_i, \theta) = \sum_{i=1}^{n} \ln p(X_i, \theta).$$
In the above notation, we emphasize the dependence of the log-likelihood function on the parameter $\theta$, keeping in mind that it is actually a random function that depends on the entire set of observations $X_1, \dots, X_n$.
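In code, the log-likelihood is just this sum of logarithms; summing logs rather than taking the logarithm of the product above also avoids numerical underflow for large $n$. A minimal sketch, again assuming a standard normal $p_0$:

```python
import numpy as np
from scipy.stats import norm

def log_likelihood(theta, x):
    """L_n(theta) = sum_i ln p(x_i, theta); p_0 is assumed standard normal."""
    return np.sum(norm.logpdf(np.asarray(x) - theta))
```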
The parameter $\theta$ may be evaluated by the method of maximum likelihood estimation. An estimator $\theta_n^*$ is called the maximum likelihood estimator (MLE) if for any $\theta \in \Theta$ the following inequality holds:
$$L_n(\theta_n^*) \geq L_n(\theta).$$
If the log-likelihood function attains its unique maximum, then the MLE reduces to
$$\theta_n^* = \operatorname*{argmax}_{\theta \in \Theta} L_n(\theta).$$
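When the maximizer has no closed form, the argmax can be searched numerically. Below is a self-contained sketch that applies a generic scalar minimizer to the negative log-likelihood; for the assumed normal $p_0$ the MLE coincides with the sample mean, which provides a check.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

rng = np.random.default_rng(0)
X = 2.5 + rng.standard_normal(100)  # sample from the location model above (assumed setup)

# Maximize L_n(theta) by minimizing its negative over theta.
res = minimize_scalar(lambda t: -np.sum(norm.logpdf(X - t)))
print(res.x, X.mean())  # for normal p_0 the MLE equals the sample mean
```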