2. The Model
We will end up with a stochastic process on the population space $\left(\mathcal{A}^{\ell}\right)^m$. Since the genetic composition of a population contains all the necessary information to describe its future evolution, our process will be Markovian.
Discrete versus continuous time. We can either build a discrete time Markov chain or a continuous time Markov process. Although the mathematical construction of a discrete time Markov chain is simpler, a continuous time process seems better suited as a model for the evolution of a population: births, deaths and mutations can occur at any time. In addition, the continuous time model is mathematically more appealing. We will build both types of models, in continuous and in discrete time. Continuous time models are conveniently defined by their infinitesimal generators, while discrete time models are defined by their transition matrices (see the appendix). It should be noted, however, that the discrete time and the continuous time processes are linked through a standard stochastization procedure, and they have the same stationary distribution. Therefore the asymptotic results we present here hold in both frameworks.
Infinitesimal generator. The continuous time Moran model is the Markov process $(X_t)_{t\in\mathbb{R}^+}$ having the following infinitesimal generator: for $\phi$ a function from $\left(\mathcal{A}^{\ell}\right)^m$ to $\mathbb{R}$ and for any $x\in\left(\mathcal{A}^{\ell}\right)^m$,
$$\lim_{t\to 0}\,\frac{1}{t}\,\Big(E\big(\phi(X_t)\,\big|\,X_0=x\big)-\phi(x)\Big)
\;=\;\sum_{1\leq i,j\leq m}\;\sum_{u\in\mathcal{A}^{\ell}}
A\big(x(i)\big)\,M\big(x(i),u\big)\,\Big(\phi\big(x(j\leftarrow u)\big)-\phi(x)\Big)\,,$$
where $x(j\leftarrow u)$ denotes the population obtained from $x$ by replacing its $j$-th chromosome $x(j)$ with $u$.
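To make the formula concrete, here is a small Python sketch that evaluates the right-hand side of the generator in a toy setting. It is only an illustration, not part of the formal development: genotypes are encoded as integers $0,\dots,K-1$, and the fitness values A, the mutation matrix M, the population x and the test function phi are arbitrary placeholder values.

import numpy as np

def generator_action(phi, x, A, M):
    """Evaluate sum_{i,j} sum_u A(x(i)) M(x(i),u) (phi(x(j<-u)) - phi(x)).

    phi : function taking a population (1-D integer array) and returning a float
    x   : population, array of m genotype indices
    A   : fitness values, array of length K (K = number of genotypes)
    M   : mutation matrix, K x K array, each row summing to one
    """
    m, K = len(x), len(A)
    phi_x = phi(x)
    total = 0.0
    for i in range(m):
        for j in range(m):
            for u in range(K):
                y = x.copy()
                y[j] = u                      # the population x(j <- u)
                total += A[x[i]] * M[x[i], u] * (phi(y) - phi_x)
    return total

# Toy example: two genotypes, genotype 0 fitter, population of size 3.
A = np.array([2.0, 1.0])
M = np.array([[0.9, 0.1],
              [0.1, 0.9]])
x = np.array([0, 1, 1])
phi = lambda pop: float(np.sum(pop == 0))     # number of fit chromosomes
print(generator_action(phi, x, A, M))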
Transition matrix. The discrete time Moran model is the Markov chain $(X_n)_{n\in\mathbb{N}}$ whose transition matrix is given by
$$\forall n\in\mathbb{N}\quad\forall x\in\left(\mathcal{A}^{\ell}\right)^m\quad\forall j\in\{\,1,\dots,m\,\}\quad\forall u\in\mathcal{A}^{\ell}\setminus\{\,x(j)\,\}$$
$$P\big(X_{n+1}=x(j\leftarrow u)\,\big|\,X_n=x\big)
\;=\;\frac{1}{m^2\lambda}\sum_{1\leq i\leq m}A\big(x(i)\big)\,M\big(x(i),u\big)\,,$$
where $\lambda>0$ is a constant such that
$$\lambda\;\geq\;\max\big\{\,A(u):u\in\mathcal{A}^{\ell}\,\big\}\,.$$
The other off-diagonal coefficients of the transition matrix are zero. The diagonal terms are chosen so that the sum of each row is equal to one. Notice that the continuous time formulation is more concise and elegant: its definition does not require knowledge of the maximum of the fitness function $A$.
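As a check on these formulas, the following sketch computes, for a fixed population $x$, all the one-step transition probabilities of the discrete time chain and verifies that the corresponding row of the transition matrix sums to one as soon as $\lambda\geq\max A$. It reuses the same illustrative encoding and toy values as the previous snippet; the function name transition_row is hypothetical.

import numpy as np

def transition_row(x, A, M, lam):
    """Probabilities P(X_{n+1} = x(j <- u) | X_n = x) of the discrete time chain.

    Returns a dict mapping (j, u) with u != x(j) to its probability, together
    with the diagonal probability of staying at x.  Requires lam >= max(A).
    """
    m, K = len(x), len(A)
    probs = {}
    for j in range(m):
        for u in range(K):
            if u == x[j]:
                continue
            rate = sum(A[x[i]] * M[x[i], u] for i in range(m))
            probs[(j, u)] = rate / (m**2 * lam)
    stay = 1.0 - sum(probs.values())          # diagonal term: the row sums to one
    return probs, stay

A = np.array([2.0, 1.0])
M = np.array([[0.9, 0.1],
              [0.1, 0.9]])
x = np.array([0, 1, 1])
lam = A.max()                                  # any lambda >= max A works
probs, stay = transition_row(x, A, M, lam)
assert stay >= 0                               # guaranteed by lambda >= max A
print(sum(probs.values()) + stay)              # -> 1.0 (up to rounding)

Note that the off-diagonal probabilities produced here are exactly $1/(m^2\lambda)$ times the jump rates appearing in the infinitesimal generator; this is the stochastization link between the two formulations mentioned earlier.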
Loose description of the dynamics. We explain first the discrete time dynamics of the Markov chain $(X_n)_{n\in\mathbb{N}}$. Suppose that $X_n=x$ for some $n\in\mathbb{N}$ and let us describe loosely the transition mechanism to $X_{n+1}=y$. An index $i$ in $\{\,1,\dots,m\,\}$ is selected randomly with uniform probability. With probability $1-A(x(i))/\lambda$, nothing happens and $y=x$. With probability $A(x(i))/\lambda$, the chromosome $x(i)$ enters the replication process and produces an offspring $u$ according to the law $M(x(i),\cdot)$ given by the mutation matrix. Another index $j$ is selected randomly with uniform probability in $\{\,1,\dots,m\,\}$. The population $y$ is obtained by replacing the chromosome $x(j)$ in the population $x$ by the chromosome $u$.
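This mechanism translates directly into a simulation routine. The sketch below, again with illustrative names and toy parameters, performs one step of the chain exactly as just described: it samples $i$ uniformly, lets $x(i)$ reproduce with probability $A(x(i))/\lambda$, draws the offspring from $M(x(i),\cdot)$ and replaces a uniformly chosen $x(j)$. Composing these choices recovers the transition probabilities written above.

import numpy as np

rng = np.random.default_rng()

def moran_step(x, A, M, lam):
    """One step of the discrete time Moran chain, following the loose description."""
    m = len(x)
    i = rng.integers(m)                        # uniform choice of the index i
    if rng.random() >= A[x[i]] / lam:          # with probability 1 - A(x(i))/lambda ...
        return x                               # ... nothing happens, y = x
    u = rng.choice(len(A), p=M[x[i]])          # offspring drawn from the law M(x(i), .)
    j = rng.integers(m)                        # uniform choice of the index j
    y = x.copy()
    y[j] = u                                   # replace x(j) by the offspring u
    return y

# Toy run: iterate the chain and look at the final population.
A = np.array([2.0, 1.0])
M = np.array([[0.9, 0.1],
              [0.1, 0.9]])
lam = A.max()
x = np.array([1, 1, 1])
for _ in range(1000):
    x = moran_step(x, A, M, lam)
print(x)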