Students are often surprised when they first hear the following definition: A
stochastic process is a collection of random variables indexed by time. There
seems to be no content here. There is no structure. How can anyone say
anything of value about a stochastic process? The content and structure
are in fact provided by the definitions of the various classes of stochastic
processes that are so important for both theory and applications. There are
processes in discrete or continuous time. There are processes on countable
or general state spaces. There are Markov processes, random walks, Gauss-
ian processes, diffusion processes, martingales, stable processes, infinitely
divisible processes, stationary processes, and many more. There are entire
books written about each of these types of stochastic process.
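To make the opening definition concrete, one can simulate a sample path of a simple symmetric random walk, perhaps the most elementary of the processes just listed. This is only an illustrative sketch (the function name and parameters are not from the text): the finite sequence it produces is exactly a collection of random variables indexed by discrete time.

```python
import random

def random_walk(n_steps, seed=0):
    """Simulate a simple symmetric random walk started at 0:
    X_0 = 0 and X_{k+1} = X_k + 1 or X_k - 1, each with
    probability 1/2. The sequence (X_0, ..., X_n) is a
    stochastic process indexed by discrete time."""
    rng = random.Random(seed)
    path = [0]
    for _ in range(n_steps):
        # Each step is an independent +/-1 increment.
        path.append(path[-1] + rng.choice([-1, 1]))
    return path

# One sample path of the process; rerunning with a different
# seed produces a different realization of the same process.
walk = random_walk(10)
print(walk)
```

The Markov structure is visible in the loop: the distribution of each new value depends only on the current position `path[-1]`, not on the earlier history of the path.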
The purpose of this book is to provide an introduction to a particularly
important class of stochastic processes — continuous time Markov processes.
My intention is that it be used as a text for the second half of a year-long
course on measure-theoretic probability theory. The first half of such a
course typically deals with the classical limit theorems for sums of inde-
pendent random variables (laws of large numbers, central limit theorems,
random infinite series), and with some of the basic discrete time stochastic
processes (martingales, random walks, stationary sequences). Alternatively,
the book can be used in a semester-long special topics course for students
who have completed the basic year-long course. In this case, students will
probably already be familiar with the material in Chapter 1, so the course
would start with Chapter 2.
The present book stresses the new issues that appear in continuous time.
A difference that arises immediately is in the definition of the process. A
discrete time Markov process is defined by specifying the law that leads from