Introduction
Background
Linear algebra plays a key role in the theory of dynamical systems, and concepts from dynamical systems allow the study, characterization and generalization of many objects in linear algebra, such as similarity of matrices, eigenvalues, and (generalized) eigenspaces. The most basic form of this interplay is that a square matrix $A$ gives rise to a discrete-time dynamical system $x_{k+1} = Ax_k$, $k = 0, 1, 2, \ldots$, and to a continuous-time dynamical system via the linear ordinary differential equation $\dot{x} = Ax$.
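To make this concrete, here is a minimal Python sketch (added here, not part of the text) that realizes both systems for an arbitrarily chosen matrix $A$ and initial value $x_0$, using NumPy for the iteration and SciPy's matrix exponential for the continuous-time flow.

```python
import numpy as np
from scipy.linalg import expm

# Illustrative 2x2 matrix and initial condition (chosen arbitrarily)
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
x0 = np.array([1.0, 0.0])

# Discrete-time system: x_{k+1} = A x_k
x = x0
for k in range(5):
    x = A @ x
print("x_5  =", x)

# Continuous-time system: the solution of xdot = A x is x(t) = exp(At) x0
t = 1.0
print("x(1) =", expm(A * t) @ x0)
```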
The (real) Jordan form of the matrix $A$ allows us to write the solution of the differential equation $\dot{x} = Ax$ explicitly in terms of the matrix exponential, and hence the properties of the solutions are intimately related to the properties of the matrix $A$. Vice versa, one can consider properties of a linear flow in $\mathbb{R}^d$ and infer characteristics of the underlying matrix $A$.
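Written out (a standard formula, added here for reference), with $A = PJP^{-1}$ a (real) Jordan decomposition and initial value $x_0$, the solution is
\[
  x(t) \;=\; e^{At}x_0 \;=\; P\,e^{Jt}P^{-1}x_0,
  \qquad\text{where}\qquad
  e^{At} \;=\; \sum_{k=0}^{\infty} \frac{t^k}{k!}\,A^k ,
\]
so each component of $x(t)$ is a finite linear combination of terms $t^m e^{\mu t}$, $t^m e^{\mu t}\cos(\nu t)$, and $t^m e^{\mu t}\sin(\nu t)$, where $\mu$ (resp. $\mu \pm i\nu$) ranges over the real (resp. complex) eigenvalues of $A$ and $m$ is bounded by the sizes of the corresponding Jordan blocks.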
Going one step further, matrices also define (nonlinear) systems on smooth manifolds, such as the sphere $\mathbb{S}^{d-1}$ in $\mathbb{R}^d$, the Grassmannian manifolds, the flag manifolds, or on classical (matrix) Lie groups. Again, the behavior of such systems is closely related to matrices and their properties.
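As one concrete instance (an illustrative sketch added here, not taken from the text), an invertible matrix $A$ induces a system on the unit sphere $\mathbb{S}^{d-1}$ by normalizing after each application of $A$; the matrix below is chosen arbitrarily.

```python
import numpy as np

# Induced system on the unit sphere S^{d-1}:
#   s_{k+1} = A s_k / ||A s_k||   (A invertible, so A s_k != 0 for s_k != 0)
A = np.array([[2.0, 1.0],
              [0.0, 0.5]])          # illustrative invertible matrix
s = np.array([0.0, 1.0])            # initial point on the unit circle S^1

for k in range(20):
    s = A @ s
    s = s / np.linalg.norm(s)       # project back onto the sphere

print(s)  # for this A, the iterates approach the dominant eigendirection (1, 0)
```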
Since A.M. Lyapunov's thesis [97] in 1892 it has been an intriguing problem how to construct an appropriate linear algebra for time-varying systems. Note that, e.g., for stability of the solutions of $\dot{x} = A(t)x$ it is not sufficient that for all $t \in \mathbb{R}$ the matrices $A(t)$ have only eigenvalues with negative real part (see, e.g., Hahn [61], Chapter 62).
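One frequently cited example of this phenomenon (added here for illustration; the specific matrix is not taken from the text) is the periodic family
\[
  A(t) \;=\;
  \begin{pmatrix}
    -1 + \tfrac{3}{2}\cos^2 t & 1 - \tfrac{3}{2}\sin t\cos t \\[2pt]
    -1 - \tfrac{3}{2}\sin t\cos t & -1 + \tfrac{3}{2}\sin^2 t
  \end{pmatrix},
\]
whose eigenvalues are $\lambda_{\pm} = -\tfrac{1}{4} \pm \tfrac{i}{4}\sqrt{7}$ for every $t$, hence have negative real part, while
\[
  x(t) \;=\; e^{t/2}\begin{pmatrix} \cos t \\ -\sin t \end{pmatrix}
\]
solves $\dot{x} = A(t)x$ and grows unboundedly.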
Classical Floquet theory (see Floquet's 1883 paper [50]) gives an elegant solution for the periodic case, but it is not immediately clear how to build a linear algebra around Lyapunov's 'order numbers' (now called Lyapunov exponents) for more general time dependencies. The key idea here is to write the time dependency as a