Chapter 1

BASIC NOTIONS

1.1. A problem from diﬀerential equations

Suppose we are given the problem of finding a solution of

(1.1) f''(x) + f(x) = g(x)

in an interval a ≤ x ≤ b with the solution satisfying

(1.2) f(a) = 1, f'(a) = 0.

(We shall not enter into a discussion as to why anyone would want to solve this problem, but content ourselves with the statement that such equations do arise in applications.) From your course in differential equations you will recall that when g = 0, equation (1.1) has a general solution of the form

(1.3) f(x) = A sin x + B cos x,

where A and B are arbitrary constants. However, if we are interested in solving (1.1) for g(x) an arbitrary function continuous in the closed interval, not many of the methods developed in the typical course in differential equations will be of any help. A method which does work is the least popular and would rather be forgotten by most students. It is the method of variation of parameters, which states, roughly, that one can obtain a solution of (1.1) if one allows A and B to be functions of x instead of just constants.
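For orientation, the computation the method prescribes can be sketched in outline; the auxiliary condition imposed on A′ and B′ below is the standard normalization used in textbook treatments of variation of parameters, not something derived here:

```latex
% Ansatz: let the constants in (1.3) depend on x.
f(x) = A(x)\sin x + B(x)\cos x.
% Impose the standard auxiliary condition
A'(x)\sin x + B'(x)\cos x = 0,
% so that differentiation gives
f'(x) = A(x)\cos x - B(x)\sin x,
f''(x) = A'(x)\cos x - B'(x)\sin x - f(x).
% Substituting into (1.1), the terms in f cancel, leaving
A'(x)\cos x - B'(x)\sin x = g(x).
% Solving the two linear equations for A' and B':
A'(x) = g(x)\cos x, \qquad B'(x) = -g(x)\sin x.
```

This is only a sketch of the standard computation; the text deliberately defers any justification of the method in favor of verifying the result directly.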

Since we are only interested in a solution of (1.1), we shall not go into any justification of the method, but merely apply it and then check to see if what we get is actually a solution. So we differentiate (1.3) twice, substitute into (1.1) and see what happens. Before proceeding, we note that we shall get one equation with two unknown functions. Since we were brought up from childhood to believe that one should have two equations to determine

