Ordinary Differential Equations
G63.2470
Spring 2009, Wednesdays 5:00-7:00 pm, WWH Room 102
Instructor: Olof B. Widlund
Coordinates
Office: WWH 612
Telephone: 998-3110
Office Hours: Drop by any time, or send email or call for an appointment
Email: widlund at cims.nyu.edu
Course URL: http://math.cims.nyu.edu/courses/spring09/G63.2470.001/index.html
Announcements
Class will meet every Wednesday except on March 18. The last class will
be on April 29 and the in-class final on May 6 starting at 5:00 pm.
Here is a Practice Final.
Blackboard
A Blackboard account has been activated and you can use it to keep
track of homework scores. I will also use it to send e-mail occasionally.
The assignments will only be available, in pdf format, via this course
home page.
Requirements
There will be regular homework assignments. It is important that you do the
homework yourself, but when you get stuck, I encourage you to consult other
students, or me, for help. However, when you get help, it is important to
acknowledge it in writing. Passing off other people's work as your own is
not acceptable.
Homework
Lectures
These are short descriptions of the content of each lecture, after the
fact.
- January 21. Initial value problems for first order ordinary differential
equations with one dependent variable x'=f(t,x). If f(t,x) is continuous,
there exists at least one solution locally. x' = x^2 provides an example
of an ODE with a solution which cannot be continued for all values of t.
x'=x^{1/2} and x'=x^{1/3} illustrate that there can be more than one solution.
Uniqueness is guaranteed if f(t,x) is Lipschitz continuous with respect to
the second variable. Picard iteration and an existence proof for f(t,x) which
satisfies the Lipschitz condition. An example with f(t,x) not Lipschitz for
which the Picard iteration does not converge (a short sketch of the iteration
follows this entry). Several integration
techniques: (i) f(t,x) a function of t only; (ii) separation of variables;
(iii) homogeneous equations.
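A minimal sketch of the Picard iteration just mentioned; it assumes the sympy
library (not part of the course materials), and the equation x' = x, x(0) = 1
is chosen only because its iterates are easy to recognize as partial sums of
the series for e^t.

import sympy as sp

t, s = sp.symbols('t s')

def picard(f, x0, t0, iterations):
    # Successive iterates x_{n+1}(t) = x0 + int_{t0}^t f(s, x_n(s)) ds.
    x = sp.sympify(x0)
    iterates = [x]
    for _ in range(iterations):
        x = x0 + sp.integrate(f(s, x.subs(t, s)), (s, t0, t))
        iterates.append(sp.expand(x))
    return iterates

# For f(t, x) = x the iterates are the partial sums of the series for e^t.
print(picard(lambda s_, x_: x_, 1, 0, 4))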
- January 28. An existence proof for f(t,x) which is not Lipschitz:
approximate this function by a family of polynomials in t and x and show,
by using Arzela's theorem (given in the C & L book), that when
the polynomial approximation converges to f(t,x), a family of
solutions converges to a solution of x'=f(t,x). Another proof of the same
result is given in the C & L book; it uses a finite difference approach.
Euler's method, a method using Taylor series, and two other finite difference
methods to solve the ODEs numerically. We can obtain better than linear
convergence in the time step if we use the more advanced methods (compare
the sketch after this entry). Turning
higher order ODEs into first order systems of ODE. There are no surprises
as far as existence and uniqueness are concerned. Continuing solutions of
an ODE if two solutions are defined on different intervals and the two
solutions are the same on a common interval of the t-axis. A nonlinear
spring problem for which we can show existence for all times by establishing
a bound on x and x', which is valid for all t. Solutions of ODE can cease
to exist only if |x(t)| goes to infinity. Additional solution techniques:
(iv) solving x'(t)=f((a_1 t + b_1 x + c_1)/(a_2 t + b_2 x + c_2)); (v) using
the transformation x=y^m; (vi) linear equations by "variation of the constant".
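As an illustration of the better-than-linear convergence mentioned above, here
is a rough sketch in plain Python comparing forward Euler with the improved
Euler (Heun) method on x' = x, x(0) = 1; the example equation and step counts
are my own choices, not from the lecture.

import math

def euler(f, x0, t0, t1, n):
    # Forward Euler with n steps; the error behaves like O(h).
    h, x, t = (t1 - t0) / n, x0, t0
    for _ in range(n):
        x += h * f(t, x)
        t += h
    return x

def heun(f, x0, t0, t1, n):
    # Improved Euler (Heun); the error behaves like O(h^2).
    h, x, t = (t1 - t0) / n, x0, t0
    for _ in range(n):
        k1 = f(t, x)
        k2 = f(t + h, x + h * k1)
        x += h * (k1 + k2) / 2
        t += h
    return x

f = lambda t, x: x
for n in (10, 20, 40):
    print(n, abs(euler(f, 1.0, 0.0, 1.0, n) - math.e),
             abs(heun(f, 1.0, 0.0, 1.0, n) - math.e))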
- February 4. First order systems of ODE with initial values and right hand
sides which depend smoothly on a number of parameters. A perturbation method
which allows us to compute a linear approximation of the solution of such a
system, valid for small values of the parameters.
The cost is the solution of the unperturbed
problem and a set of linear systems of ODE. A proof of the continuity of
the solutions as a function of the parameters. Under suitable assumptions,
we can also show that we have continuous derivatives with respect to the
parameters. This latter result can be used to give a full justification of the
perturbation method. Linear systems of ODE; their solution sets are vector
spaces of the same dimension as the systems and define fundamental solution
matrices. The solution matrices can be used
to extend the variation of constants method previously developed for
scalar problems. The Wronskian of a linear system; it satisfies a scalar
linear ODE with a coefficient obtained from the trace of the matrix of
the linear system. The form of the Wronskian if the system originates
from a scalar ODE of higher order. How to use knowledge of special
solutions and the Wronskian to obtain other columns of the fundamental
solution matrix.
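A small numerical check of the Wronskian relation described above, in the
constant coefficient case where the fundamental matrix is exp(tA), so that its
determinant equals exp(t trace(A)); numpy and scipy are assumed, and the
matrix is an arbitrary illustrative choice.

import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])          # an arbitrary illustrative matrix
for t in (0.5, 1.0, 2.0):
    wronskian = np.linalg.det(expm(t * A))        # det of the fundamental matrix
    print(t, wronskian, np.exp(t * np.trace(A)))  # the two values agree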
- February 11. How to turn Bernoulli equations into linear ODEs.
Riccati equations and how they can be turned into Bernoulli equations if
we know one particular solution. Euler's equations and how an exponential
transformation of the independent variable turns them into linear equations
with constant coefficients. Finding information on two linearly independent
solutions of a particular linear ODE of second order by first developing one
solution as a power series and then, by using the Wronskian, showing that
the second solution has a logarithmic singularity. Reducing the size of
a linear system of equations if one solution is known; to be revisited.
Linear systems of ODE's with constant coefficients. The use of eigenvalues
and eigenvectors of the coefficient matrix. The deficient case when we can
use Jordan's canonical form. Solving the equations when there are "forcing
terms"; to be revisited. The exponential of matrices defined in terms of
convergent power series. The exponential of a Jordan box; see the sketch
after this entry. Solving complete (exact) differentials; this results in an
algebraic equation satisfied by the solution. How to turn other problems into
complete differentials by multiplying by a common factor (an integrating
factor).
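A minimal check, with numpy and scipy assumed and a 3-by-3 example of my own,
that the exponential of a Jordan box J = lambda*I + N reduces to a finite sum
because the superdiagonal part N is nilpotent.

import numpy as np
from scipy.linalg import expm

lam, t = -1.0, 0.7
N = np.diag([1.0, 1.0], k=1)                  # nilpotent superdiagonal part
J = lam * np.eye(3) + N                       # a 3-by-3 Jordan box

closed_form = np.exp(lam * t) * (np.eye(3) + t * N + 0.5 * t**2 * (N @ N))
print(np.allclose(expm(t * J), closed_form))  # True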
- February 18. Reducing the construction of the fundamental solution
matrix for an n-by-n linear system to an (n-1)-by-(n-1) problem if
one solution is known. Details about how a related transformation can
be computed. An attempt to solve a 3-by-3 problem with two known solutions;
more rapid progress seems to be possible using the Wronskian.
Linear systems with constant coefficients. The real Jordan
form. Finding the solutions for special "forcing functions"; different
cases depending on the size of the Jordan blocks, etc. Liapunov's construction
of a norm, equivalent to the Euclidean norm, and a proof that all solutions
of a linear system with constant coefficients approach the zero solution
even in the presence of a small nonlinear perturbation provided that all
eigenvalues of the matrix have strictly negative real parts and the initial
values are small; a numerical illustration follows this entry.
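This sketch assumes scipy; the matrix and the quadratic perturbation are my
own illustrative choices, meant only to show small solutions decaying to zero
when all eigenvalues of A lie in the left half plane.

import numpy as np
from scipy.integrate import solve_ivp

A = np.array([[-1.0, 4.0],
              [0.0, -2.0]])                    # eigenvalues -1 and -2

def rhs(t, x):
    # Linear part plus a perturbation that is O(|x|^2) near the origin.
    return A @ x + 0.5 * np.array([x[0]**2, x[0] * x[1]])

sol = solve_ivp(rhs, (0.0, 20.0), [0.1, -0.1], rtol=1e-8)
print(np.linalg.norm(sol.y[:, -1]))            # close to zero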
- February 25. Why is it interesting to study the effect of linear
or nonlinear perturbations of linear systems of ODE? We can linearize
many nonlinear problems around a particular solution and ask interesting
questions about the stability of that solution in terms of the resulting
linear system. This can be done by using Taylor expansions; a small sketch
of such a linearization follows this entry. Simple examples which
illustrate the importance of the linearized problem in determining
the behavior for large values of the independent variable t. The Schur
normal form as an alternative to the Jordan normal form. Linear
problems with periodic coefficients; the fundamental matrix of
solutions has a special form. It is a product of a matrix with periodic
coefficients and an exponential factor given by a constant matrix. This
matrix determines the behavior for large t. Linear perturbations of
the coefficient matrix by a matrix B(t) and the effect of such
perturbations if the unperturbed problem has constant or periodic
coefficients.
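A small sketch of linearization by Taylor expansion, as mentioned in this
entry; sympy is assumed, and the damped pendulum with damping coefficient 1/5
is my own illustrative example, linearized about the rest point (0, 0).

import sympy as sp

x, y = sp.symbols('x y')
F = sp.Matrix([y, -sp.sin(x) - sp.Rational(1, 5) * y])   # x' = y, y' = -sin(x) - y/5

J = F.jacobian([x, y]).subs({x: 0, y: 0})      # matrix of the linearized system
print(J)                                       # Matrix([[0, 1], [-1, -1/5]])
print(J.eigenvals())                           # eigenvalues with negative real part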
- March 4. How to diagonalize the matrix of a linear system of ODEs;
this can be done successfully if the matrix is smooth enough and
has distinct eigenvalues. An extra term appears in the system written
in the transformed variables and this affects the formulation of
a result which establishes asymptotic stability of a perturbed problem
given that the unperturbed problem is asymptotically stable. A
counterexample to a proposed general result on asymptotic stability
under the assumption that the norm of the perturbation matrix is
integrable. Adding a finite lower bound on the integral of the trace
of the coefficient matrix ensures that the inverse of the fundamental
matrix of solutions will remain bounded; a computation of the Wronskian
is involved. Autonomous systems; they can be converted into a system,
which normally is not autonomous, and which has one equation less. Any
system can, at the expense of increasing its order by 1, be converted
into an autonomous system; a small sketch of this device follows this
entry. Singular points and singular solutions: the system at rest.
Periodic solutions; they have a well-defined period and can be mapped
one-to-one and continuously onto the unit circle. Other solutions; a
third option.
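A minimal sketch of the device of adjoining the independent variable to make
a system autonomous; scipy is assumed, and the scalar equation
x' = -x + sin(t) is only an example of mine.

import numpy as np
from scipy.integrate import solve_ivp

def f(t, x):                     # a non-autonomous scalar ODE, x' = -x + sin(t)
    return -x[0] + np.sin(t)

def autonomous_rhs(_, y):        # the enlarged, autonomous system y = (t, x)
    return np.array([1.0, f(y[0], y[1:])])

sol = solve_ivp(autonomous_rhs, (0.0, 5.0), [0.0, 1.0], rtol=1e-8)
print(sol.y[0, -1], sol.y[1, -1])   # the first component recovers t itself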
- March 11. Linear autonomous systems in the 2-by-2 case; there are
six cases (see the classification sketch after this entry). Adding nonlinear
perturbations; the type can change unless we require the nonlinearity to be
o(r), that is, to vanish faster than linearly as r goes to zero. An
autonomous system, with solutions on a torus, which is very sensitive
to perturbations of the initial data; problems with periodic solutions
can change to orbits on the surface of the torus which come arbitrarily
close to any point.
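A rough sketch, with numpy assumed and my own labels, of how the eigenvalues
of A sort the 2-by-2 linear autonomous system x' = Ax into the standard cases;
degenerate situations (zero or repeated eigenvalues) are not handled.

import numpy as np

def classify(A):
    lam = np.linalg.eigvals(A)
    if np.iscomplexobj(lam) and abs(lam[0].imag) > 1e-12:
        if abs(lam[0].real) < 1e-12:
            return "center"
        return "stable spiral" if lam[0].real < 0 else "unstable spiral"
    lam = np.sort(lam.real)
    if lam[0] * lam[1] < 0:
        return "saddle"
    return "stable node" if lam[1] < 0 else "unstable node"

print(classify(np.array([[0.0, 1.0], [-1.0, 0.0]])))    # center
print(classify(np.array([[1.0, 0.0], [0.0, -1.0]])))    # saddle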
- March 25. Further comments on the 3-by-3 autonomous system. The Poincare-
Bendixson theorem.
- April 1. Boundary value problems for systems of ordinary differential
equations; in the nonlinear case there might be no solutions or several
solutions even if the system is defined by very regular functions. Scalar
second order problems on a finite interval. The adjoint of the operator
and conditions for self-adjointness. The Green's function, its properties
and an explicit construction; it provides a representation of the solution
of inhomogeneous problems in integral form. Eigenvalue problems and a
determinant condition; the determinant is an entire function of the
eigenvalue parameter, which gives important insight. For self-adjoint
problems, eigenfunctions belonging to different eigenvalues are orthogonal
and the eigenvalues are real. Using the integral operator to establish
the existence of eigenfunctions; the compactness of that operator
plays a crucial role in establishing the smoothness of elements in
its range. An outline of how the first eigenfunction is found
and how we then can proceed to find the next, etc.
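A small sketch of a Green's function in the simplest self-adjoint case,
-u'' = f on (0,1) with u(0) = u(1) = 0, where G(x,s) = x(1-s) for x <= s and
s(1-x) for x >= s; the example and the crude quadrature (numpy assumed) are my
own illustration of the integral representation described above.

import numpy as np

def G(x, s):
    # Green's function for -u'' = f with homogeneous Dirichlet conditions on (0,1).
    return np.where(x <= s, x * (1.0 - s), s * (1.0 - x))

def solve(f, n=2000):
    s = np.linspace(0.0, 1.0, n)
    def u(x):
        return np.trapz(G(x, s) * f(s), s)    # u(x) = int_0^1 G(x,s) f(s) ds
    return u

# f = pi^2 sin(pi x) gives the exact solution u = sin(pi x).
u = solve(lambda s: np.pi**2 * np.sin(np.pi * s))
print(u(0.5), np.sin(np.pi * 0.5))            # the two values nearly agree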
- April 8. Further comments on eigensystems of Sturm-Liouville problems.
Completeness of expansions using eigenfunctions. Separation and comparison
theorems due to Sturm. Orthogonal polynomials.
Orthogonality with respect to a non-negative weight function. Examples
of well-known sets of orthogonal polynomials; Legendre, Chebyshev, Jacobi,
Laguerre, and Hermite. The Legendre ODE. Three-term recurrences to compute
orthogonal polynomials; a sketch for the Legendre case follows this entry.
Oscillations and zeros of orthogonal polynomials.
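A minimal sketch of the three-term recurrence in the Legendre case,
(n+1) P_{n+1}(x) = (2n+1) x P_n(x) - n P_{n-1}(x) with P_0 = 1 and P_1 = x;
numpy is assumed only for the comparison value at the end.

import numpy as np

def legendre(n, x):
    # Evaluate P_n(x) by the three-term recurrence.
    p_prev, p = 1.0, x
    if n == 0:
        return p_prev
    for k in range(1, n):
        p_prev, p = p, ((2 * k + 1) * x * p - k * p_prev) / (k + 1)
    return p

x = 0.3
print(legendre(4, x))
print(np.polynomial.legendre.legval(x, [0, 0, 0, 0, 1]))   # the same value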
- April 15. Gaussian quadrature. Power series solutions of second order
scalar ODE in the regular case. A proof that the power series solutions
have at least the same radius of convergence as the series that define the
coefficients.
Linear systems of equations defined by a single-valued matrix function
with complex-valued elements, given in a simply connected domain. The case
of an isolated singularity at the origin. The form of the solution; it can
be analytic everywhere, it can have a branch point or a pole, or even have
an essential singularity. Using the Jordan form of a matrix to determine
a fundamental matrix of solutions.
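A short illustration of Gaussian quadrature, assuming numpy: with the three
roots of the third Legendre polynomial as nodes, the rule integrates
polynomials of degree up to 5 exactly on [-1, 1]; the test integrands are my
own choices.

import numpy as np

nodes, weights = np.polynomial.legendre.leggauss(3)      # 3-point Gauss rule
print(np.sum(weights * nodes**5))                        # int_{-1}^{1} x^5 dx = 0
print(np.sum(weights * nodes**4), 2.0 / 5.0)             # int_{-1}^{1} x^4 dx = 2/5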
- April 22. Linear systems with a singularity of the first kind at the
origin of the complex plane; solutions have at most a pole at the origin.
Construction of solutions in terms of power series and powers of log(z).
When we have a singularity of the first kind, more can be established
about the form of the solution; in particular, we can at times prove that
z^{A_0} is a factor in the fundamental matrix of solutions (compare the
check after this entry). Equations
with a singularity of the first kind only at infinity; the proper definition
of such a singularity by using a simple change of variables. This assumption
restricts the coefficients considerably. Remarks on cases when we have
one or two additional singularities of the first kind.
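A small symbolic check, assuming sympy and an illustrative constant matrix of
my own, that z^A = exp(A log z) is a fundamental matrix for the model
singularity of the first kind z x' = A x.

import sympy as sp

z = sp.symbols('z', positive=True)
A = sp.Matrix([[1, 1],
               [0, 2]])                      # an illustrative constant matrix
Phi = (A * sp.log(z)).exp()                  # z^A via the matrix exponential
lhs = sp.simplify(z * sp.diff(Phi, z))
rhs = sp.simplify(A * Phi)
print(sp.simplify(lhs - rhs))                # the zero matrix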
- April 29. The coefficients of a second order ordinary differential
equation with two independent power series solutions can be calculated
and it can be shown that the singularity is of the first kind.
Bessel functions; finding two independent solutions to
Bessel's equation. The presence of a logarithmic factor in the case
of integer values of the parameter. Second order linear equations with
coefficients analytic everywhere including infinity: there are none.
Equations with only
one singularity of the first kind at infinity or at a finite point in the
complex plane: this is a very special problem. Problems with two singularities
of the first kind; the coefficients have to be of a special form and the
problem can be reduced to a second order differential equation with constant
coefficients. The case with three singular points, Riemann's differential
equation, and hypergeometric series. Comments on special cases which give
us the Jacobi polynomials.
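A minimal sketch of the power series (Frobenius) solution of Bessel's equation
for integer order n, J_n(x) = sum_k (-1)^k (x/2)^{2k+n} / (k! (k+n)!), using
only the Python standard library; the values quoted in the comments are the
standard values of J_0(1) and J_1(1).

import math

def bessel_j(n, x, terms=30):
    # Partial sum of the convergent power series for J_n(x).
    total = 0.0
    for k in range(terms):
        total += (-1)**k / (math.factorial(k) * math.factorial(k + n)) * (x / 2.0)**(2 * k + n)
    return total

print(bessel_j(0, 1.0))          # about 0.7651976866, which is J_0(1)
print(bessel_j(1, 1.0))          # about 0.4400505857, which is J_1(1)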
Required Text
Coddington and Levinson, Theory of Ordinary Differential Equations.
Don't Hesitate to Ask for Help
If you have questions, send me email, give me a call, or drop by my office.
Don't wait until it's too late!