Spring 2014

Coordinates of Olof Widlund
Lectures: Tuesdays and Thursdays, 12:30 - 1:45 pm, CIWWH 201.
Office hours: 30 minutes before and after each class or by appointment.
Office: WWH 612
Email: widlund at cims.nyu.edu

Coordinates of Juan Calvo (recitation instructor)
Recitation sessions: Wednesdays, 8:00 - 9:15 am, CIWWH 201.
Office hours: Wednesdays, 10-11 am, WWH 605.
Office: WWH 607
Email: calvo at cims.nyu.edu
Textbook: An Introduction to Numerical Analysis by Endre Suli and David Mayers,
Cambridge University Press.
There will be a midterm exam, a final exam, and relatively frequent
short quizzes. Scores will be made available on NYU Classes.
The midterm exam will be held on March 6, 12:30-1:45pm.
The first quiz was held on Thursday 2/13; here it is.
Here are five sample questions that could be given on a midterm exam.
You can bring one regular-sized sheet of paper, covered on both sides with
notes written by yourself, to the midterm exam. The textbook and other notes
will not be allowed. To see which topics the midterm and final exams will
cover, have a look at the Course Content section of this home page.
There will be regular homework assignments, including programming
using Matlab, Octave, or any reasonable alternative.
Scores will be made available on NYU Classes.
When turning in programming assignments, please provide a listing of
your program and output from runs with actual data that shows that
your program works as intended.
It is important that you do the homework yourself, but if you get stuck,
I encourage you to consult with other students.
However, when you get help,
it's important to acknowledge it in writing. Passing off other people's
work as your own is called plagiarism and is not acceptable.
Please staple everything together and order the problems in the order in
which they were assigned.
The best way of turning in homework is to give it to me
personally, in class or in my office. If I am not in my office, you can
slide it under my door. Do not put homework in my mailbox.
Homework set 1
Assigned January 31, due at 12:00 noon on Friday February 14.
Homework set 2
Assigned February 17, due at 12:00 noon on Friday February 28.
Homework set 3
Assigned March 11, due at 12:30 pm on Friday March 28.
Course Content. All page and section
references are to the textbook.
For each week, there will be a short description of the content
of the lectures.
Note that you should make sure to obtain and save
all handouts; they are an important part of your reading.
Request copies if you miss any of them.
Week 1: Chapter 1 of textbook. Finding roots of one nonlinear
equation of one real variable. Brouwer's fixed point theorem. Contractions
and the contraction mapping theorem. Rates of convergence. Linear and
super-linear convergence. The bisection algorithm and its rate of convergence.
Newton's method and its convergence rate in case of a simple root and a
double root. The secant method. The Illinois method; this method will be
further described in the first homework set. Sections not covered: 1.7 and 1.8.
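The root finders above can be sketched briefly in Python (one of the "reasonable alternatives" allowed for homework); the test equation x^2 - 2 = 0 is my own illustration, not one from the lectures:

```python
import math

def bisect(f, a, b, tol=1e-12):
    """Bisection: halve the bracket [a, b] until it is shorter than tol.

    Assumes f(a) and f(b) have opposite signs; linear rate, gains one
    binary digit per step."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        if fa * f(m) <= 0.0:
            b = m                # root lies in [a, m]
        else:
            a, fa = m, f(m)      # root lies in [m, b]
    return 0.5 * (a + b)

def newton(f, df, x, tol=1e-12, maxit=50):
    """Newton's method: quadratic convergence near a simple root."""
    for _ in range(maxit):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

f  = lambda x: x * x - 2.0
df = lambda x: 2.0 * x
```

Running both on [1, 2] (or from the starting guess 1.5) recovers sqrt(2); counting iterations makes the difference between the linear and quadratic rates visible.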
Week 2: Review of linear systems of algebraic equations for the case
of n unknowns and n equations; linear independence of the columns (and the
rows) of the matrix guarantees that there is a unique solution. Gaussian
elimination. What operations can be done at the same time? Expressing the
process in terms of multiplication of the coefficient matrix from the left
by (n-1) special unit lower triangular matrices. How to find the elements
of the unit lower triangular matrix in the factorization PA=LU. The need
for pivoting. Partial pivoting and an example of a matrix of order 2 for
which partial pivoting is required in order to avoid very poor accuracy of
the solution. Sections not covered: 2.8 and 2.10.
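The order-2 pivoting phenomenon can be reproduced numerically as follows; the particular matrix, with tiny pivot eps = 1e-20, is an illustrative assumption and may differ from the example used in class:

```python
import numpy as np

eps = 1e-20
A = np.array([[eps, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0])   # exact solution is very close to (1, 1)

# Naive elimination: use the tiny eps as the pivot.
m   = A[1, 0] / A[0, 0]          # huge multiplier, 1e20
U22 = A[1, 1] - m * A[0, 1]      # 1 - 1e20 rounds to -1e20
c2  = b[1] - m * b[0]            # 2 - 1e20 rounds to -1e20
x2  = c2 / U22                   # 1.0, still fine
x1  = (b[0] - A[0, 1] * x2) / A[0, 0]   # catastrophic cancellation: 0.0

# With partial pivoting (numpy's solver uses an LU factorization
# with row interchanges), the computed solution is accurate.
x_pp = np.linalg.solve(A, b)
```

Without pivoting the computed x1 is 0 instead of roughly 1: the information in the second equation is wiped out when 1e20 times the first row is subtracted from it.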
Week 3: How to decide if we can compute the values of a function with
good relative accuracy. What matters is x f'(x)/f(x), known as the
condition number of the problem. Norms of vectors in R^n. A family of ell_p
norms for 1 <= p <= infinity. A proof that all these norms satisfy
the three requirements; to prove the triangle inequality is the hardest task.
While doing these proofs, we develop the Cauchy-Schwarz and Hoelder
inequalities. Norms of matrices for the case of p=1, p=2, and p=infinity.
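These norm computations are easy to check numerically; the vectors and the matrix below are my own small examples:

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])
y = np.array([1.0, 2.0, -2.0])

# ell_p norms of x for p = 1, 2, infinity
n1   = np.sum(np.abs(x))        # 7
n2   = np.sqrt(np.sum(x ** 2))  # 5
ninf = np.max(np.abs(x))        # 4

# Triangle inequality ||x + y||_2 <= ||x||_2 + ||y||_2
lhs = np.linalg.norm(x + y)
rhs = np.linalg.norm(x) + np.linalg.norm(y)

# Induced matrix norms: p = 1 is the maximal column sum,
# p = infinity the maximal row sum.
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
col_sum = np.max(np.sum(np.abs(A), axis=0))   # equals norm(A, 1)
row_sum = np.max(np.sum(np.abs(A), axis=1))   # equals norm(A, inf)
```

Comparing col_sum and row_sum against numpy's built-in np.linalg.norm(A, 1) and np.linalg.norm(A, np.inf) confirms the column-sum and row-sum formulas.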
Week 4: More on the norm of matrices defined by vector norms. The norm
of a product of matrices can be bounded by the product of their norms. The
condition number of the solution of a linear system of equations: the relative
change of the norm of the solution as a function of the relative change of
the right hand side and of the relative change of the coefficient matrix.
Symmetric, positive definite matrices; no pivoting is necessary when factoring
them by Cholesky's algorithm. This algorithm can also be used to decide
if a symmetric matrix is positive definite. How to save arithmetic and
storage when using Cholesky. Diagonally dominant matrices; another family
for which pivoting is not needed. Linear least squares problems when we
have more linear equations than unknowns. The normal equations. How
to fit data to a model defined by a polynomial. Why one should try to develop
an algorithm that does not require forming the matrix of the
normal equations; to be continued. Sections not covered: 3.4 and 3.5.
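A minimal sketch of Cholesky's algorithm, including its use as a test for positive definiteness, might look as follows (the test matrix is an illustrative assumption):

```python
import numpy as np

def cholesky(A):
    """Factor a symmetric A as L L^T with L lower triangular.

    No pivoting is needed; raises ValueError if A is not positive
    definite, so the routine doubles as a definiteness test."""
    n = A.shape[0]
    L = np.zeros_like(A, dtype=float)
    for j in range(n):
        d = A[j, j] - L[j, :j] @ L[j, :j]
        if d <= 0.0:
            raise ValueError("matrix is not positive definite")
        L[j, j] = np.sqrt(d)
        for i in range(j + 1, n):
            L[i, j] = (A[i, j] - L[i, :j] @ L[j, :j]) / L[j, j]
    return L

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = cholesky(A)   # lower triangular; L @ L.T reproduces A
```

Only the lower triangle of A is touched, which is what makes the savings in arithmetic and storage possible; attempting to factor a symmetric but indefinite matrix, such as [[1, 2], [2, 1]], hits a non-positive pivot and raises.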
Week 5: Linear independence and the uniqueness of the solutions of
least squares problems. Householder transformations as in Section 5.5.
The orthogonality of any Householder matrix; it can most conveniently be stored
by just one vector. The basic geometry of Householder transformations.
The QR factorization of an m-by-n matrix in terms of a product of
(n-1) Householder matrices and an m-by-n matrix with a lead-off n-by-n
upper triangular matrix; this matrix is nonsingular if A has linearly
independent columns. How to solve a least squares problem using
a QR factorization. How Householder matrices can be used to compute
a symmetric tridiagonal matrix with the same eigenvalues as a given
symmetric m-by-m matrix. Further comments on Householder transformations
and on manipulating block matrices. Solving systems of non-linear equations
where the number of unknowns and the number of equations are the same.
Contraction mapping and a derivation of Newton's method; a linear system
of equations must be solved in each iteration where the matrix elements
are partial derivatives of the different functions with respect to the
different variables. This matrix is known as the Jacobian. If it is well
behaved close to the solution, we will have fast convergence from any
starting approximation close to the solution. A review of the algebraic
eigenvalue problem, in particular for symmetric matrices.
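Newton's method for systems can be sketched in a few lines; the 2-by-2 test system (the unit circle intersected with the line x = y) is my own illustration, not one from the lectures:

```python
import numpy as np

def newton_system(F, J, x, tol=1e-12, maxit=50):
    """Newton's method for F(x) = 0 in several variables.

    Each iteration solves one linear system whose matrix is the
    Jacobian J(x) evaluated at the current approximation."""
    for _ in range(maxit):
        s = np.linalg.solve(J(x), -F(x))   # Newton step
        x = x + s
        if np.linalg.norm(s) < tol:
            break
    return x

# Test system: x^2 + y^2 = 1 and x = y.
F = lambda v: np.array([v[0]**2 + v[1]**2 - 1.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]],
                        [1.0,       -1.0]])

root = newton_system(F, J, np.array([1.0, 0.5]))
```

From the starting guess (1, 0.5) the iteration converges rapidly to (1/sqrt(2), 1/sqrt(2)), illustrating the fast local convergence when the Jacobian is well behaved near the solution.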
Week 6: Using eigenvalues and eigenvectors to solve initial value problems
for simple linear systems of ordinary differential equations. The danger of
using the roots of the characteristic equation to compute the eigenvalues
of a matrix. Two theorems by Gerschgorin. Determinants of symmetric tridiagonal
matrices. Cauchy's interlace theorem and how it can be combined with a
bisection algorithm to locate eigenvalues of a tridiagonal matrix.
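Gerschgorin's first theorem is easy to check numerically; the symmetric tridiagonal test matrix below is an illustrative assumption:

```python
import numpy as np

# Symmetric tridiagonal test matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])

# Gerschgorin: every eigenvalue lies in some disk
# |z - a_ii| <= sum over j != i of |a_ij|.
centers = np.diag(A)
radii = np.sum(np.abs(A), axis=1) - np.abs(centers)

eigs = np.linalg.eigvalsh(A)   # exact values here: 4 - sqrt(2), 4, 4 + sqrt(2)
inside = [any(abs(lam - c) <= r + 1e-12 for c, r in zip(centers, radii))
          for lam in eigs]
```

For a symmetric matrix the disks become real intervals, which is what makes them useful starting brackets for a bisection search for individual eigenvalues.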