**Numerical Linear Algebra**

**Fall 2017**

- Prerequisites: mathematical analysis, advanced linear algebra, MATLAB
- "If any other mathematical topic is as fundamental to the mathematical sciences as calculus and differential equations, it is numerical linear algebra." -- Trefethen & Bau
- "Numerical linear algebra is the study of how to accurately and efficiently solve linear algebra problems on a computer." -- William Ford


## Instructor & Tutor

- Kui Du, School of Mathematical Sciences, Xiamen University. Email: kuidu@xmu.edu.cn. Tel: 0592-2580672
- Xiao Qi, School of Mathematical Sciences, Xiamen University. Email: 530194696@qq.com

## Textbooks

- Numerical Linear Algebra, Lloyd N. Trefethen and David Bau, III, SIAM, 1997
- (supplement) Applied Numerical Linear Algebra, James Demmel, SIAM, 1997
- (supplement) Numerical Linear Algebra and Applications, Biswa Nath Datta, 2nd Edition, SIAM, 2010
- (supplement) Matrix Computations, Gene H. Golub and Charles F. Van Loan, 4th Edition, Johns Hopkins University Press, 2013
- (supplement) Numerical Linear Algebra with Applications Using MATLAB, William Ford, Academic Press, 2015
- (supplement) Numerical Linear Algebra (in Chinese), Shufang Xu, Li Gao, and Pingwen Zhang, 2nd Edition, Peking University Press, 2013
- (supplement) MATLAB Primer, Timothy A. Davis, 2011

## Lecture contents (tentative)

- Part 1: (a) Inner product, Orthogonality, Vector/Matrix norm; (b) Singular value decomposition (SVD)
- Part 2: (a) Projector, Classical/Modified Gram-Schmidt; (b) QR factorization; (c) Householder reflector, Givens rotation; (d) Least squares
- Part 3: (a) Conditioning; (b) Conditioning of least squares problems; (c) Floating point arithmetic; (d) Stability; (e) Stability of back substitution
- Part 4: (a) LU factorization, Gaussian elimination; (b) Pivoting; (c) Cholesky factorization; (d) Stationary iterative methods: Jacobi, Gauss-Seidel, SOR, etc.
- Part 5: (a) Eigenvalue problems; (b) Power/Inverse iteration, Rayleigh quotient iteration; (c) QR algorithm; (d) Hessenberg/Tridiagonal reduction; (e) other algorithms
- Part 6: Other topics depending on time: Krylov subspace methods, Preconditioning, FFT and Structured matrices, Pseudospectra, ...
- MATLAB m-files for lectures
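To make one of the Part 2 topics concrete, here is a short sketch of a reduced QR factorization via modified Gram-Schmidt. This is a hypothetical illustration written in Python/NumPy for self-containedness (the lecture m-files themselves are in MATLAB), not part of the course materials:

```python
import numpy as np

def mgs_qr(A):
    """Reduced QR factorization A = Q R via modified Gram-Schmidt.

    Each remaining column is orthogonalized against the newly computed
    q vector one step at a time, rather than projecting against the
    original columns as in the classical variant.
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    V = A.copy()                       # working copy of the columns
    for i in range(n):
        R[i, i] = np.linalg.norm(V[:, i])
        Q[:, i] = V[:, i] / R[i, i]    # normalize the i-th column
        for j in range(i + 1, n):      # remove the q_i component from
            R[i, j] = Q[:, i] @ V[:, j]  # all remaining columns
            V[:, j] -= R[i, j] * Q[:, i]
    return Q, R

A = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
Q, R = mgs_qr(A)
print(np.allclose(Q @ R, A))            # prints True: A is reconstructed
print(np.allclose(Q.T @ Q, np.eye(2)))  # prints True: orthonormal columns
```

In exact arithmetic the classical and modified variants agree; in floating point the one-at-a-time updates of modified Gram-Schmidt lose far less orthogonality, a point taken up in the stability discussion of Part 3.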

## Grading policy

- Assignment 30% + Programming 30% + Final exam 40%
- Bonus 5% (overall performance: classroom and discussion participation, learning attitude, reading project, ...)
- Assignment + Programming + Bonus <= 60%

## Homework (tentative)

- Discussion is permitted. However, transcribed solutions and copied programs are both unacceptable!
- Writing solutions in **English and LaTeX** is highly encouraged.
- Please prepare programming reports using **MATLAB** (you must use MATLAB's `publish` to produce PDF files). Submit your programming reports to: xmunla@163.com. Only one PDF file should be submitted, with filename studentnumberPx.pdf for Programming x. For example, if your student number is 19020150000000, then your submission for Programming 1 is 19020150000000P1.pdf. Submissions in other formats are unacceptable!
- Late submissions for assignments and programming reports receive only **half** the score. Unaccepted submissions receive a score of **0**. **No exceptions!**

- Assignment 1. (LaTeX file) Hand in the assignment in class on **October 11, 2017**.
- Assignment 2. (LaTeX file) Hand in the assignment in class on **October 25, 2017**.
- Assignment 3. (LaTeX file) Hand in the assignment in class on **November 22, 2017**.
- Assignment 4. (LaTeX file) Hand in the assignment in class on **December 6, 2017**.
- Assignment 5.
- Programming 1. (1) Plot Figure 3.1. (2) Exercise 4.3. **Deadline: October 11, 2017**. For your reference: Programming 1 (pdf file)
- Programming 2. (1) Exercise 8.2. (2) Exercise 9.3. (3) Exercise 10.2. (4) Exercise 10.3. (5) Exercise 11.3. (For (g), you do not have to shade with a red pen the digits that appear to be wrong.) **Deadline: October 25, 2017**. For your reference: Programming 2 (pdf file)
- Programming 3. (1) Exercise 12.2(b). (Both the equispaced-point case and the Chebyshev-point case are required. Plot the results in one figure.) (2) Exercise 12.3(a)(b). (3) Exercise 13.3. (Plot the results in one figure.) (4) Exercise 16.2. (5) Exercise 19.2. (6) Bonus question: read the material and complete the corresponding exercise (barcode MATLAB files). **Deadline: November 15, 2017**.
- Programming 4. (1) Exercise 20.2. (Answer the question and write MATLAB code to provide an example with p=3 for a 20x20 matrix A. Plot the sparsity patterns of A, L, and U using MATLAB's `spy`.) (2) Exercise 20.4. (Write two MATLAB functions, `[L,U]=gelu(A)` and `[L,U]=geoplu(A)`, to implement Algorithm 20.1 and the "outer product" form of Gaussian elimination you have designed, respectively. Compare the CPU times of `gelu` and `geoplu` for a 500x500 matrix A.) (3) Plot Figure 22.1. (4) Write a MATLAB function, `[L,U,P]=gepp(A)`, to implement Algorithm 21.1. (5) Write a MATLAB function, `R=mychol(A)`, to implement Algorithm 23.1. **Deadline: December 6, 2017**.
- Programming 5.
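For orientation, the pivoting idea behind a `gepp`-style factorization (Algorithm 21.1 in Trefethen & Bau) can be sketched as follows. This is a generic, hypothetical illustration in Python/NumPy, not a substitute for the required MATLAB implementations:

```python
import numpy as np

def gepp(A):
    """LU factorization with partial pivoting: P @ A = L @ U.

    At step k, the largest-magnitude entry in column k of the remaining
    submatrix is swapped into the pivot position before elimination.
    """
    A = np.asarray(A, dtype=float)
    m = A.shape[0]
    U = A.copy()
    L = np.eye(m)
    P = np.eye(m)
    for k in range(m - 1):
        # select the pivot row among rows k..m-1
        i = k + int(np.argmax(np.abs(U[k:, k])))
        U[[k, i], k:] = U[[i, k], k:]   # swap the unreduced parts of U
        L[[k, i], :k] = L[[i, k], :k]   # swap the computed multipliers
        P[[k, i], :] = P[[i, k], :]     # record the swap in P
        # eliminate the entries below the pivot
        for j in range(k + 1, m):
            L[j, k] = U[j, k] / U[k, k]
            U[j, k:] -= L[j, k] * U[k, k:]
    return L, U, P

A = np.array([[2.0, 1.0, 1.0], [4.0, 3.0, 3.0], [8.0, 7.0, 9.0]])
L, U, P = gepp(A)
print(np.allclose(P @ A, L @ U))  # prints True
```

Swapping the largest candidate pivot into place keeps every multiplier stored in L bounded by 1 in magnitude, which is the point of partial pivoting. Course submissions must still be written and published in MATLAB, as described in the homework policy above.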

## Reading project (not compulsory)

- Group size: 1, 2, or 3 students per group. No more than 3.
- Requirements: (1) choose one project from the list below; (2) read the corresponding paper, or a few related papers you can find on the same topic; (3) submit a reading report (a PDF file produced with MATLAB or LaTeX; handwritten or scanned versions are not accepted) or a set of slides describing the background, motivation, main methods, numerical results, and future directions.
- The submitted documents will be read by the instructor. Points will be deducted for missing major parts, missing major novelties or contributions, missing major references, or any significant mathematical errors.
- Deadline: **December 31, 2017**

- Project 1: Making do with less: An introduction to compressed sensing, Kurt Bryan and Tanya Leise, SIREV, 2013
- Project 2: A DEIM induced CUR factorization, D.C. Sorensen and Mark Embree, SISC, 2016
- Project 3: Accurate low-rank approximations via a few iterations of alternating least squares, Arthur Szlam, Andrew Tulloch, and Mark Tygert, SIMAX, 2017
- Project 4: Rows versus columns: Randomized Kaczmarz or Gauss--Seidel for ridge regression, Ahmed Hefny, Deanna Needell, and Aaditya Ramdas, SISC, 2017
- Project 5: IMRO: A proximal quasi-Newton method for solving ℓ1-regularized least squares problems, Sahar Karimi and Stephen Vavasis, SIOPT, 2017

## Others

- Comprehension and reflection on machine learning, Zhihua Zhang
- The singular value decomposition, applications and beyond, Zhihua Zhang
- Singular value decomposition, eigenfaces, and 3D reconstructions, Neil Muller, Lourenco Magaia, and B. M. Herbst
- The matrix cookbook, Kaare Brandt Petersen and Michael Syskind Pedersen
- Chebfun: numerical computing with functions, Trefethen's team
- Matrix computations/Numerical linear algebra, Jianyu Pan
- An introduction to the conjugate gradient method without the agonizing pain, Jonathan Shewchuk
- Randomized algorithms in numerical linear algebra, Ravindran Kannan and Santosh Vempala, Acta Numerica, 2017
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions, N. Halko, P. G. Martinsson, and J. A. Tropp, SIREV, 2011
- Numerical linear algebra in data mining, Lars Eldén, Acta Numerica, 2006