# Essential papers on matrix decompositions

I recently read Skillicorn’s book on matrix decompositions and was a bit disappointed, as it is targeted at an undergraduate audience. I would like to compile (for myself and others) a short bibliography of essential papers (surveys, but also breakthrough papers) on matrix decompositions. What I have in mind primarily is something on SVD/PCA (and their robust/sparse variants) and NMF, since those are by far the most used. Do you have any recommendations or suggestions? I am holding off on my own so as not to bias the answers. Please limit each answer to 2-3 papers.

P.S.: I call these two decompositions the most used *in data analysis*. Of course QR, Cholesky, LU and polar are very important in numerical analysis, but they are not the focus of my question.
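(For readers less familiar with why SVD/PCA comes up so often in data analysis: the principal components of a centered data matrix are exactly its right singular vectors. A minimal sketch, using NumPy and a randomly generated matrix purely for illustration:)

```python
# Sketch of the SVD/PCA connection: for column-centered data X,
# the right singular vectors of X are the principal axes.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))   # toy data: 100 samples, 4 features
Xc = X - X.mean(axis=0)             # center each column

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
components = Vt                     # principal axes (one per row)
scores = U * s                      # = Xc @ Vt.T, the projected data
explained_var = s**2 / (X.shape[0] - 1)  # variance along each axis
```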

How do you know that SVD and NMF are by far the most used matrix decompositions, rather than LU, Cholesky and QR? My personal favourite ‘breakthrough’ would have to be the guaranteed rank-revealing QR algorithm:

• Chan, Tony F. (1987). “Rank revealing QR factorizations”. Linear Algebra and its Applications, 88–89, 67–82. DOI:10.1016/0024-3795(87)90103-0

… a development of the earlier idea of QR with column-pivoting:

• Businger, Peter; Golub, Gene H. (1965). “Linear least squares solutions by Householder transformations”. Numerische Mathematik, 7(3), 269–276. DOI:10.1007/BF01436084
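(To illustrate what column pivoting buys you: the diagonal of R comes out in decreasing magnitude, so small trailing entries expose the numerical rank. A rough sketch using SciPy’s pivoted QR; the tolerance here is an illustrative choice, not the guaranteed bound from Chan’s paper:)

```python
# Numerical rank detection via QR with column pivoting
# (Businger & Golub style). Illustrative only.
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)
# Build a 6x5 matrix of rank 3 as a product of thin random factors.
A = rng.standard_normal((6, 3)) @ rng.standard_normal((3, 5))

Q, R, piv = qr(A, pivoting=True)   # A[:, piv] == Q @ R
# Pivoting orders |R[0,0]| >= |R[1,1]| >= ...; tiny trailing
# diagonal entries signal rank deficiency.
diag = np.abs(np.diag(R))
tol = diag[0] * max(A.shape) * np.finfo(A.dtype).eps
rank = int(np.sum(diag > tol))
print(rank)  # 3
```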

A (the?) classic textbook is:

• Golub, Gene H.; Van Loan, Charles F. (1996). Matrix Computations (3rd ed.), Johns Hopkins, ISBN 978-0-8018-5414-9.

(I know you didn’t ask for textbooks, but I can’t resist.)

Edit:
A bit more googling turns up a paper whose abstract suggests we could be slightly at cross purposes. My text above was written from a ‘numerical linear algebra’ (NLA) perspective; possibly you are more concerned with an ‘applied statistics / psychometrics’ (AS/P) perspective? Could you perhaps clarify?