Tensor Computations (CS 598 EVS) Fall 2020
The course will cover theory, algorithms, and applications of tensor decompositions and tensor networks.
The key prerequisite for this material is familiarity with numerical linear algebra topics and algorithmic analysis.
The planned syllabus for the course is below.
The major learning objectives for the course are to establish fluency and intuition for computations involving tensor contractions and decompositions, to provide a general understanding of the theoretical foundations of the associated nonlinear optimization problems, to develop a methodology for efficient implementation of tensor algorithms, and to provide a view of the application landscape of tensor computations.
Basics of Tensor Computations
- characterization of tensors (order, dimensions, symmetry, sparsity)
- tensor contractions (tensor products, Kronecker products, etc.)
- tensor notation (matricization, mode-n products, indexing)
- diagrammatic notation (graphs / Feynman diagrams to represent tensor contractions)
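To make the notation concrete, here is a minimal sketch of mode-n matricization and a mode-1 (first-mode) product, assuming numpy is available alongside the course's Python tooling:

```python
import numpy as np

# A small order-3 tensor with dimensions 2 x 3 x 4.
T = np.arange(24, dtype=float).reshape(2, 3, 4)

def matricize(T, mode):
    """Mode-n matricization: bring `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# Mode-1 product with a matrix M: contract M against the first tensor index.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 2))
TxM = np.einsum('ia,ajk->ijk', M, T)  # shape (5, 3, 4)

# Key identity: matricizing the mode-1 product gives an ordinary
# matrix-matrix product with the mode-1 unfolding.
assert np.allclose(matricize(TxM, 0), M @ matricize(T, 0))
```

This identity is what lets matrix algorithms (SVD, QR) be applied mode by mode to tensors.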
Theoretical Fundamentals of Tensor Decompositions
- Motivating application: bilinear algorithms
- tensor decomposition rank (CP, Tucker, tensor train)
- uniqueness, conditioning, and stability (canonical forms)
- hardness of CP decomposition
- generalized tensor decomposition
- tensor eigenvalues
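As a small illustration of CP rank (a sketch in numpy, which is an assumption about tooling): a rank-R CP decomposition stores one factor matrix per mode, and every unfolding of the resulting tensor has matrix rank at most R.

```python
import numpy as np

# CP model: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Each unfolding of a rank-R CP tensor has matrix rank at most R.
assert np.linalg.matrix_rank(T.reshape(I, J * K)) <= R
```

Unlike matrix rank, the smallest R admitting such a decomposition is NP-hard to compute in general, which motivates the hardness results above.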
Dense Tensor Decomposition Algorithms
- Motivating application: quantum chemistry
- CP decomposition algorithms (alternating least squares, Gauss-Newton)
- Tucker decomposition algorithms (high-order SVD, high-order orthogonal iteration)
- tensor train SVD
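The higher-order SVD can be sketched in a few lines (numpy assumed; this is an illustrative implementation, not the course's reference code): take the leading left singular vectors of each unfolding, then form the core by mode-n products with their transposes.

```python
import numpy as np

def unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Mode-n product of tensor T with matrix M (contract M's 2nd index)."""
    return np.moveaxis(np.tensordot(M, T, axes=(1, mode)), 0, mode)

def hosvd(T, ranks):
    """Truncated HOSVD: U_n from the left singular vectors of each
    unfolding; core G = T x_1 U_1^T x_2 U_2^T x_3 U_3^T."""
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
          for n, r in enumerate(ranks)]
    G = T
    for n, U in enumerate(Us):
        G = mode_product(G, U.T, n)
    return G, Us

rng = np.random.default_rng(1)
T = rng.standard_normal((4, 5, 6))
G, Us = hosvd(T, (4, 5, 6))  # full ranks, so reconstruction is exact
That = G
for n, U in enumerate(Us):
    That = mode_product(That, U, n)
assert np.allclose(That, T)
```

With truncated ranks the same code yields a quasi-optimal low-multilinear-rank approximation.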
Sparse Tensor Decomposition Algorithms
- Motivating application: recommender systems
- sparse CP and Tucker decomposition
- tensor completion algorithms (alternating least squares, coordinate descent, stochastic gradient descent, Gauss-Newton)
- high-performance implementation of sparse tensor kernels (MTTKRP, TTMc, TTTP) and compressed sparse tensor formats
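The MTTKRP kernel mentioned above can be sketched for a COO-format sparse tensor (numpy assumed; real implementations use compressed formats such as CSF, but the arithmetic is the same: only stored nonzeros contribute).

```python
import numpy as np

def mttkrp_coo(inds, vals, B, C, nrows):
    """Mode-0 MTTKRP: M[i,r] = sum_{j,k} T[i,j,k] * B[j,r] * C[k,r],
    for T stored as COO index arrays (i, j, k) with values `vals`."""
    i, j, k = inds
    M = np.zeros((nrows, B.shape[1]))
    # Each nonzero contributes vals[t] * (B[j[t]] * C[k[t]]) to row i[t].
    np.add.at(M, i, vals[:, None] * B[j] * C[k])
    return M

rng = np.random.default_rng(2)
I, J, K, R = 3, 4, 5, 2
i = np.array([0, 1, 2, 0])
j = np.array([1, 2, 3, 0])
k = np.array([4, 0, 2, 1])
vals = rng.standard_normal(4)
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

M = mttkrp_coo((i, j, k), vals, B, C, I)

# Cross-check against the dense definition.
dense = np.zeros((I, J, K))
dense[i, j, k] = vals
assert np.allclose(M, np.einsum('ijk,jr,kr->ir', dense, B, C))
```

The cost is O(nnz * R), versus O(IJK * R) for the dense kernel, which is why format and loop order matter so much for sparse tensors.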
Tensor Network Methods
- Motivating application: quantum simulation
- tensor network states (1D/MPS and 2D/PEPS)
- least-squares and eigenvalue problems with tensor network states
- alternating minimization techniques (DMRG, TT-ALS)
- time evolution techniques (Trotterization)
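To fix ideas, an MPS (equivalently, a tensor train) stores a d-way tensor as a chain of order-3 cores; the sketch below (numpy assumed) contracts the cores back into a full tensor:

```python
import numpy as np

# MPS/TT format: cores G_n of shape (r_{n-1}, I_n, r_n) with r_0 = r_d = 1:
#   T[i1,...,id] = G1[:,i1,:] @ G2[:,i2,:] @ ... @ Gd[:,id,:]
def mps_to_full(cores):
    T = cores[0]
    for G in cores[1:]:
        T = np.tensordot(T, G, axes=(-1, 0))  # contract the shared bond index
    return T.squeeze(axis=(0, -1))            # drop the trivial boundary bonds

rng = np.random.default_rng(3)
dims, ranks = (2, 3, 4), (1, 2, 2, 1)
cores = [rng.standard_normal((ranks[n], dims[n], ranks[n + 1]))
         for n in range(3)]
T = mps_to_full(cores)

# Each entry is a product of slices, one per core.
assert np.isclose(T[1, 2, 3],
                  (cores[0][:, 1, :] @ cores[1][:, 2, :]
                   @ cores[2][:, 3, :])[0, 0])
```

Storage is O(d * I * r^2) instead of O(I^d), which is what makes the algorithms in this section (DMRG, TT-ALS, time evolution) feasible for high-order tensors.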
Student presentations and lectures on other topics based on student interest
Lectures will be supplemented with short in-class web assignments, which can also be completed asynchronously.
Three short homework assignments, which may be completed individually or in small groups, are planned to test understanding.
Students have the choice of giving a presentation or preparing a report on a research project or review.
In-class assignments, homeworks, and the presentation/report will each be worth 1/3 of the grade.
Homework 1: Preconditioning via Low-Rank Approximation
Homework 2: Accelerating CP-ALS using Tucker
Homework 3: Review of Tensor Computations
Quizzes (in-class activities)
Quiz 1: Interest Survey
Quiz 2: Computing the Maximum Ritz Value
Quiz 3: Low Rank Approximation: Randomized SVD vs Krylov Subspace Methods
Quiz 4: Gauss-Newton Method
Quiz 5: Kronecker Product as a Tensor Operation
Quiz 6: Converting a CP Decomposition to a Tucker Decomposition
Quiz 7: HOSVD and TTSVD
Quiz 8: Bilinear Algorithms for Convolution
Quiz 9: Bilinear Algorithm for Multiplication of a Symmetric Matrix and a Vector
Quiz 10: Tensor Contraction with Group Symmetry
Quiz 11: Alternating Least Squares for CP Decomposition
Quiz 12: Dimension Trees for CP ALS
Quiz 13: Gauss-Newton Method for CP Decomposition
Quiz 14: Sparse Tensor times Matrix using CSF Format
Quiz 15: Nonnegative Tensor Factorization
Quiz 16: Imaginary Time Evolution
Quiz 17: Density Matrix Renormalization Group (DMRG)
Quiz 18: Tensor Network Canonical Forms
Quiz 19: Computing Eigenvalues of a Symmetric Tensor
Student Presentation Activity: Symmetric CP Decomposition
Project Report
Matrix Computations Background
Lecture 2 Recording (Aug 26, 66 minutes)
Quiz 2: Computing the Maximum Ritz Value
Lecture 3 Recording (Aug 31, part 1, 32 minutes)
Lecture 3 Recording (Aug 31, part 2, 15 minutes)
Quiz 3: Low Rank Approximation: Randomized SVD vs Krylov Subspace Methods
Lecture 4 Recording (Sep 2, part 1, 43 minutes)
Lecture 4 Recording (Sep 2, part 2, 15 minutes)
Lecture 5 Recording (Sep 9, 58 minutes)
Lecture 6 Recording (Sep 14, part 1, 12 minutes)
Lecture 6 Recording (Sep 14, part 2, 15 minutes)
Quiz 4: Gauss-Newton Method
Lecture 7 Recording (Sep 16, starts at 9:39, 60 minutes)
Basics of Tensor Computations
CS 598 EVS: Provably Efficient Algorithms for Numerical and Combinatorial Problems, Spring '20
CS 450: Numerical Analysis, Fall '18
- Matrix Computations, Gene Golub and Charles Van Loan, 4th Edition, The Johns Hopkins University Press, 2013.
- Tensor Decompositions and Applications, T.G. Kolda and B.W. Bader, SIAM Review, 2009.
- A Practical Introduction to Tensor Networks: Matrix Product States and Projected Entangled Pair States, R. Orus, Annals of Physics, 2014.
We will be using Python with libraries such as matplotlib for in-class work.
Python Workshop Material
Statement and Support Resources on Mental Health
Diminished mental health, including significant stress, mood changes, excessive worry, substance/alcohol abuse, or problems with eating and/or sleeping, can interfere with optimal academic performance, social development, and emotional wellbeing. The University of Illinois offers a variety of confidential services including individual and group counseling, crisis intervention, psychiatric services, and specialized screenings at no additional cost. If you or someone you know experiences any of the above mental health concerns, you are strongly encouraged to contact or visit any of the University's resources provided below. Getting help is a smart and courageous thing to do -- for yourself and for those who care about you.
Counseling Center: 217-333-3704, 610 East John Street, Champaign, IL 61820
McKinley Health Center: 217-333-2700, 1109 South Lincoln Avenue, Urbana, IL 61801