Fast Algorithms and Integral Equation Methods (CS 598APK) Fall 2017
Time/place: Wed/Fri 2:00pm-3:15pm, 1109 Siebel (Catalog)
Class URL: https://bit.ly/fastalgf17
Class recordings: Echo 360
Piazza: Discuss »
Calendar: View »
Class Projects
Materials, reports, and presentations from the projects that students completed as part of this course are now available.
Homework
 Homework set 1
 Homework set 2
 Homework set 3
 Project Proposal
 Homework set 4
 Project Presentation Signup (use @illinois.edu to sign in)
 Homework set 5
 Project Material Submission
Why you should take this class
Many of the algorithms of introductory scientific computing have superlinear runtime scaling. Gaussian elimination and LU decomposition are good examples: their runtime scales as $O(n^3)$ with the number of unknowns $n$. Even simple matrix-vector multiplication exhibits quadratic scaling. Problems in scientific computing, especially those arising from questions in science and engineering, often demand large-scale computation in order to achieve acceptable fidelity, and such computations will not tolerate superlinear, let alone quadratic or cubic, scaling.
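To make the scaling concrete, here is a small sketch (not part of the course materials; the helper name `potential_matvec` is made up) that evaluates all pairwise 1/r potentials between n sources and n targets the naive way and counts the operations performed:

```python
import numpy as np

def potential_matvec(sources, targets, charges):
    """Naively evaluate all pairwise 1/r potentials: O(n*m) work."""
    ops = 0
    pot = np.zeros(len(targets))
    for i, t in enumerate(targets):
        for s, q in zip(sources, charges):
            pot[i] += q / abs(t - s)
            ops += 1
    return pot, ops

rng = np.random.default_rng(5)
for n in [100, 200, 400]:
    src = rng.uniform(0, 1, n)   # sources in [0, 1]
    tgt = rng.uniform(2, 3, n)   # targets in [2, 3], away from sources
    _, ops = potential_matvec(src, tgt, np.ones(n))
    print(n, ops)                # ops quadruples each time n doubles
```

Doubling n quadruples the work, which is exactly the quadratic scaling that the techniques of this class aim to avoid.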
This class will teach you a set of techniques and ideas that successfully reduce this asymptotic cost for an important set of operations, which is why they are called "fast algorithms". We will begin by examining some of these ideas from a linear-algebraic perspective, where large cost gains can be achieved for matrices with special structure. We will then specialize to PDE boundary value problems, which give rise to many of the largest-scale computations. We will see that integral equations are the natural generalization of the linear-algebraic tools encountered earlier, and we will study the mathematical and algorithmic foundations that make them powerful tools for computation. Throughout, we will pay close attention to the idea of representation, i.e. the choice of what the numerical unknowns of the system to be solved should be.
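As a taste of the "special structure" mentioned above, the following sketch (the 1/r kernel and the separation intervals are arbitrary choices for illustration) builds a potential evaluation matrix for well-separated sources and targets and observes that its numerical rank is far below its size:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
sources = rng.uniform(0, 1, n)   # sources in [0, 1]
targets = rng.uniform(3, 4, n)   # well-separated targets in [3, 4]

# Dense interaction matrix for the 1/r kernel.
A = 1.0 / np.abs(targets[:, None] - sources[None, :])

sigma = np.linalg.svd(A, compute_uv=False)
numrank = np.sum(sigma / sigma[0] > 1e-10)
print(numrank)   # far smaller than n = 200
```

Exploiting this low numerical rank is what lets fast algorithms replace an O(n^2) interaction with far less work.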
Instructor
Course Outline
I will insert links to class material, books, and papers into this tree as time goes on.
Note: the section headings in this tree are clickable to reveal more detail.
 Introduction

Dense Matrices and Computation
 Notes
 Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions by Halko/Martinsson/Tropp
 Sources and Targets
 Point Interactions and Matrix Entries
 Rank and Complexity
 Numerical Rank
 Representation as Right Preconditioning
 Demo: Conditioning of Derivative Matrices
 Demo: Floating point vs Finite Differences
 Demo: Rank of a Potential Evaluation Matrix
 Code: m_echelon.py

Tools for Low-Rank Matrices
 Notes
 Randomized Low-Rank Approximation
 Randomized SVD
 Finding an Orthogonal Basis for the Range of a Matrix
 Rank-Revealing QR
 Interpolative Decomposition
 Demo: Interpolative Decomposition
 Demo: Randomized SVD
 Demo: Rank-Revealing QR
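As a preview of this section, here is a minimal sketch of a randomized SVD in the spirit of the Halko/Martinsson/Tropp paper linked above (function name, oversampling parameter, and test matrix are my own illustrative choices):

```python
import numpy as np

def randomized_svd(A, k, p=10, rng=np.random.default_rng(0)):
    """Sketch of a randomized SVD: sample the range with a Gaussian
    test matrix, orthogonalize, then take an exact SVD of the small
    projected matrix. p is an oversampling parameter."""
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)            # orthonormal range basis
    B = Q.T @ A                               # small (k+p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k]

# Try it on a matrix of exact rank 15.
rng = np.random.default_rng(1)
A = rng.standard_normal((300, 15)) @ rng.standard_normal((15, 200))
U, s, Vt = randomized_svd(A, k=15)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(err)   # tiny: the sampled basis captures the range exactly
```

The key design point is that A is touched only through matrix-matrix products, which is what makes the method attractive for large or implicitly defined matrices.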

Rank and Smoothness
 Notes
 Taylor Expansions
 Special-Purpose Expansions
 Validity of Low-Rank Representations
 Demo: Checking Rank Estimates
 Demo: Multipole and Local Expansions
 Demo: Skeletonization using Proxies
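The connection between smoothness and rank is visible even in one dimension. The sketch below (illustrative only; the kernel 1/(t-s) and the point locations are arbitrary choices) forms a p-term local Taylor expansion about a center and watches the error decay geometrically with the separation ratio:

```python
import numpy as np

def local_expansion(s, c, t, p):
    """p-term local (Taylor) expansion of 1/(t - s) about center c.
    Converges when |t - c| < |c - s|: target near center, source far."""
    d, h = c - s, t - c
    return sum((-1)**k * h**k / d**(k + 1) for k in range(p))

s, c, t = 0.0, 3.0, 3.4          # source far from the expansion center
exact = 1.0 / (t - s)
for p in [2, 5, 10, 20]:
    err = abs(local_expansion(s, c, t, p) - exact)
    print(p, err)                # shrinks roughly like (|t-c|/|c-s|)^p
```

A handful of terms suffices once source and target are well separated, which is why far-field interaction blocks have low numerical rank.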

Near and Far: Separating out High-Rank Interactions
 Notes
 Ewald Summation
 Quadtrees and Octrees
 Tree Codes
 Fast Multipole Methods
 Solving the Linear System: Iterative and Direct Methods
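A minimal illustration of the quadtrees underlying tree codes and fast multipole methods (purely a sketch; the dict-based node representation and the max_pts parameter are my own choices, not from class material):

```python
import numpy as np

def build_quadtree(points, center, half, max_pts=8):
    """Recursively subdivide a square box (given by center and
    half-width) until each leaf holds at most max_pts points."""
    node = {"center": center, "half": half, "points": points, "children": []}
    if len(points) <= max_pts:
        return node
    for dx in (-0.5, 0.5):
        for dy in (-0.5, 0.5):
            cc = (center[0] + dx * half, center[1] + dy * half)
            mask = (np.abs(points[:, 0] - cc[0]) <= half / 2) & \
                   (np.abs(points[:, 1] - cc[1]) <= half / 2)
            if mask.any():
                node["children"].append(
                    build_quadtree(points[mask], cc, half / 2, max_pts))
    node["points"] = None   # interior nodes hold no points directly
    return node

rng = np.random.default_rng(2)
pts = rng.uniform(0, 1, (500, 2))
tree = build_quadtree(pts, center=(0.5, 0.5), half=0.5)

def count_leaf_points(node):
    if not node["children"]:
        return len(node["points"])
    return sum(count_leaf_points(ch) for ch in node["children"])

print(count_leaf_points(tree))   # every point lands in some leaf
```

Tree codes and FMMs traverse such a tree to decide which box pairs are "far" enough to be handled by low-rank expansions and which must be computed directly.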

Outlook: Building a Fast PDE Solver
 Notes
 Prototypical elliptic PDEs
 Fundamental Solutions
 Layer Potentials
 Integral Equations
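As a small numerical check of the ideas in this section, the sketch below evaluates the free-space fundamental solution of the 2D Laplacian and verifies, via a 5-point finite-difference stencil at an arbitrarily chosen point, that it is harmonic away from its source:

```python
import numpy as np

def G(x, y):
    """Free-space fundamental solution of the 2D Laplacian."""
    return -np.log(np.hypot(x, y)) / (2 * np.pi)

# 5-point finite-difference Laplacian at a point away from the origin.
x0, y0, h = 1.3, 0.7, 1e-4
lap = (G(x0 + h, y0) + G(x0 - h, y0) + G(x0, y0 + h) + G(x0, y0 - h)
       - 4 * G(x0, y0)) / h**2
print(abs(lap))   # near zero: G is harmonic away from its source
```

Superposing such fundamental solutions along a boundary is precisely what layer potentials do, which is why they automatically satisfy the PDE away from the boundary.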

Going Infinite: Integral Operators and Functional Analysis
 Notes
 Operators
 Boundedness
 Compactness
 Riesz and Fredholm
 Spectral Theory
 Kress: Linear Integral Equations (UIUC Ebook available)

Singular Integrals and Potential Theory
 Notes
 Green's Formula
 Jump Conditions
 Exterior Domains

Boundary Value Problems
 Notes
 Green's Formula
 Jump Conditions
 Exterior Domains
 Uniqueness

Back from Infinity: Discretization
 Notes
 Nyström and Galerkin
 Unstructured and High Order: Meshes and Elements
 Polynomial Spaces and Mappings
 Demo: 2D Interpolation Nodes
 Demo: Choice of Nodes for Polynomial Interpolation
 Demo: Vandermonde conditioning
 Demo: Visualizing the 2D PKDO Basis
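To illustrate Nyström discretization, here is a sketch that solves a manufactured second-kind integral equation on a periodic interval with the trapezoidal rule (the kernel and right-hand side are invented for the example, chosen by hand so that the exact solution is cos t):

```python
import numpy as np

n = 40
t = 2 * np.pi * np.arange(n) / n           # periodic trapezoid nodes
w = 2 * np.pi / n                          # uniform quadrature weights

# Smooth, periodic kernel for the illustration.
K = np.cos(t[:, None] - t[None, :]) / (2 * np.pi)

# With sigma(t) = cos t, one checks (K sigma)(t) = cos(t)/2,
# so the equation (I - K) sigma = f has f(t) = cos(t)/2.
f = np.cos(t) / 2

A = np.eye(n) - w * K                      # Nystrom system matrix
sigma = np.linalg.solve(A, f)
err = np.max(np.abs(sigma - np.cos(t)))
print(err)   # near machine precision for this smooth problem
```

The pattern—replace the integral by a quadrature rule, then collocate at the quadrature nodes—is the essence of the Nyström method; all the sophistication lies in choosing quadrature that handles singular kernels.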

Computing Integrals: Approaches to Quadrature
 Notes
 Families of Quadrature Methods
 Quadrature for Fourier Modes
 Quadrature by Expansion (QBX)
 Demo: Kussmaul-Martensen quadrature
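A quick illustration of why uniform-grid quadrature works so well for Fourier modes: the periodic trapezoidal rule converges super-algebraically for smooth periodic integrands (the integrand and node counts below are arbitrary choices for the sketch):

```python
import numpy as np

# Smooth periodic integrand; its exact integral over one period is
# 2*pi*I_0(1), with I_0 the modified Bessel function (np.i0).
f = lambda x: np.exp(np.cos(x))
exact = 2 * np.pi * np.i0(1.0)

for n in [4, 8, 16, 32]:
    x = 2 * np.pi * np.arange(n) / n
    approx = (2 * np.pi / n) * np.sum(f(x))
    print(n, abs(approx - exact))  # error drops super-algebraically
```

This spectral accuracy on periodic data is the starting point for the singular quadratures in this section, which are built to preserve it in the presence of kernel singularities.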

More PDEs
 Notes
 Helmholtz
 Maxwell's Equations
 Boundary and Volume Potentials
CAUTION!
These scribbled PDFs are an unedited reflection of what we wrote during class. They need to be viewed in the context of the class discussion that led to them. See the lecture videos for that.
If you would like actual, self-contained class notes, look in the outline above.
These scribbles are provided here as a record of our class discussion, to be used in perhaps the following ways:
 as a way to cross-check your own notes
 to look up a formula that you know was shown in a certain class
 to remind yourself of what exactly was covered on a given day
By continuing to read them, you acknowledge that these files are provided as supplementary material on an as-is basis.
 scribbles20170830.pdf
 scribbles20170901.pdf
 scribbles20170906.pdf
 scribbles20170908.pdf
 scribbles20170913.pdf
 scribbles20170915.pdf
 scribbles20170920.pdf
 scribbles20170922.pdf
 scribbles20170925.pdf
 scribbles20170929.pdf
 scribbles20171004.pdf
 scribbles20171006.pdf
 scribbles20171011.pdf
 scribbles20171013.pdf
 scribbles20171018.pdf
 scribbles20171020.pdf
 scribbles20171025.pdf
 scribbles20171027.pdf
 scribbles20171101.pdf
 scribbles20171103.pdf
 scribbles20171108.pdf
 scribbles20171110.pdf
 scribbles20171115.pdf
 scribbles20171117.pdf
Computing
We will be using Python with the libraries numpy, scipy, and matplotlib for assignments. No other languages are permitted. Python has a very gentle learning curve, so you should quickly feel at home even if you have never worked in Python before.
Virtual Machine Image
While you are free to install Python and Numpy on your own computer to do homework, the only supported way to do so is using the supplied virtual machine image.
Books and Papers

For randomized linear algebra: Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions by Halko/Martinsson/Tropp

For FMMs:

A fast algorithm for particle simulations by Greengard and Rokhlin.

A Fast Adaptive Multipole Algorithm for Particle Simulations by Carrier, Greengard, and Rokhlin


For numerical linear algebra:

Golub and Van Loan is the definitive work, with an emphasis on being a comprehensive reference

Trefethen and Bau is less comprehensive but more approachable

Mike Heath's scientific computing book also covers almost everything you actually need to know about numerical linear algebra


For the functional analysis basics we need as well as the integral equation material, the best match for the course is Kress's book on linear integral equations. The references in the notes are for the second edition, linked above. A third edition is also available.
The book can be downloaded as an ebook through the UIUC library's subscription to SpringerLink.
Previous editions of this class
Related Classes Elsewhere
 Alex Barnett (Dartmouth)
 Shravan Veerapaneni (Michigan)
 Leslie Greengard (NYU)
 Gunnar Martinsson (CU Boulder, Dartmouth)
 Jianlin Xia (Purdue)
 Francesco Andriulli (ENS TELECOM Bretagne)
 Mark Tygert
Python Help
 Python tutorial
 Facts and myths about Python names and values
 Dive into Python 3
 Learn Python the hard way
 Project Euler (Lots of practice problems)
Numpy Help
 Introduction to Python for Science
 The SciPy lectures
 The Numpy MedKit by Stéfan van der Walt
 The Numpy User Guide by Travis Oliphant
 Numpy/Scipy documentation
 More in this reddit thread
 Spyder (a Python IDE, like Matlab) is installed in the virtual machine. (Applications Menu > Development > Spyder)
 An introduction to Numpy and SciPy
 100 Numpy exercises