Probabilistic Graphical Models (CS 598 OOK) Spring 2021
Lectures: Tuesday and Thursday, 2:00pm-3:15pm, Zoom Meeting
Discussion: Piazza (sign up!)
Office hours: Tuesday, 3:15pm-4:15pm, Zoom Meeting
Probabilistic graphical models are efficient representations of joint distributions using graphs, with applications in machine learning, computer vision, natural language processing, and computational biology, among other fields. The course will cover the fundamentals of probabilistic graphical models, including techniques for inferring properties of the distribution given the graph structure and parameters, and for learning the graphical model from data. The course will also cover selected special topics such as the basics of causality, approximate inference, Bayesian deep neural networks, and structure learning for graphical models.
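As a small taste of the material, a two-node Bayesian network with an edge R → W factorizes the joint as P(R, W) = P(R) P(W | R), and simple inference queries follow by summing and renormalizing. The sketch below (variable names and numbers are illustrative, not from the course) shows this in Python with NumPy:

```python
import numpy as np

# A tiny Bayesian network: Rain -> WetGrass (illustrative example).
# The graph lets us store P(R, W) as P(R) * P(W | R)
# instead of a full joint table.

p_rain = np.array([0.8, 0.2])             # P(R): [no rain, rain]
p_wet_given_rain = np.array([[0.9, 0.1],  # P(W | R = no rain)
                             [0.2, 0.8]]) # P(W | R = rain)

# Reconstruct the joint via the factorization implied by the graph.
joint = p_rain[:, None] * p_wet_given_rain  # shape (2, 2): P(R, W)

# Inference: marginalize to get P(W), condition to get P(R | W=wet).
p_wet = joint.sum(axis=0)                 # P(W)
p_rain_given_wet = joint[:, 1] / p_wet[1] # P(R | W = wet)

print(joint.sum())         # the joint is a valid distribution (sums to 1)
print(p_rain_given_wet)    # posterior over Rain given wet grass
```

The same pattern, conditional probability tables attached to a graph plus sum/normalize operations, underlies the exact-inference algorithms (variable elimination, sum-product) covered in Module 2.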
Topics to be covered
- Bayesian Networks
- Conditional Random Fields (CRFs)
- Exact Inference
- Kalman Filtering
- Hidden Markov Models
- Variational Optimization View of Inference
- Structure Learning
- Causal inference
Students are expected to have taken courses in linear algebra, probability and statistics, and a basic course in the theory of computation and algorithms. Students are also expected to be familiar with the Python programming language.
There is no required textbook for the course, but Michael Jordan's draft of "An Introduction to Probabilistic Graphical Models" (available online) is highly recommended.
Optional books with relevant material:
- Steffen L. Lauritzen, Graphical Models, Oxford University Press, 1996
- Marc Mézard and Andrea Montanari, Information, Physics, and Computation, Oxford University Press, 2009
- M. Wainwright and M. Jordan, Graphical models, exponential families, and variational inference, Foundations and Trends in Machine Learning, 2008
- D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009
Lecture slides, course handouts, pointers to relevant papers, and other materials will be available as HTML and PDF documents on Relate and Piazza.
There will be 3-5 homework assignments (TBD). The homeworks will consist of a combination of machine problems and written exercises to be submitted on the course Relate website. All homeworks are due at midnight CT on the specified day. Submitted solutions to the machine problems must be written in Python 3, using only standard libraries (numpy / scipy); certain libraries may be disallowed for specific assignments. Solutions to the written problems must be submitted as LaTeX-typeset PDFs, with each question beginning on a new page. You may use WYSIWYG LaTeX editors such as LyX if you are unfamiliar with LaTeX.
The final projects will be completed in groups of up to 4 students, depending on class size. The project may consist of any combination of theoretical analysis, applications, and literature survey (possibly incorporating all three). The only constraint on the project is that it should include some aspect of probabilistic inference and/or learning. You should aim for a project suitable for submission to a machine learning conference (e.g., NeurIPS, ICML, KDD) or a domain-specific conference (e.g., CVPR, EMNLP, RECOMB).
Course details, homework, and grades will be managed using Relate. Please self-register on the course Relate page.
Online Discussion Platform
We will use Piazza as our online discussion platform. You are encouraged to use Piazza to ask the course staff and your classmates course-related questions (we will refer all email queries to Piazza). Please self-register at piazza.com/illinois/spring2021/cs598ook/home
- Module 1: Representation
- Basics, Bayesian Networks
- Bayesian Networks, Undirected Graphical Models
- Chordal (Decomposable) Graphs, Conditional Random Fields (CRFs)
- Chains, Trees, Factorial Graphs, Applications
- Markov Properties
- Module 2: Exact Inference
- Variable Elimination, Sum Product
- Multivariate Gaussian Distribution, Kalman Filtering, Smoothing
- Hidden Markov Models, Forward Backward Algorithm
- Junction Trees
- Module 3: Approximate Inference
- Exponential Families, Variational Optimization View of Inference
- Message Passing revisited, Mean Field
- Variational Methods
- Sampling Methods (Tentative)
- Maximum A Posteriori (MAP)
- Discriminative Inference
- Mixture Models (Tentative)
- Module 4: Learning
- Sparse Estimation Basics
- Convex Optimization Based Methods
- Greedy Methods
Feel free to discuss the assignments with each other in general terms, and to search the Web for general guidance (not for complete solutions). All solutions should be written up individually. If you make substantial use of information from outside sources, make sure you acknowledge those sources in your solution. In particular, make sure you acknowledge all other students you worked with on the homework/projects. Failure to do so will result in a zero grade. We will follow the departmental honor code policy here: https://cs.illinois.edu/academics/honor-code
Grading: Participation 10%, Homework 40%, Project 50%.
Project grading: 5% for the proposal, 30% for the manuscript, and 15% for the presentation.
Participation is primarily evaluated during the project presentation portion of the course.
To obtain disability-related academic adjustments and/or auxiliary aids, students with disabilities must contact the course instructor and Disability Resources and Educational Services (DRES) as soon as possible. To contact DRES, you may visit 1207 S. Oak St., Champaign, call 333-4603, e-mail firstname.lastname@example.org, or go to https://www.disability.illinois.edu. If you are concerned that you have a disability-related condition that is impacting your academic progress, there are academic screening appointments available that can help diagnose a previously undiagnosed disability. You may access these by visiting the DRES website and selecting "Request an Academic Screening" at the bottom of the page.
We do not accept late homework. The lowest homework grade will be dropped.