
What | Where |
---|---|
Time/place | TR 2:00pm-3:15pm, 0216 Siebel / Catalog |
Class URL | https://relate.cs.illinois.edu/course/cs598ook-sp20 |
Recorded Lecture | https://echo360.com/, Instructions |
Transcribed Lecture | https://classtranscribe.illinois.edu |
Web forum | Piazza (sign up!) |
Calendar | View » |
Schedule | View » |

Sanmi Koyejo

Office: 3314 SC

Office hours: Tues 3:30-4:30pm; or by appointment.

E-mail: sanmi

name | contact (illinois.edu) | office hours | where |
---|---|---|---|
Sahand Mozaffari | sahandm2 | Fridays 12:00pm-2:00pm | On the whiteboard by SC 3407 (tentative) |

Probabilistic graphical models are efficient representations of joint distributions using graphs, with a range of applications to machine learning, computer vision, natural language processing, and computational biology, among other fields. The course will cover the fundamentals of probabilistic graphical models, including techniques for inferring properties of the distribution given the graph structure and parameters, and for learning the graphical model from data. The course will also cover selected special topics, such as the basics of causality, approximate inference, Bayesian deep neural networks, and structure learning for graphical models.

- Bayesian Networks
- Conditional Random Fields (CRFs)
- Exact Inference
- Kalman Filtering
- Hidden Markov Models
- Variational Optimization View of Inference
- Structure Learning
- Causal inference
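As a small taste of the representation and inference ideas above, the sketch below builds a three-node chain Bayesian network and computes a marginal by variable elimination. All probability tables here are made up for illustration; they are not course material.

```python
import numpy as np

# Hypothetical chain Bayesian network A -> B -> C over binary variables.
# The joint factorizes according to the graph:
#   P(A, B, C) = P(A) * P(B | A) * P(C | B)
p_a = np.array([0.6, 0.4])              # P(A)
p_b_given_a = np.array([[0.7, 0.3],     # P(B | A=0)
                        [0.2, 0.8]])    # P(B | A=1)
p_c_given_b = np.array([[0.9, 0.1],     # P(C | B=0)
                        [0.5, 0.5]])    # P(C | B=1)

# Variable elimination: sum out A first, then B, to get the marginal P(C).
p_b = p_a @ p_b_given_a                 # P(B), shape (2,)
p_c = p_b @ p_c_given_b                 # P(C), shape (2,)

# Sanity check against brute-force summation of the full joint table.
joint = p_a[:, None, None] * p_b_given_a[:, :, None] * p_c_given_b[None, :, :]
assert np.allclose(joint.sum(axis=(0, 1)), p_c)
```

Note that elimination touches only 2x2 tables, while the brute-force check materializes the full 2x2x2 joint; this gap is exactly what makes exact inference tractable on chains and trees.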

Students are expected to have taken classes in linear algebra and in probability and statistics, as well as a basic class in the theory of computation and algorithms. Students are also expected to be familiar with the Python programming language.

There is no required textbook for the course, but Michael Jordan's draft of "An Introduction to Probabilistic Graphical Models" (available online) is highly recommended.

- Steffen L. Lauritzen, Graphical Models, Oxford University Press, 1996
- Marc Mézard and Andrea Montanari, Information, Physics, and Computation, Oxford University Press, 2009
- M. Wainwright and M. Jordan, Graphical models, exponential families, and variational inference, Foundations and Trends in Machine Learning, 2008
- D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009

Lecture slides, course handouts, pointers to relevant papers, and other materials will be available as HTML and PDF documents on Relate and Piazza.

There will be 3-5 homework assignments (TBD). The homeworks will consist of a combination of machine problems and written exercises, to be submitted on the course Relate website. All homework is due at midnight CT on the specified day.
* Submitted solutions to the machine problems must be written in Python 3, using only standard scientific libraries (numpy/scipy). We will sometimes block specific libraries as required.
* Solutions to the written problems must be submitted as LaTeX-typeset PDFs. Each question must begin on a new page. You may use WYSIWYG LaTeX editors such as LyX if you are unfamiliar with LaTeX.

The final projects will be completed in groups of up to 4 students, depending on class size. The project may consist of any combination of theoretical analysis, applications, and literature survey (possibly incorporating all three). The only constraint on the project is that it should include some aspect of probabilistic inference and/or learning. You should aim for a project suitable for submission to a machine learning conference (e.g., NIPS, ICML, KDD) or a domain-specific conference (e.g., CVPR, EMNLP, RECOMB).

Course details, homework, and grades will be managed using Relate. Please self-register on Relate.

We will use Piazza as our online discussion platform. You are encouraged to use Piazza to ask the course staff and your classmates course-related questions (we will refer all email queries to Piazza). Please self-register at piazza.com/illinois/spring2020/cs598ook/home

**Module 1: Representation**

- Basics, Bayesian Networks
- Bayesian Networks, Undirected Graphical Models
- Chordal (Decomposable) Graphs, Conditional Random Fields (CRFs)
- Chains, Trees, Factorial Graphs, Applications
- Markov Properties

**Module 2: Exact Inference**

- Variable Elimination, Sum Product
- Multivariate Gaussian Distribution, Kalman Filtering, Smoothing
- Hidden Markov Models, Forward Backward Algorithm
- Junction Trees
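To make the HMM material concrete, here is a minimal sketch of the forward algorithm computing an observation-sequence likelihood. The 2-state model parameters below are invented for illustration only.

```python
import numpy as np

# Hypothetical 2-state HMM with 2 observation symbols.
pi = np.array([0.5, 0.5])      # initial state distribution P(z_1)
A = np.array([[0.9, 0.1],      # A[i, j] = P(z_t = j | z_{t-1} = i)
              [0.2, 0.8]])
B = np.array([[0.8, 0.2],      # B[i, k] = P(x_t = k | z_t = i)
              [0.3, 0.7]])

def forward(obs):
    """Forward algorithm: returns P(x_1, ..., x_T) by recursively
    propagating the message alpha_t(i) = P(x_1..x_t, z_t = i)."""
    alpha = pi * B[:, obs[0]]
    for x in obs[1:]:
        alpha = (alpha @ A) * B[:, x]
    return alpha.sum()

likelihood = forward([0, 1, 0])
```

This is the sum-product algorithm specialized to a chain: each step costs O(K^2) for K states, versus O(K^T) for naive enumeration over all state paths.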

**Module 3: Approximate Inference**

- Exponential Families, Variational Optimization View of Inference
- Message Passing Revisited, Mean Field
- Variational Methods
- Sampling Methods (Tentative)
- Maximum A Posteriori (MAP)
- Discriminative Inference
- Mixture Models (Tentative)
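As a preview of the mean-field idea, the sketch below runs naive mean-field coordinate updates on a tiny pairwise binary MRF and compares the result to the exact marginal obtained by enumeration. The potentials are made up for illustration; this is a sketch, not a course-provided implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical pairwise binary MRF with x_i in {0, 1}:
#   p(x1, x2) proportional to exp(t1*x1 + t2*x2 + w*x1*x2)
t1, t2, w = 0.5, -0.3, 1.0

# Naive mean field approximates p with a fully factored q(x1) q(x2).
# Coordinate-ascent updates on the means: mu_i = sigmoid(t_i + w * mu_j).
mu1, mu2 = 0.5, 0.5
for _ in range(100):
    mu1 = sigmoid(t1 + w * mu2)
    mu2 = sigmoid(t2 + w * mu1)

# Exact marginal P(x1 = 1) by enumerating the 4 joint configurations.
states = [(a, b) for a in (0, 1) for b in (0, 1)]
weights = np.array([np.exp(t1 * a + t2 * b + w * a * b) for a, b in states])
probs = weights / weights.sum()
exact_mu1 = sum(p for (a, _), p in zip(states, probs) if a == 1)
```

On this two-node toy model the fixed point lands close to the exact marginal; on larger loopy graphs, where enumeration is intractable, the same coordinate updates are the workhorse of variational inference.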

**Module 4: Learning**

- Sparse Estimation Basics
- Convex Optimization Based Methods
- Greedy Methods

Feel free to discuss the assignments with each other in general terms, and to search the web for general guidance (but not for complete solutions). All solutions must be written up individually. If you make substantial use of information from outside sources, acknowledge those sources in your solution. In particular, acknowledge all other students you worked with on the homework/projects. Failure to do so will result in a zero grade. We will follow the departmental honor code policy: https://cs.illinois.edu/academics/honor-code

Participation 10%, Homework 40%, Project 50%. Project grading: 5% for the proposal, 30% for the manuscript, and 15% for the presentation. Participation is primarily evaluated during the project presentation portion of the course.

**WE DO NOT ACCEPT LATE HOMEWORK**. The lowest homework grade will be dropped.