Inference and Representation (DS-GA-1005, CSCI-GA.2569)

Course staff:

|            | Name        | E-mail             |
|------------|-------------|--------------------|
| Instructor | Joan Bruna  | bruna@cims.nyu.edu |
| TA         | Vlad Kobzar | vk283@nyu.edu      |

Syllabus

This graduate-level course presents fundamental tools of probabilistic graphical models, with an emphasis on designing and manipulating generative models and on performing inference when they are applied to various types of data.

We will study latent graphical models (latent Dirichlet allocation, Gaussian processes), state-space models (Kalman filter, HMMs), Gibbs models, and deep generative models (variational autoencoders, GANs), covering both the methods (inference, sampling algorithms, learning, exponential families) and modeling applications to text, image, and physics data.
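
To give a concrete flavor of the methods listed above, here is a minimal illustrative sketch (not course-provided code; the grid size and inverse temperature are arbitrary choices) of Gibbs sampling on a small 2-D Ising model:

```python
# Illustrative sketch only: Gibbs sampling for a 2-D Ising model
# p(s) ∝ exp(beta * sum over neighboring pairs of s_i * s_j), spins in {-1, +1}.
import numpy as np

def gibbs_ising(n=16, beta=0.6, sweeps=200, seed=0):
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=(n, n))
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the four neighboring spins (periodic boundary).
                nb = (s[(i - 1) % n, j] + s[(i + 1) % n, j]
                      + s[i, (j - 1) % n] + s[i, (j + 1) % n])
                # Conditional p(s_ij = +1 | rest) is logistic in 2*beta*nb.
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * nb))
                s[i, j] = 1 if rng.random() < p_plus else -1
    return s

print("mean magnetization:", gibbs_ising().mean())
```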

Lecture Location

Monday, 6:20-8:00pm, in 60 FA 110

Recitation/Laboratory (required for all students)

Mondays, 8:10-9:00pm in 60 FA 110

Office hours

JB: Monday, 9:00-11:00am. Location: 60 5th ave, 6th floor, room 612.

VK: Wednesday, 9:30-10:30am. Location: 60 5th ave, 7th floor, room 737.

Grading

Problem sets (45%) + midterm exam (25%) + final project (25%) + participation (5%).
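
As a purely hypothetical illustration of the weighting: scores of 80 on problem sets, 90 on the midterm, 85 on the final project, and 100 on participation combine to 0.45 × 80 + 0.25 × 90 + 0.25 × 85 + 0.05 × 100 = 84.75 overall.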

Piazza

We will use Piazza to answer questions and post announcements about the course. Students' use of Piazza, in particular well-reasoned answers to other students' questions, will count toward the participation grade.

Online recordings

Most of the lecture videos will be posted to NYU Classes. Note, however, that class attendance is required.

Schedule

| Week | Lecture Date | Topic | Reference | Deliverables |
|------|--------------|-------|-----------|--------------|
| 2 | 9/11 | Lec1 Introduction and Logistics. Inference Examples. Bayesian Networks. Slides | Murphy Chapter 1 (optional; review for most)<br>Notes on Bayesian networks (Sec. 2.1)<br>Algorithm for d-separation (optional) | PS1, due 9/18 |
| 3 | 9/18 | Lec2 Undirected Graphical Models. Markov Random Fields. Ising Model. Applications to Statistical Physics. Slides | Notes on MRFs (Sec. 2.2-2.4)<br>Notes on exponential families<br>Notes on the Hammersley-Clifford Theorem | PS2 [data], due 9/25 |
| 4 | 9/25 | Guest Lecture Kyle Cranmer (NYU): Likelihood-free Inference. Slides | | |
| 5 | 10/2 | Lec4 Belief Propagation. Gibbs Sampling. Slides | Barber 27.1-27.3.1<br>Murphy Sec. 24.1-24.2.4<br>Introduction to Probabilistic Topic Models<br>Explore topic models of: politics over time, state-of-the-union addresses, Wikipedia | PS3, due 10/17<br>Project proposal, due 10/23 |
| 7 | 10/16 | Lec5 PCA. ICA. Applications to Survey Data. Slides | Elements of Statistical Learning, Ch. 14<br>Finding Structure in Randomness (…), Halko, Martinsson, Tropp | PS4, due 10/25 |
| 8 | 10/23 | Lec6 Clustering. EM. Markov Chain Monte Carlo (MCMC). Slides | MIT Lecture 18 Notes<br>Elements of Statistical Learning 14.5 and 8.5<br>Hamiltonian Monte Carlo (optional) | |
| 9 | 10/30 | Midterm Exam | | |
| 10 | 11/6 | Lec7 Variational Inference. Revisiting EM. Mean Field. Slides | Graphical Models, Exponential Families and Variational Inference, Chapter 3<br>Variational Inference with Stochastic Search | |
| 11 | 11/13 | Lec8 Variational Inference (cont'd). Variational Autoencoders. Slides | Variational Inference: A Review for Statisticians, by Blei, Kucukelbir, McAuliffe<br>Auto-Encoding Variational Bayes (Kingma, Welling) | PS5, due 11/22 |
| 12 | 11/20 | Lec9 Structured Output Prediction. Conditional Random Fields (CRF), Hidden Markov Models, Moment Matching. Slides | Murphy, Secs. 19.5 & 19.6<br>Notes on pseudo-likelihood<br>An Introduction to Conditional Random Fields (Section 4; optional)<br>Approximate Maximum Entropy Learning in MRFs (optional)<br>Tree-Reweighted BP Algorithms and Approximate ML Estimation by Pseudo-Moment Matching<br>An Asymptotic Analysis of Generative, Discriminative and Pseudolikelihood Estimators, Liang and Jordan | |
| 14 | 11/27 | Lec10 Structured Output Prediction (cont'd). Slides | Fixing Max-Product: Convergent Message Passing Algorithms for MAP LP Relaxations, Globerson and Jaakkola | |
| 15 | 12/4 | Guest Lecture TBA | | PS6, due 12/8 |
| 16 | 12/11 | Lec11 Modeling Images and High-Dimensional Data. Deep Autoregressive Models. Normalizing Flows. Generative Adversarial Networks. Slides | References in slides | |
| | 12/12 | Lec12 Further Applications of GANs. Open Problems. Slides | References in slides | Project writeup, due 12/19 |
| 17 | 12/18 | Final Day: Poster Presentations of Final Projects.<br>Location: Center for Data Science, 60 5th Ave, 7th floor open space | | |

Bibliography

There is no required book. Assigned readings will come from freely available online material.

Core Materials

Background on Probability and Optimization

Further Reading

Academic Honesty

We expect you to try solving each problem set on your own. However, if you get stuck on a problem, we encourage you to collaborate with other students in the class, subject to the following rules:

Late submission policy

During the semester you are allowed at most two extensions on the homework assignments. Each extension is for at most 48 hours and carries a penalty of 25% off your assignment grade.