MathsDL-spring18

Topics course: Mathematics of Deep Learning, NYU, Spring 2018 (CSCI-GA 3033).

Logistics

Instructors

Lecture Instructor: Joan Bruna (bruna@cims.nyu.edu)

Tutor (Parallel Curricula): Cinjon Resnick (cinjon@nyu.edu)

Syllabus

This graduate-level topics course aims to offer a glimpse into the emerging mathematical questions around Deep Learning. In particular, we will focus on the different geometrical aspects surrounding these models, from input geometric stability priors to the geometry of optimization, generalization and learning. We will cover both the background and the current open problems.

Besides the lectures, we will also run an optional parallel curriculum which, starting from a landmark recent DL paper (AlphaGo), traces back the fundamentals of Dynamic Programming, Policy Learning and Monte-Carlo Tree Search through the literature and lab materials.

Detailed Syllabus

Pre-requisites

Multivariate Calculus, Linear Algebra, Probability and Statistics at a solid undergraduate level.

Notions of Harmonic Analysis, Differential Geometry and Stochastic Calculus are nice-to-have, but not essential.

Grading

The course will be graded with a final project, consisting of an in-depth survey of a topic related to the syllabus, plus a participation grade. A detailed abstract of the project will be graded at the mid-term.

The final project is due May 1st, by email to the instructors.

Lectures

Week Date Lecture Topic Slides References
1 1/23 Lec1 Introduction: The Curse of Dimensionality in ML Slides References
2 1/30 Lec2 Euclidean Geometric Stability. Slides References
3 2/6 Guest Lecture: Leon Bottou (Facebook/NYU) Slides References
4 2/13 Lec3 Scattering Transforms and CNNs Slides References
5 2/20 Lec4 Non-Euclidean Geometric Stability. Gromov-Hausdorff distances. Graph Neural Nets Slides References
6 2/27 Lec5 Graph Neural Network Applications Slides References
7 3/6 Lec6 Unsupervised Learning under Geometric Priors. Implicit vs Explicit models. Optimal Transport models. Microcanonical Models. Open Problems Slides References
8 3/13 Spring Break References
9 3/20 Lec7 Discrete vs Continuous Time Optimization. The Convex Case. Slides References
10 3/27 Lec8 Discrete vs Continuous Time Optimization. The Stochastic and Non-convex Case. Slides References
11 4/3 Lec9 Gradient Descent on Non-convex Optimization. Slides References
12 4/10 Lec10 Gradient Descent on Non-convex Optimization. Escaping Saddle Points efficiently. Slides References
13 4/17 Lec11 Landscape of Deep Learning Optimization. Spin Glasses, Kac-Rice, RKHS, Topology. Slides References
14 4/24 Lec12 Guest Lecture: Behnam Neyshabur (IAS/NYU): Generalization in Deep Learning Slides References
15 5/1 Lec13 Landscape of Deep Learning Optimization. Positive and Negative results. Open Problems. Slides References

Lab sessions / Parallel Curricula

DeepStack living document: https://goo.gl/zzMzoz

AlphaGoZero living document: https://goo.gl/iFZ4XD