Previous Sessions

Princeton Machine Learning Theory Summer School

June 26 - June 30, 2023

About

Welcome to the website for the Princeton Machine Learning Theory Summer School. The school will run in person June 26 - June 30 at Princeton and is aimed at PhD students interested in machine learning theory. The primary goal is to showcase, through four main courses, a range of exciting recent developments in the subject. The primary focus this year is on theoretical advances in deep learning. An important secondary goal is to connect young researchers and foster community within theoretical machine learning.

Courses

Principal courses, each consisting of four to five hours of lecture, are:

  1. Statistical mechanics of deep learning dynamics
    Instructor: Cengiz Pehlevan (Harvard)
  2. Wasserstein gradient flows for sampling and estimation
    Instructor: Philippe Rigollet (MIT)
  3. Exact analysis of deep learning in high dimensions
    Instructors: Jeffrey Pennington (Google) and Ben Adlam (Google)
  4. Regimes of training in DNNs: Neural Tangent Kernel, Feature Learning, and Sparsity
    Instructor: Arthur Jacot (NYU)
  5. Dynamical mean field theory and the replica method
    Instructor: Francesca Mignacco (Princeton)

Organizers & Sponsors

This summer school is organized by Boris Hanin (Princeton ORFE). Support was provided by the NSF via NSF CAREER Grant DMS-2143754, the Department of Operations Research and Financial Engineering (ORFE) at Princeton, the Center for Statistics and Machine Learning (CSML) at Princeton, the Princeton School of Engineering and Applied Sciences (SEAS), and the Program on Applied and Computational Mathematics (PACM) at Princeton.

Princeton Machine Learning Theory Summer School

June 13 - June 17, 2022

About

Welcome to the website for the 2022 Princeton Machine Learning Theory Summer School. The school will run in person June 13 to June 17, 2022 and is aimed at PhD students interested in machine learning theory. The primary goal is to showcase, through four main courses, a range of exciting recent developments in the subject. The primary focus this year is on theoretical advances in deep learning. An important secondary goal is to connect young researchers and foster a closer community within theoretical machine learning.

Courses

There will be four principal courses, each consisting of four to five hours of lecture. The courses are:

  1. Implicit Complexity Control in Deep and Underdetermined Models
    Instructor: Nati Srebro (TTIC)
  2. Graph Neural Networks and Equivariant Machine Learning
    Instructor: Soledad Villar (JHU)
  3. The Law of Robustness, a Story of Small and Large Neural Networks
    Instructor: Sebastien Bubeck (MSR)
  4. Understanding Self-supervised Learning with Neural Networks
    Instructor: Tengyu Ma (Stanford)

Organizers and Sponsors

This summer school is organized by Boris Hanin (Princeton ORFE). Support was provided by the NSF via NSF CAREER Grant DMS-2143754, the Department of Operations Research and Financial Engineering (ORFE) at Princeton, the Center for Statistics and Machine Learning (CSML) at Princeton, the Princeton School of Engineering and Applied Sciences (SEAS), and the Program on Applied and Computational Mathematics (PACM) at Princeton.

Deep Learning Theory Summer School at Princeton

July 27 - August 4, 2021

About

Welcome to the website for the Deep Learning Theory Summer School at Princeton 2021. The school will run remotely from July 27 to August 4, 2021 and is aimed at graduate students interested in the theory of deep learning. The primary goal is to showcase, through three main courses and a variety of short talks, a range of exciting developments. An important secondary goal is to connect young researchers and foster a closer community within theoretical machine learning. All graduate students with a technical background are encouraged to apply.

Courses

There will be three principal courses. Each course will consist of five one-hour lectures as well as pre-readings, problem sets, and TA sessions. These courses are:

  1. Modern Machine Learning and Deep Learning Through the Prism of Interpolation
    Instructor: Misha Belkin (UCSD Halicioglu); Pre-Readings: RKHS and this article.
  2. Deep Learning: a Statistical Viewpoint
    Instructor: Andrea Montanari (Stanford); Pre-Readings: Linear algebra at a somewhat advanced level (eigenvalue inequalities, perturbation theory, etc.) and high-dimensional probability (e.g., Chapters 1-6 in Vershynin); familiarity with classical statistical learning theory is welcome but not required (e.g., Ben-David and Shalev-Shwartz).
  3. Effective Theory of Deep Learning: Beyond the Infinite-Width Limit
    Instructors: Dan Roberts (MIT, Salesforce) and Sho Yaida (Facebook); Pre-Readings: Chapters 0-2 in this book; Slides: Lecture 1, Lecture 2, Lecture 3, Lecture 4, Lecture 5

Overall Schedule

There will also be a number of individual talks from researchers in industry and academia, including Ben Adlam (Google), Leon Bottou (Facebook), Ethan Dyer (Google), Gintare Karolina Dziugaite (Element AI), Suriya Gunasekar (MSR), Guy Gur-Ari (Google), Daniel Park (Google), Jeffrey Pennington (Google), Marc'Aurelio Ranzato (Facebook), David Schwab (CUNY), Atlas Wang (Austin), Greg Yang (MSR).

Organizers and Sponsors

This summer school is organized by Boris Hanin (Princeton ORFE) with support from the Center for Statistics and Machine Learning (CSML). Funding was generously provided via the DataX program at Princeton.