
IE 690: Applied Probability for Machine Learning
Spring 2020

1 Description:

The broad goal of this course is to study recent developments in the application of probabilistic tools and methods to the analysis of statistical machine learning algorithms. The intended audience of this course is Ph.D. students who work on theoretical questions surrounding statistical machine learning (or intend to do so). A solid background in stochastic processes (especially Markov processes) and real analysis is necessary; ideal preparation includes measure theory and graduate-level probability. The course will be driven by paper readings carefully chosen and assigned to each student. There is no required textbook, but "Applied Stochastic Analysis" by Weinan E, Tiejun Li, and Eric Vanden-Eijnden is a useful resource for the basic analytical tools needed.

Here is a tentative list of topics that will be covered in this course. Note that there is significant freedom in choosing topics, depending on class interest.

Weeks 1-5: Stochastic analysis of "phase-space" algorithms: we will study the asymptotics and transient behavior of algorithms such as stochastic gradient descent (SGD), Langevin Markov chain Monte Carlo (MCMC), and stochastic gradient Langevin dynamics (SGLD) by passing to a continuum limit. Particular focus on weak convergence to steady-state behavior, large deviations principles, and metastability analyses. (A small illustrative sketch of the SGLD/Langevin correspondence follows this list.)

Weeks 6-11: Stochastic analysis of "measure-space" algorithms: we will study variational approximations for Bayesian computation and distributionally robust optimization (DRO). Particular focus on connections to gradient flow theory on Wasserstein spaces and on connections with the analysis of phase-space algorithms.

Weeks 12-14: Applications of the analytical tools to reinforcement learning algorithms and/or inference of dynamical systems from time series (time permitting).
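
A minimal sketch (not part of the original syllabus) of the Weeks 1-5 theme: the stochastic gradient Langevin dynamics (SGLD) update placed next to an Euler-Maruyama discretization of the overdamped Langevin diffusion it approximates in the small-step limit. The 1-D Gaussian target, the fixed step sizes, and the gradient-noise scale below are illustrative assumptions, not choices made in the course.

import numpy as np

rng = np.random.default_rng(0)

# Toy target: theta ~ N(0, 1), i.e. potential U(theta) = theta^2 / 2 with
# grad U(theta) = theta. Additive noise on the gradient mimics minibatching.
def grad_U(theta):
    return theta

def noisy_grad_U(theta, noise_scale=0.5):
    return grad_U(theta) + noise_scale * rng.standard_normal()

def sgld(theta0, step, n_steps):
    # SGLD with a fixed step size: move against a noisy gradient estimate and
    # inject Gaussian noise with variance 2 * step at every iteration.
    theta, path = theta0, []
    for _ in range(n_steps):
        theta = (theta - step * noisy_grad_U(theta)
                 + np.sqrt(2.0 * step) * rng.standard_normal())
        path.append(theta)
    return np.array(path)

def langevin_diffusion(theta0, dt, n_steps):
    # Euler-Maruyama scheme for d(theta) = -grad U(theta) dt + sqrt(2) dW,
    # whose stationary law is proportional to exp(-U), here N(0, 1).
    theta, path = theta0, []
    for _ in range(n_steps):
        theta = (theta - dt * grad_U(theta)
                 + np.sqrt(2.0 * dt) * rng.standard_normal())
        path.append(theta)
    return np.array(path)

if __name__ == "__main__":
    sgld_path = sgld(theta0=3.0, step=1e-2, n_steps=50_000)
    sde_path = langevin_diffusion(theta0=3.0, dt=1e-2, n_steps=50_000)
    # After a burn-in, both empirical second moments should sit near the
    # target variance of 1.0.
    print("SGLD second moment:", round(float(np.mean(sgld_path[5_000:] ** 2)), 3))
    print("SDE  second moment:", round(float(np.mean(sde_path[5_000:] ** 2)), 3))

At a fixed step size, the extra gradient noise in SGLD slightly inflates the stationary spread relative to the diffusion; quantifying such small-step discrepancies, and the passage to the continuum limit itself, is the kind of analysis studied in Weeks 1-5.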