AI4OPT Short Course III

Five Lectures: Apr 15, 16, 18, 19, and 22, 1 pm – 3 pm

Location: Groseclose 402

Speaker: Victor de la Pena


Decoupling and Self-normalized Inequalities with Applications in Machine Learning

Overview: Decoupling and self-normalization are areas that grew out of the need to extend martingale methods to high and infinite dimensions, as well as to complex nonlinear dependence structures. Decoupling provides tools for treating dependent variables as if they were independent, and it also supplies natural tools for developing sharp concentration-of-measure inequalities. Prototypical examples of self-normalized processes are the t-statistic for dependent random variables and self-normalized extensions of Kolmogorov's law of the iterated logarithm. As the course will describe, these results have been applied extensively in machine learning in the development of algorithms.
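As a minimal illustration of the kind of object the course studies (an illustrative sketch, not course material): the t-statistic is a self-normalized statistic because the centering sum is divided by a data-dependent normalization, here the sample standard deviation scaled by sqrt(n).

```python
import math

def t_statistic(xs):
    """One-sample t-statistic, a prototypical self-normalized statistic:
    the sample mean divided by its estimated (data-dependent) standard error."""
    n = len(xs)
    mean = sum(xs) / n
    # unbiased sample variance
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return mean / math.sqrt(var / n)

# Example: for [1, 2, 3, 4, 5] the statistic equals 3 / sqrt(2.5 / 5) = 3*sqrt(2)
print(t_statistic([1, 2, 3, 4, 5]))
```

The key point, developed in the course, is that tail bounds for such ratios can hold under far weaker moment assumptions than bounds for the unnormalized sum.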

Schedule: 

This will be a five-part course. Each part is two hours long with a short break.

  1. Lecture 1: Mon, Apr 15, 1 pm – 3 pm, Groseclose 402 (lunch will be provided at the beginning)
  2. Lecture 2: Tue, Apr 16, 1 pm – 3 pm, Groseclose 402 (coffee and snacks will be provided)
  3. Lecture 3: Thu, Apr 18, 1 pm – 3 pm, Groseclose 402 (lunch will be provided at the beginning)
  4. Lecture 4: Fri, Apr 19, 1 pm – 3 pm, Groseclose 402 (coffee and snacks will be provided)
  5. Lecture 5: Mon, Apr 22, 1 pm – 3 pm, Groseclose 402 (lunch will be provided at the beginning)
     

Bio: Victor H. de la Pena is a Professor of Statistics at Columbia University. He is a Fellow of the Institute of Mathematical Statistics and a Medallion Lecturer. His books on decoupling and self-normalization are widely used standards in both areas, and his work has appeared in The Annals of Probability and other widely recognized venues. In particular, his invited paper "Decoupling of U-statistics and Quadratic Forms," with S. J. Montgomery-Smith, was published by the Bulletin of the American Mathematical Society as one of the top results of the year. His 2019 invited paper "From Decoupling and Self-Normalization to Machine Learning" was published in the Notices of the American Mathematical Society and distributed to over 30,000 AMS members and member institutions worldwide.

List of Topics:
- Overview of the course
- Martingale Inequalities and the K-function
- Complete Decoupling and Optimization
- Tangent Decoupling of Arbitrary Variables
- Decoupling of U-Statistics and Random Graphs
- Self-Normalized Bernstein Inequality
- Pseudo Maximization and the Method of Mixtures
- Self-Normalized Gaussian Bounds
- Boundary Crossing and the Law of the Iterated Logarithm
- Applications in Machine Learning
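To hint at the last topic above, a common machine-learning use of self-normalized concentration bounds is in bandit algorithms, where an arm's exploration bonus controls the deviation of its self-normalized (count-normalized) average reward. The sketch below uses the classical UCB1 bonus sqrt(2 ln t / n_i); it is only an assumed illustration of the connection, not a description of the course's specific results.

```python
import math
import random

def ucb1(means, horizon, seed=0):
    """Minimal UCB1 for Bernoulli arms with success probabilities `means`.
    The bonus sqrt(2 ln t / n_i) bounds the deviation of the empirical mean,
    normalized by the (random) number of pulls n_i; sharper self-normalized
    inequalities yield tighter bonuses. Returns the pull counts per arm."""
    rng = random.Random(seed)
    k = len(means)
    counts = [0] * k      # n_i: times arm i was pulled
    sums = [0.0] * k      # total reward from arm i
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1   # initialize: play each arm once
        else:
            arm = max(range(k),
                      key=lambda i: sums[i] / counts[i]
                      + math.sqrt(2 * math.log(t) / counts[i]))
        reward = 1.0 if rng.random() < means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts
```

Over a long horizon the better arm is pulled far more often, because the self-normalized deviation of each arm's average shrinks as its pull count grows.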

Main References:
1. de la Pena, V. H. From Decoupling and Self-Normalization to Machine Learning, Notices of the American Mathematical Society, November 2019.
2. de la Pena, V. H., Gine, E. Decoupling: From Dependence to Independence, Springer, New York, 1999.
3. de la Pena, V. H., Lai, T. L., Shao, Q.-M. Self-Normalized Processes: Limit Theory and Statistical Applications, Springer, 2009.
4. Makarychev, K., Sviridenko, M. Solving Optimization Problems with Diseconomies of Scale via Decoupling, Journal of the ACM, Vol. 65 (6), Article 42, November 2018.