Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity

Seminar on Theoretical Machine Learning
Topic: Some Statistical Results on Deep Learning: Interpolation, Optimality and Sparsity
Speaker: Guang Cheng
Affiliation: Purdue University; Member, School of Mathematics
Date: Wednesday, November 13
Time/Room: 12:00pm - 1:30pm / Dilworth Room
Video Link: https://video.ias.edu/machinelearning/2019/1113-GuangCheng

This talk discusses three aspects of deep learning from a statistical perspective: interpolation, optimality, and sparsity. The first attempts to interpret the double descent phenomenon by precisely characterizing a U-shaped curve within the “over-fitting regime”; the second focuses on the statistical optimality of neural network classification in a student-teacher framework. The talk concludes by proposing sparsity-induced training of neural networks with statistical guarantees.
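
As a point of reference for the double descent phenomenon mentioned above, the following is a minimal numerical sketch (not the speaker's analysis): a minimum-norm least-squares fit on random ReLU features, where the test error typically peaks near the interpolation threshold (number of features ≈ number of training points) and decreases again in the over-parameterized regime. All data, dimensions, and parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy data: noisy linear target in d dimensions.
d, n_train, n_test = 20, 100, 1000
w_star = rng.normal(size=d)

def sample(n):
    X = rng.normal(size=(n, d))
    y = X @ w_star + 0.5 * rng.normal(size=n)
    return X, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

def test_error(n_features):
    # Random ReLU features; the pseudoinverse gives the minimum-norm
    # least-squares solution, which interpolates the training data
    # once n_features >= n_train.
    W = rng.normal(size=(d, n_features)) / np.sqrt(d)
    Phi_tr = np.maximum(X_tr @ W, 0.0)
    Phi_te = np.maximum(X_te @ W, 0.0)
    beta = np.linalg.pinv(Phi_tr) @ y_tr
    return np.mean((Phi_te @ beta - y_te) ** 2)

# Sweep the number of random features across the interpolation threshold.
for p in [10, 50, 90, 100, 110, 200, 500, 2000]:
    print(f"features={p:5d}  test MSE={test_error(p):.3f}")
```

Running the sweep typically shows the classic pattern: error rises sharply as the feature count approaches n_train and then falls again as the model becomes heavily over-parameterized, which is the regime the talk's U-shaped characterization addresses.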