# A Theoretical Analysis of "Lazy Training" in Deep Learning

Final project completed by Henry Smith for S&DS 492 (Advanced Optimization Techniques) at Yale University.

For my final project, I prove three of the main results from "On Lazy Training in Differentiable Programming" by Chizat, Oyallon, and Bach: Theorems 2.2 and 2.4, as well as Proposition A.1. Together, these results form the foundation of the theory of "lazy training" for differentiable optimization problems; the central setup is sketched below. This theory was the primary motivation for my senior thesis in the Statistics & Data Science Department, "Implicit Regularization in Deep Learning: The Kernel and Rich Regimes," supervised by Dr. Harrison Zhou. For more on my research into lazy training and its generalization properties, see my thesis.
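For context, here is an informal sketch of the setup these theorems analyze, with notation paraphrased from the paper rather than quoted from my write-up. The paper studies a differentiable model $h : \mathbb{R}^p \to \mathcal{F}$ trained by gradient flow on a scaled objective, and compares it to the model's linearization at the initialization $w_0$:

```math
F_\alpha(w) = \frac{1}{\alpha^2}\, R\big(\alpha\, h(w)\big), \qquad \bar{h}(w) = h(w_0) + Dh(w_0)\,(w - w_0),
```

where $R$ is a smooth loss and $\alpha > 0$ is a scale factor. In the "lazy" regime of large $\alpha$, gradient flow on $F_\alpha$ moves the parameters only slightly from $w_0$, so the trajectory of the scaled nonlinear model $\alpha\, h(w(t))$ stays close to that of the scaled linear model $\alpha\, \bar{h}$. This is the kernel regime contrasted with the rich regime in my thesis.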