A Theoretical Analysis of "Lazy Training" in Deep Learning

Final Project completed by Henry Smith for S&DS 492, Advanced Optimization Techniques, at Yale University

For my final project, I prove three main results (Theorems 2.2 and 2.4, as well as Proposition A.1) from "On Lazy Training in Differentiable Programming" by Chizat, Oyallon, and Bach. These results form the foundation of the theory of lazy training for differentiable optimization problems. This theory was the primary motivation for my senior thesis in the Statistics & Data Science Department, "Implicit Regularization in Deep Learning: The Kernel and Rich Regimes," supervised by Dr. Harrison Zhou. For more on my research into lazy training and its generalization properties, see my thesis.
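
To make the lazy regime concrete, the sketch below trains a small two-layer network on the scaled objective F_α(θ) = (1/α²) R(α h(θ)) studied in the paper and records how far the parameters travel from their initialization. This is purely illustrative and not part of the project itself: the network, its width, the step size, the step count, and the centering at initialization (mimicking the paper's h(θ₀) = 0 convention) are all choices made for this demo.

```python
# Illustrative sketch of lazy training (not from the project): run gradient
# descent on the scaled squared loss for a two-layer tanh network and measure
# how far the parameters move from initialization. All sizes and step counts
# are arbitrary choices for the demo.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 20, 5, 200                          # samples, input dim, hidden width
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def h(W, a):
    """Two-layer network h(theta) with parameters theta = (W, a)."""
    return np.tanh(X @ W.T) @ a / np.sqrt(m)

for alpha in [1.0, 10.0, 100.0]:
    W = rng.standard_normal((m, d))
    a = rng.standard_normal(m)
    W0, a0 = W.copy(), a.copy()
    h0 = h(W0, a0)        # center the model at initialization, so the scaled
                          # model alpha*(h - h0) vanishes at theta_0
    lr = 0.5 / alpha**2   # 1/alpha^2 step scaling: equivalent to a fixed step
                          # on the scaled objective F_alpha
    for _ in range(2000):
        H = np.tanh(X @ W.T)                        # (n, m) hidden activations
        r = alpha * (H @ a / np.sqrt(m) - h0) - y   # residuals of the scaled model
        # Gradients of (1/2n) * ||alpha*(h - h0) - y||^2 in W and a
        grad_a = alpha * H.T @ r / (np.sqrt(m) * n)
        grad_W = alpha * ((1 - H**2) * np.outer(r, a / np.sqrt(m))).T @ X / n
        a -= lr * grad_a
        W -= lr * grad_W
    move = np.sqrt(np.linalg.norm(W - W0)**2 + np.linalg.norm(a - a0)**2)
    init = np.sqrt(np.linalg.norm(W0)**2 + np.linalg.norm(a0)**2)
    loss = 0.5 * np.mean((alpha * (h(W, a) - h0) - y)**2)
    print(f"alpha={alpha:6.1f}  relative parameter movement={move/init:.5f}  loss={loss:.5f}")
```

For large α the printed relative parameter movement should shrink (roughly like 1/α) while the loss still decreases, so training effectively happens in the model's linearization around θ₀. This is the "lazy" behavior characterized by the theorems proved in the project.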
