Title: Interplay between Statistics and Computation in Machine Learning
Reporter: Prof. YING Yiming (University at Albany, State University of New York)
Time: April 20, 2023 (Thursday), 09:00-11:00 AM
Location: Tencent Meeting ID: 442 599 718
Contact: XU Min, Tel: 84708351-8101
Abstract: Stochastic gradient methods (SGMs) have become the workhorse of machine learning (ML) due to their incremental nature and computationally cheap updates. In this talk, I will first discuss the close interaction between statistical generalization and computational optimization for SGMs in the framework of statistical learning theory (SLT). The core concept in this study is algorithmic stability, which characterizes how the output of an ML algorithm changes upon a small perturbation of the training data. Our theoretical studies have led to new insights into the generalization of overparameterized neural networks trained by SGD. Then, I will describe how this interaction framework can be used to derive lower bounds on the convergence of existing methods for maximizing the AUC score, which in turn inspires a new direction for designing efficient AUC optimization algorithms. Finally, I will briefly discuss future research directions.
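As brief background on the objects mentioned in the abstract (standard textbook formulations, not necessarily the speaker's exact definitions): an SGM updates the model parameters using a single randomly sampled training example per step, and uniform algorithmic stability bounds the change in loss when one training example is replaced:

\[
w_{t+1} = w_t - \eta_t \nabla_w \ell(w_t; z_{i_t}), \qquad i_t \sim \mathrm{Uniform}\{1, \dots, n\},
\]

and an algorithm \(A\) is \(\epsilon\)-uniformly stable if, for all training sets \(S, S'\) differing in a single example,

\[
\sup_z \big| \ell(A(S); z) - \ell(A(S'); z) \big| \le \epsilon.
\]

The AUC score referenced in the abstract admits the standard probabilistic form \(\mathrm{AUC}(f) = \Pr\big(f(x) > f(x') \mid y = +1,\, y' = -1\big)\), i.e., the probability that a positive example is scored above a negative one.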
A brief introduction to the reporter: Yiming Ying is a Professor in the Department of Mathematics and Statistics, SUNY Albany, and the founding director of its machine learning lab. Before that, he was an Assistant Professor in the Department of Computer Science at the University of Exeter, England. His research interests include Statistical Learning Theory, Machine Learning, and Optimization. He currently serves as an associate editor of Transactions on Machine Learning Research, Neurocomputing, and Mathematics of Computation and Data Science, and as the managing editor of Mathematical Foundations of Computing. He also serves as a Senior Program Committee Member/Area Chair for major machine learning conferences such as NeurIPS, ICML, and AISTATS.