Title: A stochastic semismooth Newton method for nonsmooth nonconvex optimization
Speaker: Andre Milzarek (Beijing International Center for Mathematical Research, Peking University)
Time: 10:00-11:00 a.m., November 26, 2018
Venue: Room A1101, Innovation Park Building
Campus contact: Associate Professor Xiao Xiantao (Tel: 84708351-8307)
Abstract: In this talk, we present a globalized semismooth Newton method for solving stochastic optimization problems involving smooth nonconvex and nonsmooth convex terms in the objective function. The class of problems that can be solved within our algorithmic framework comprises a large variety of applications such as l1-logistic regression, structured dictionary learning, and other minimization problems arising in machine learning, statistics, or image processing. We assume that only noisy gradient and Hessian information of the smooth part of the objective function is available via calling stochastic first- and second-order oracles. Our approach utilizes approximate second-order information and stochastic semismooth Newton steps for a prox-type fixed-point equation, representing the associated optimality conditions, to accelerate the basic stochastic proximal gradient method for convex composite programming. Inexact growth conditions are introduced to monitor the quality and acceptance of the Newton steps and to combine the two different methods. We prove that the proposed algorithm converges globally to stationary points in expectation and almost surely. Moreover, under standard assumptions, the method can be shown to locally turn into a pure semismooth Newton method, and fast local convergence can be established with high probability. Finally, we provide numerical experiments illustrating the efficiency of the stochastic semismooth Newton method.
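To give a concrete feel for the kind of hybrid scheme described in the abstract, the following is a minimal illustrative sketch (not the speaker's implementation) for the l1-regularized model problem min_x f(x) + lam*||x||_1, where f is smooth and possibly nonconvex and only mini-batch gradient and Hessian estimates are available. The stationarity condition is written as the prox-type fixed-point residual F(x) = x - prox_{step*lam*||.||_1}(x - step*grad f(x)) = 0; a (stochastic) semismooth Newton step on F is tried first and a stochastic proximal gradient step serves as the fallback. The oracle interfaces sgrad/shess, the batch size, and the simple residual-decrease acceptance test standing in for the paper's inexact growth conditions are all illustrative assumptions.

```python
# Illustrative sketch of a stochastic semismooth Newton / proximal gradient hybrid
# for min_x f(x) + lam * ||x||_1.  All interfaces and parameters are assumptions.
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def residual(x, grad, lam, step):
    """Prox-gradient fixed-point residual F(x) = x - prox(x - step*grad)."""
    return x - soft_threshold(x - step * grad, step * lam)

def stochastic_semismooth_newton(x0, sgrad, shess, lam, step=1.0,
                                 max_iter=100, batch=64):
    """Hybrid loop: try a stochastic semismooth Newton step on F(x) = 0 and
    fall back to a stochastic proximal gradient step if the residual norm
    does not decrease sufficiently.  sgrad(x, batch) and shess(x, batch) are
    hypothetical mini-batch gradient / Hessian oracles."""
    x = x0.copy()
    n = len(x)
    for _ in range(max_iter):
        g = sgrad(x, batch)
        F = residual(x, g, lam, step)
        # An element of the generalized Jacobian of F: M = I - D (I - step*H),
        # where D is the diagonal Clarke Jacobian of the soft-thresholding map.
        H = shess(x, batch)
        active = np.abs(x - step * g) > step * lam
        D = np.diag(active.astype(float))
        M = np.eye(n) - D @ (np.eye(n) - step * H)
        try:
            x_trial = x + np.linalg.solve(M, -F)
            F_trial = residual(x_trial, sgrad(x_trial, batch), lam, step)
            # Simplified stand-in for the inexact growth condition:
            # accept the Newton step only if the residual norm shrinks.
            if np.linalg.norm(F_trial) <= 0.9 * np.linalg.norm(F):
                x = x_trial
                continue
        except np.linalg.LinAlgError:
            pass
        # Fallback: stochastic proximal gradient step.
        x = soft_threshold(x - step * g, step * lam)
    return x
```

In this toy version the Newton step supplies fast local progress near a solution, while the rejected-step fallback to the stochastic proximal gradient step plays the role of the globalization mechanism discussed in the talk.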
Speaker biography: Andre Milzarek received his doctoral degree in mathematics from the Technical University of Munich in Germany under the supervision of Michael Ulbrich in 2016. Currently, he is a postdoctoral researcher at the Beijing International Center for Mathematical Research at Peking University. His main research directions and interests cover nonsmooth optimization, large-scale and stochastic optimization, and second-order methods and theory. From 2010 to 2012 he was supported by the Max Weber Program of the state of Bavaria, and in 2017 he received the Boya Postdoctoral Fellowship at Peking University. He has published papers in the SIAM Journal on Optimization, SIAM Journal on Scientific Computing, and SIAM Journal on Matrix Analysis and Applications.