Speaker: Dr. Xindong Tang, Assistant Professor (Hong Kong Baptist University)
Time: Thursday, December 14, 2023, 14:00-15:00
Venue: Online, Tencent Meeting ID: 467-428-786
On-campus contact: Associate Professor Feng Guo, Tel: 84708351-8088
Abstract: We consider polynomial optimization with correlative sparsity. We construct correlatively sparse Lagrange multiplier expressions (CS-LMEs) and propose CS-LME reformulations of polynomial optimization problems based on the Karush-Kuhn-Tucker (KKT) optimality conditions. Correlatively sparse sum-of-squares (CS-SOS) relaxations are then applied to solve the CS-LME reformulation. We show that the CS-LME reformulation inherits the original correlative sparsity pattern, and that the CS-SOS relaxation yields sharper lower bounds when applied to the CS-LME reformulation than when applied to the original problem. Moreover, the convergence of our approach is guaranteed under mild conditions. In numerical experiments, our new approach usually finds the global optimal value (up to a negligible error) at a low relaxation order, even in cases where solving the original problem directly fails to give an accurate approximation. Also, by properly exploiting the correlative sparsity, our CS-LME approach requires less computational time than the original LME approach to reach the same accuracy level.
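To give a rough idea of the reformulation, the following is a hedged LaTeX sketch of a generic LME-type reformulation built from the KKT conditions; the specific CS-LME construction, which additionally exploits the correlative sparsity pattern, is the subject of the talk and may differ in detail.

% Sketch only: assumes an inequality-constrained problem
%   min_x f(x)  s.t.  c_i(x) >= 0,  i = 1, ..., m,
% whose Lagrange multipliers admit polynomial expressions \lambda_i(x).
\begin{align*}
  \min_{x \in \mathbb{R}^n} \quad & f(x) \\
  \text{s.t.} \quad & \nabla f(x) - \sum_{i=1}^{m} \lambda_i(x)\,\nabla c_i(x) = 0, \\
  & \lambda_i(x)\, c_i(x) = 0, \quad \lambda_i(x) \ge 0, \quad c_i(x) \ge 0, \quad i = 1, \dots, m.
\end{align*}

Appending these polynomial KKT identities as extra constraints before applying SOS-type relaxations is what typically tightens the resulting lower bounds; in the correlatively sparse setting, the multiplier expressions are constructed so that the added constraints preserve the original sparsity pattern.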
Speaker biography: Xindong Tang received his bachelor's degree from Sichuan University in 2016 and his Ph.D. from the University of California, San Diego in 2021. From 2021 to 2023 he was a Research Assistant Professor at The Hong Kong Polytechnic University, and he is currently an Assistant Professor at Hong Kong Baptist University. His research focuses on polynomial optimization, generalized Nash equilibrium problems, and tensor computation and its applications. He has published more than ten SCI papers in leading international journals, including Mathematical Programming, Mathematics of Operations Research, and SIAM Journal on Optimization.