
Notice on Lectures by Associate Professor Zhang Jin and Assistant Professor Yao Wei of Southern University of Science and Technology

Posted: 2023-04-14


At the invitation of the School of Mathematics and Computational Science, the Guangxi Center for Applied Mathematics, and the Guangxi Colleges and Universities Key Laboratory of Data Analysis and Computation, Associate Professor Zhang Jin and Assistant Professor Yao Wei of Southern University of Science and Technology will visit our university to give lectures on April 15, 2023. All faculty and students are warmly welcome to attend. The talks are scheduled as follows:

Talk 1: Towards Gradient-based Bilevel Optimization in Machine Learning

Speaker: Associate Professor Zhang Jin

Time: 8:40-10:40 a.m., Saturday, April 15, 2023

Venue: Conference Room 305, 3rd Floor, Library, Jinjiling Campus

Abstract: Recently, Bi-Level Optimization (BLO) techniques have received extensive attention from the machine learning community. In this talk, we will discuss some recent advances in the applications of BLO. First, we study a gradient-based bi-level optimization method for learning tasks with a convex lower level. In particular, by formulating bi-level models from the optimistic viewpoint and aggregating hierarchical objective information, we establish Bi-level Descent Aggregation (BDA), a flexible and modularized algorithmic framework for bi-level programming. Second, we focus on the variety of BLO models arising in complex and practical tasks whose follower (lower-level) problems are non-convex in nature. In particular, we propose a new algorithmic framework, named Initialization Auxiliary and Pessimistic Trajectory Truncated Gradient Method (IAPTT-GM), to partially address the lower-level non-convexity. By introducing an auxiliary variable as the initialization to guide the optimization dynamics and designing a pessimistic trajectory truncation operation, we construct a reliable approximation to the original BLO in the absence of the lower-level convexity hypothesis. Extensive experiments justify our theoretical results and demonstrate the superiority of the proposed BDA and IAPTT-GM for different tasks, including hyper-parameter optimization and meta-learning.
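For readers new to the setting, here is a minimal sketch of the generic bilevel program underlying the talk, written with generic symbols F (upper-level objective) and f (lower-level objective); the concrete models treated by BDA and IAPTT-GM are specified in the talk itself:

\[
\min_{x \in X,\; y \in \mathbb{R}^m} \; F(x, y)
\quad \text{s.t.} \quad
y \in S(x) := \operatorname*{arg\,min}_{y' \in \mathbb{R}^m} f(x, y').
\]

Gradient-based frameworks of the kind described above replace the lower-level arg-min by a finite sequence of descent steps and aggregate upper- and lower-level objective information along that trajectory.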

About the speaker:

Zhang Jin is an associate professor in the Department of Mathematics at Southern University of Science and Technology and an associate deputy director of the Shenzhen National Center for Applied Mathematics. He is a recipient of the National Science Fund for Excellent Young Scholars, the Guangdong Provincial Fund for Distinguished Young Scholars, and the Shenzhen Fund for Outstanding Young Scholars. He received his bachelor's degree (2007) and master's degree (2010) from Dalian University of Technology, and his Ph.D. (2014) from the University of Victoria, Canada. He worked at Hong Kong Baptist University from 2015 to 2018 and joined Southern University of Science and Technology in early 2019. His research focuses on optimization theory and its applications. His representative results have appeared in influential optimization, computational mathematics, and machine learning journals and conferences, including Math Program, SIAM J Optim, Math Oper Res, SIAM J Numer Anal, J Mach Learn Res, and IEEE Trans Pattern Anal Mach Intell, as well as ICML and NeurIPS. His work received the 2020 Youth Science and Technology Award of the Operations Research Society of China and the 2022 Guangdong Youth Science and Technology Innovation Award. He has led general projects funded by the National Natural Science Foundation of China, the Guangdong Natural Science Foundation, the Shenzhen Science and Technology Innovation Commission, and the Hong Kong Research Grants Council.


Talk 2: Averaged Method of Multipliers for Bi-Level Optimization without Lower-Level Strong Convexity

Speaker: Assistant Professor Yao Wei

Time: 10:40 a.m.-12:40 p.m., Saturday, April 15, 2023

Venue: Conference Room 305, 3rd Floor, Library, Jinjiling Campus

Abstract: Gradient methods have become mainstream techniques for Bi-Level Optimization (BLO) in learning fields. The validity of existing works relies heavily on a restrictive Lower-Level Strong Convexity (LLSC) condition, on solving a series of approximation subproblems with high accuracy, or on both. In this work, by averaging the upper- and lower-level objectives, we propose a single-loop Bi-level Averaged Method of Multipliers (sl-BAMM) for BLO that is simple yet efficient for large-scale problems and dispenses with the restrictive LLSC condition. We further provide a non-asymptotic convergence analysis of sl-BAMM towards KKT stationary points; the comparative advantage of our analysis lies in the absence of the strong gradient boundedness assumption that is typically required by other works. Thus, our theory safely captures a wider variety of applications in deep learning, especially those where the upper-level objective is quadratic with respect to the lower-level variable. Experimental results demonstrate the superiority of our method.
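As a rough, schematic reading of the averaging idea mentioned in the abstract (a sketch using the notation of the first talk's formulation, not the authors' exact scheme), one may combine the two levels into a single weighted objective,

\[
\phi_{\sigma}(x, y) := \sigma\, F(x, y) + f(x, y), \qquad \sigma > 0,
\]

and update x, y, and the multipliers in one loop while driving \sigma toward zero, so that no strong convexity of the lower-level problem is invoked when forming the iteration.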

About the speaker:

Yao Wei received his bachelor's degree from Wuhan University and his Ph.D. from The Chinese University of Hong Kong. He is currently a research assistant professor in the Department of Mathematics at Southern University of Science and Technology and at the Shenzhen National Center for Applied Mathematics. His main research interests include algorithms and theoretical analysis for bilevel programming and their applications in machine learning and mechanism design. His representative papers have been published in internationally renowned journals in operations research, optimization, and partial differential equations, including SIAM J Optim, Calculus of Variations and Partial Differential Equations, and Journal of Differential Equations.



