Fudan University-Zhejiang Normal University "Machine Learning and Vision Frontier Forum" Series, Talk No. 2
Title: Boosting with Structural Sparsity: A Differential Inclusion Approach
Speaker: Yuan Yao (The Hong Kong University of Science and Technology)
Time: November 29, 9:15-10:00
Venue: Tencent Meeting ID 766-826-483 (Fudan-ZJNU MLV Frontier Forum)
Abstract: Boosting, as a gradient descent method, is arguably the "best off-the-shelf" method in machine learning. Here a novel Boosting-type algorithm based on restricted gradient descent is proposed. Specifically, we present an iterative regularization path with structural sparsity, where the parameter is sparse under some linear transform, based on the Linearized Bregman Iteration or sparse mirror descent. Despite its simplicity, it outperforms the popular (generalised) Lasso in both theory and experiments. A theory of path consistency is presented, showing that, equipped with proper early stopping, the method can achieve model selection consistency. A data-adaptive early stopping rule is developed based on the Knockoff method, which aims to control the false discovery rate of causal variables. The utility and benefit of the algorithm are illustrated by various applications, including sparse variable selection and learning graphical models.
Host: Zhonglong Zheng
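The abstract names the Linearized Bregman Iteration as the core of the proposed iterative regularization path. Below is a minimal sketch of that iteration for the plain sparse (Lasso-type) linear regression case only, not the speaker's full method with general linear transforms or the Knockoff-based stopping rule; the function name `linearized_bregman_path`, the parameters `kappa` and `alpha`, and the step-size heuristic are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t=1.0):
    """Elementwise soft-thresholding: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def linearized_bregman_path(X, y, kappa=10.0, alpha=None, n_iter=500):
    """Linearized Bregman Iteration for sparse linear regression (sketch).

    Returns the whole iterate path so that an early-stopping rule can pick
    a model along it (the 'iterative regularization path' in the abstract).
    """
    n, p = X.shape
    if alpha is None:
        # heuristic step size keeping kappa * alpha * ||X^T X|| / n of order one
        alpha = 1.0 / (kappa * np.linalg.norm(X, 2) ** 2 / n)
    z = np.zeros(p)      # dual / mirror variable
    beta = np.zeros(p)   # sparse primal iterate
    path = []
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n        # gradient of (1/2n)||y - X beta||^2
        z = z - alpha * grad                   # gradient step on the dual variable
        beta = kappa * soft_threshold(z, 1.0)  # sparse primal update
        path.append(beta.copy())
    return np.array(path)

# Toy usage: sparse ground truth; early stopping along the path would be
# used in practice, here we simply inspect the support of the last iterate.
rng = np.random.default_rng(0)
n, p, s = 100, 50, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 3.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)
path = linearized_bregman_path(X, y)
print("support at final iterate:", np.nonzero(path[-1])[0])
```

As the iteration proceeds, strong variables enter the path early while noise variables enter late, which is why a proper early stopping point, rather than the fully converged iterate, is what yields model selection consistency in the talk's setting.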