Speaker: Hongyang Zhang (Carnegie Mellon University)
Time: 14:00, Wednesday, September 19, 2018 (Beijing time)
Title: New Paradigms and Global Optimality in Non-Convex Optimization
Host: Risheng Liu (Dalian University of Technology)
Speaker bio:
Hongyang Zhang is currently a fourth-year Ph.D. candidate in the Machine Learning Department at Carnegie Mellon University, Pittsburgh, USA. Before that, he graduated from Peking University in 2015. His research interests include machine learning, optimization, theoretical computer science, statistics, and their applications. He has published several papers in top conferences and journals, including ICML, NIPS, COLT, ICALP, ITCS, IEEE TIT, and Proceedings of the IEEE. He co-authored the book "Low-Rank Models in Visual Analysis" in 2017. He also serves as a program committee member and reviewer for various top conferences and journals, such as STOC, COLT, ICML, NIPS, AISTATS, ICLR, AAAI, IJCAI, ISIT, JMLR, TPAMI, Proceedings of the IEEE, and IEEE TSP.
Homepage:
http://www.cs.cmu.edu/~hongyanz/
Abstract:
Non-convex optimization is ubiquitous in many areas, ranging from machine learning and signal processing to statistics and theoretical computer science. Although non-convex optimization typically enjoys better sample complexity guarantees than its convex counterpart, it usually suffers from computational issues: local search algorithms, such as gradient descent or stochastic gradient descent, may only converge to saddle points or local minima rather than the global optimum we desire. There are three main approaches to addressing these issues: 1. finding a good initialization; 2. carefully solving a sequence of convex problems; 3. studying nice properties of the loss landscape, such as strong duality or "local minima = global minima".
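As a toy illustration of the computational issue above (not an example from the talk), gradient descent on a simple one-dimensional non-convex function ends up at either the spurious local minimum or the global minimum depending solely on its initialization:

```python
# Toy non-convex objective with a global minimum near x = -1.04
# and a spurious local minimum near x = 0.96.
f = lambda x: (x**2 - 1)**2 + 0.3 * x
grad = lambda x: 4 * x * (x**2 - 1) + 0.3

def gradient_descent(x0, lr=0.01, steps=2000):
    """Plain gradient descent from initialization x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

x_left = gradient_descent(-0.5)   # initialized in the left basin
x_right = gradient_descent(0.5)   # initialized in the right basin
print(x_left, x_right)            # two different stationary points
print(f(x_left), f(x_right))      # only one of them is the global minimum
```

Both runs satisfy the first-order optimality condition, yet only the left initialization reaches the global minimum, which is exactly why the landscape-based and convex-sequence paradigms discussed in the talk are needed.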
In this talk, we will focus on the latter two approaches, which are new paradigms for global optimality in non-convex optimization. In the first part of the talk, we will see how to get arbitrarily close to the global optimum of non-convex problems arising in learning halfspaces and 1-bit compressed sensing. Via the localization technique, we solve the expected 0-1 loss minimization problem by solving a sequence of convex problems. We also supplement our positive results with a hardness result, showing that no one-shot minimization works in this setting.
In the second part of the talk, we study strong duality for the matrix factorization problem. Strong duality is well studied in convex optimization, but little is known in the non-convex setting. We show that strong duality holds for matrix completion and related problems with nearly optimal sample complexity. On the hardness side, we also show that generic matrix factorization requires at least 2^n time to solve, where n is the size of the matrix.
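To give a rough sense of the non-convex matrix completion problem the second part of the talk concerns, the sketch below (a minimal toy, not the algorithm or analysis from the talk or reference [2]) runs gradient descent directly on the factorized objective ||P_Ω(UUᵀ − M)||_F², where P_Ω keeps only the observed entries; benign-landscape results of the kind discussed in the talk explain why such plain local search can nonetheless fit the observations:

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 8, 2
U_true = rng.normal(size=(n, r))
M = U_true @ U_true.T                  # rank-2 PSD ground-truth matrix

mask = rng.random((n, n)) < 0.6        # observe roughly 60% of the entries
mask = mask | mask.T                   # symmetric observation pattern

# Squared loss restricted to the observed entries.
loss = lambda U: float(np.sum((mask * (U @ U.T - M)) ** 2))

U = 0.1 * rng.normal(size=(n, r))      # small random initialization
lr = 0.01
loss0 = loss(U)
for _ in range(20000):
    R = mask * (U @ U.T - M)           # residual on observed entries
    U -= lr * 4 * R @ U                # gradient of the loss w.r.t. U (R symmetric)
print(loss0, loss(U))                  # initial vs. final loss on observed entries
```

The objective is non-convex in U, yet gradient descent from a small random initialization drives the observed-entry loss down by orders of magnitude on this instance; the talk's strong-duality perspective gives conditions under which such solutions are in fact globally optimal.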
References:
[1] Learning and 1-bit Compressed Sensing under Asymmetric Noise, Pranjal Awasthi, Maria-Florina Balcan, Nika Haghtalab, Hongyang Zhang (α-β order), COLT 2016.
http://www.cs.cmu.edu/~hongyanz/publications/COLT_Learning.pdf
[2] Matrix Completion and Related Problems via Strong Duality, Maria-Florina Balcan, Yingyu Liang, David P. Woodruff, Hongyang Zhang (α-β order), ITCS 2018.
http://www.cs.cmu.edu/~hongyanz/publications/ITCS_Matrix.pdf
How to join VALSE online Webinar No. 18-29:
Press and hold or scan the QR code below to follow the "VALSE" WeChat official account (valse_wechat), then reply "29期" to receive the live-stream link.
Special thanks to the main organizer of this Webinar:
VOOC committee member in charge: Risheng Liu (Dalian University of Technology)
How to participate:
1. VALSE Webinars are hosted on a live-streaming platform. During the event, the speaker uploads slides or shares the screen; the audience can see the slides, hear the speaker's voice, and interact with the speaker via the chat function.
2. To participate, follow the VALSE WeChat official account (valse_wechat) or join a VALSE QQ group (groups A through G are currently full; except for speakers and other invited guests, you may only apply to join VALSE group H, group number: 701662399).
*Note: when applying to join a VALSE QQ group, you must provide your name, affiliation, and identity; all three are required. After joining, please set your group nickname to your real name, identity, and affiliation. Identity codes: university or research-institute staff, T; industry R&D, I; Ph.D. student, D; Master's student, M.
3. About 5 minutes before the event starts, the speaker opens the live stream; click the stream link to join. Windows PCs, Macs, mobile phones, and other devices are supported.
4. During the event, please refrain from off-topic messages so as not to disrupt the proceedings.
5. If you cannot hear the audio or see the video during the event, leaving and rejoining usually resolves the problem.
6. We strongly recommend joining from a fast network, preferably over a wired connection.
7. The VALSE WeChat official account posts a summary and (with the speaker's permission) the video of the previous week's Webinar every Monday, and the announcement and stream link for the next week's Webinar every Thursday.