Time: 3:30–4:30 p.m., Thursday, July 4, 2019
Venue: Conference Room A1514, Science Building, Zhongshan North Road Campus
Title: Understanding Weight Normalized Deep Neural Networks
Speaker: Prof. Xiao Wang, Purdue University
Abstract: The analysis of big data demands computer-aided, or even automated, model building; such data are extremely difficult to analyze with traditional statistical models. Deep learning has proved successful on a variety of challenging problems, such as AlphaGo, driverless cars, and image classification. Our understanding of deep learning, however, remains limited, which hinders its full development. In this talk, we study the capacity and generalization properties of deep neural networks (DNNs) under different scenarios of weight normalization. We establish an upper bound on the Rademacher complexity of this family. In particular, with an L_{1,q} normalization, we obtain architecture-independent capacity control. For the regression problem, we provide both a generalization bound and an approximation bound: both the generalization and approximation errors can be controlled by an L_{1,\infty} weight normalized neural network without relying on the network width and depth.
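As a rough illustration of the L_{p,q} weight normalization the abstract refers to, the sketch below follows one common convention (my own assumption, not taken from the talk): take the L_p norm of each neuron's incoming weights (each row of a layer's weight matrix), then the L_q norm of those values across neurons. With p = 1 and q = \infty this reduces to the maximum row-wise L1 norm, and rescaling the matrix onto the unit norm ball is one simple way to enforce the constraint.

```python
import numpy as np

def lpq_norm(W, p=1.0, q=np.inf):
    """L_{p,q} norm of a weight matrix: the L_p norm of each row
    (one neuron's incoming weights), then the L_q norm of the
    resulting vector across neurons."""
    row_norms = np.linalg.norm(W, ord=p, axis=1)
    if np.isinf(q):
        return row_norms.max()
    return np.linalg.norm(row_norms, ord=q)

def normalize_weights(W, p=1.0, q=np.inf, c=1.0):
    """Rescale W so that its L_{p,q} norm is at most c, i.e. project
    onto the norm ball used for capacity control."""
    n = lpq_norm(W, p, q)
    return W if n <= c else W * (c / n)

W = np.array([[1.0, -2.0],
              [0.5,  0.5]])
print(lpq_norm(W, p=1, q=np.inf))              # 3.0: max row-wise L1 norm
print(lpq_norm(normalize_weights(W, c=1.0)))   # 1.0 after rescaling
```

Constraining every layer's weight matrix this way is what makes the capacity bound independent of width and depth: the norm budget, not the architecture, limits the function class.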
About the Speaker:
Prof. Xiao Wang received his bachelor's and master's degrees from the University of Science and Technology of China and his Ph.D. from the University of Michigan, Ann Arbor. Since 2005 he has been on the faculty of Purdue University, serving successively as Assistant Professor, Associate Professor, and Professor. He has published dozens of papers in leading journals, including the top statistics journals Annals of Statistics, Journal of the American Statistical Association, and Biometrika, and the top optimization journal SIAM Journal on Control and Optimization.