Time: 15:30-16:30, Thursday, July 4, 2019
Venue: Room A1514, Science Building, North Zhongshan Road Campus
Speaker: Professor Xiao Wang, Purdue University
Abstract: The analysis of big data demands computer-aided or even automated model building, as such data are extremely difficult to analyze with traditional statistical models. Deep learning has proved successful on a variety of challenging problems such as AlphaGo, driverless cars, and image classification. Our understanding of deep learning, however, remains limited, which hinders its full development. In this talk, we study the capacity and generalization properties of deep neural networks (DNNs) under different scenarios of weight normalization. We establish upper bounds on the Rademacher complexities of this family. In particular, with an L_{1,q} normalization, we obtain architecture-independent capacity control. For the regression problem, we provide both a generalization bound and an approximation bound. It is shown that both the generalization and approximation errors can be controlled by the L_{1,\infty} weight-normalized neural network without relying on the network width and depth.
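For readers unfamiliar with the L_{1,\infty} weight normalization mentioned above, the following minimal Python sketch illustrates one common convention, in which the L_{1,\infty} norm of a weight matrix is the maximum L1 norm over its rows; rescaling every row onto an L1 ball of radius c then caps the layer's L_{1,\infty} norm at c. The function name and the row-wise convention are illustrative assumptions, not the speaker's implementation.

```python
import numpy as np

# Minimal sketch (not the speaker's code): L_{1,inf} weight normalization
# under the assumed convention ||W||_{1,inf} = max_i sum_j |W[i, j]|.

def l1_inf_project(W, c=1.0):
    """Rescale each row of W so that its L1 norm is at most c.

    After projection, max_i ||W[i]||_1 <= c, i.e. ||W||_{1,inf} <= c,
    which is the kind of architecture-independent capacity control
    described in the abstract (under the assumed row-wise convention).
    """
    row_l1 = np.abs(W).sum(axis=1, keepdims=True)           # L1 norm of each row
    scale = np.minimum(1.0, c / np.maximum(row_l1, 1e-12))  # shrink only rows exceeding c
    return W * scale

# Usage: normalize a random layer's weights and verify the norm bound.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))
W_hat = l1_inf_project(W, c=1.0)
assert np.abs(W_hat).sum(axis=1).max() <= 1.0 + 1e-9
```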
Speaker's Bio: Professor Xiao Wang received his bachelor's and master's degrees from the University of Science and Technology of China and his Ph.D. from the University of Michigan, Ann Arbor. Since 2005, he has been on the faculty of Purdue University, serving successively as Assistant Professor, Associate Professor, and Professor. He has published dozens of papers in journals including the top statistics journals Annals of Statistics, Journal of the American Statistical Association, and Biometrika, as well as the top optimization journal SIAM Journal on Control and Optimization.