
Rong Ge : Learning Two-Layer Neural Networks with Symmetric Inputs (Feb 27, 2019 11:55 AM)

Deep learning has been extremely successful in practice. However, existing guarantees for learning neural networks are limited even when the network has only two layers: they require strong assumptions either on the input distribution or on the norm of the weight vectors. In this talk we give a new algorithm that is guaranteed to learn a two-layer neural network under much milder assumptions on the input distribution. Our algorithm works whenever the input distribution is symmetric, meaning that the two inputs $x$ and $-x$ have the same probability.
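A minimal sketch (not from the talk) illustrating the symmetry assumption: a distribution is symmetric when $x$ and $-x$ are equally likely. Any base distribution can be symmetrized by an independent random sign flip, and under symmetry all odd moments vanish. The base distribution and sample size below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_symmetric(n, d):
    # Draw from an arbitrary, asymmetric base distribution (here a
    # shifted exponential), then multiply each sample by an independent
    # uniform +/-1 sign; the result assigns equal probability to x and -x.
    z = rng.exponential(scale=1.0, size=(n, d)) + 0.5
    s = rng.choice([-1.0, 1.0], size=(n, 1))
    return s * z

x = sample_symmetric(200_000, 3)

# Under symmetry every odd moment is zero, e.g. E[x] = 0 and E[x^3] = 0,
# so the empirical versions should be close to zero.
print(np.abs(x.mean(axis=0)).max())
print(np.abs((x**3).mean(axis=0)).max())
```

With 200,000 samples both printed values are small (on the order of the sampling error), even though the base distribution itself has nonzero mean and third moment.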

Based on joint work with Rohith Kuditipudi, Zhize Li, and Xiang Wang.
