Salon with Greg Yang

Greg talks about his recent paper "Feature Learning in Infinite-Width Neural Networks".


This salon features Greg Yang of Microsoft Research AI, presenting his recent paper "Feature Learning in Infinite-Width Neural Networks". The paper shows that, when the infinite-width limit is taken with the right parametrization, infinitely wide neural networks can learn features and benefit from pretraining, unlike the limits based on the neural tangent kernel (NTK) and the neural network Gaussian process (NNGP).
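
As a rough illustration of the distinction (a minimal sketch, not taken from the talk, and assuming a one-hidden-layer network of width n rather than the paper's general abc-parametrization): under an NTK-style scaling, the output is normalized by 1/√n and, with an O(1) learning rate, each hidden weight moves only O(1/√n) during training, so in the limit the features stay essentially frozen and the network behaves like a kernel method. Under a mean-field-style scaling, the output is normalized by 1/n and the learning rate is scaled up accordingly, so individual weights move by Θ(1) even as the width goes to infinity, i.e. the features genuinely change.

```latex
% NTK-style scaling: kernel limit, features effectively frozen as n -> infinity
f_{\mathrm{NTK}}(x) = \frac{1}{\sqrt{n}} \sum_{i=1}^{n} a_i \,\phi(w_i^\top x),
\qquad \eta = \Theta(1), \quad \|\Delta w_i\| = O\!\left(\tfrac{1}{\sqrt{n}}\right)

% Mean-field-style scaling: feature-learning limit, weights move even as n -> infinity
f_{\mathrm{MF}}(x) = \frac{1}{n} \sum_{i=1}^{n} a_i \,\phi(w_i^\top x),
\qquad \eta = \Theta(n), \quad \|\Delta w_i\| = \Theta(1)
```

Here a_i and w_i are the output and hidden weights, φ is the activation, and η is the SGD learning rate; the paper's contribution is a general classification of such parametrizations and the construction of the maximal feature-learning limit for deep networks.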