Salon with Diganta Misra
Diganta talks about smooth activations, robustness, and catastrophic forgetting.
Created on April 21 | Last edited on April 21
This salon features Diganta Misra of Hong Kong University and Landskape, who shares his thoughts on the role of smooth non-linearities, such as Swish and Mish, in the robustness of neural networks and in catastrophic forgetting.
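For reference, the two smooth activations mentioned above can be written directly from their published definitions. This is a minimal scalar sketch for illustration, not code from the talk itself:

```python
import math

def softplus(x):
    # ln(1 + e^x): a smooth approximation of ReLU
    return math.log1p(math.exp(x))

def mish(x):
    # Mish: x * tanh(softplus(x)) -- smooth and non-monotonic
    return x * math.tanh(softplus(x))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x); beta = 1 is also known as SiLU
    return x / (1.0 + math.exp(-beta * x))
```

Both functions pass small negative values through with a slight dip rather than clamping them to zero as ReLU does, which is one source of their smoothness.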