Title: Demystifying the Neural Tangent Kernel
Speaker: Prof. Yi Da Xu (HKBU)
Time: Monday, November 29, 2021, 10:00-11:00 AM
Tencent Conference: 971 465 049
Inviter: Prof. Delu Zeng
Abstract: For the better part of the last decade, deep neural networks have achieved monumental empirical successes across many fields of machine learning. However, much of their inner workings remains a mystery. Recently, researchers have tried to unravel this mystery by studying what happens as the width of a neural network tends to infinity, and several important studies have addressed the expressivity and trainability of neural networks in this limit. In this talk, I will begin by briefly describing what a Gaussian process is and why an infinitely wide neural network can be expressed as a Gaussian process. Then, through the lens of the so-called Neural Tangent Kernel, I will explain why training a neural network with gradient descent can be regarded as solving a linear ordinary differential equation. Finally, I will point out some exciting research directions in this area.
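The abstract's central claim, that gradient-descent training can be viewed as solving a linear ODE in function space, can be illustrated with a small numerical sketch. This is my own illustration, not material from the talk: the tiny one-hidden-layer tanh network, the NTK parametrization, the toy data, and the step sizes below are all assumptions chosen for demonstration.

```python
# Illustrative sketch (assumptions: tiny 1-hidden-layer tanh network in NTK
# parametrization, squared loss, toy 1-D data; none of this is from the talk).
import numpy as np

rng = np.random.default_rng(0)

m = 512                                  # hidden width; larger -> closer to the NTK regime
X = np.array([[-1.0], [0.5], [2.0]])     # three toy 1-D training inputs
y = np.array([0.0, 1.0, -1.0])           # toy targets

W1 = rng.normal(size=(1, m))             # input-to-hidden weights
v = rng.normal(size=m)                   # hidden-to-output weights

def net_and_jac(W1, v):
    """Outputs f(X) = tanh(X W1) v / sqrt(m) and the Jacobian of the
    outputs with respect to all parameters (computed analytically)."""
    H = np.tanh(X @ W1)                          # (3, m) hidden activations
    out = H @ v / np.sqrt(m)
    J_v = H / np.sqrt(m)                         # d out / d v
    J_W1 = X * (1.0 - H**2) * v / np.sqrt(m)     # d out / d W1 (W1 is 1 x m)
    return out, np.hstack([J_W1, J_v])           # Jacobian, shape (3, 2m)

# Empirical Neural Tangent Kernel at initialization: K = J J^T on the inputs.
out0, J0 = net_and_jac(W1, v)
K = J0 @ J0.T
r0 = out0 - y                                    # initial residual f(0) - y

# Linear-ODE view: under gradient flow on 0.5 * ||f - y||^2 with a constant
# kernel, the residual obeys dr/dt = -K r, so r(t) = expm(-K t) r0.
eta, steps = 0.005, 100                          # small steps approximate the flow
t = eta * steps
evals, U = np.linalg.eigh(K)
r_pred = U @ (np.exp(-evals * t) * (U.T @ r0))

# Actual (discrete) gradient descent on the parameters, for comparison.
W1g, vg = W1.copy(), v.copy()
for _ in range(steps):
    out, J = net_and_jac(W1g, vg)
    g = J.T @ (out - y)                          # gradient of the squared loss
    W1g -= eta * g[:m].reshape(1, m)
    vg -= eta * g[m:]
r_gd = net_and_jac(W1g, vg)[0] - y

print("||r0||     =", np.linalg.norm(r0))
print("||r_gd||   =", np.linalg.norm(r_gd))
print("||r_pred|| =", np.linalg.norm(r_pred))
```

At width 512 the kernel barely moves during training, so the closed-form ODE solution tracks actual gradient descent closely; in the infinite-width limit studied in the talk, the agreement becomes exact.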