Time: June 4, 2019, 1:15 p.m. (CEST)
Lecturer: Sebastian Goldt (Institut de Physique Théorique, Université Paris)
Machine learning techniques, and deep neural networks in particular, have recently achieved super-human performance in tasks as different as image classification, natural language processing, and playing board games like Chess or Go. However, our theoretical understanding of deep learning has not quite kept pace. In fact, while there is a general consensus that a comprehensive theory of deep learning is lacking, the precise nature of the key open questions is still debated. In this talk, I will start with a quick introduction to deep neural networks and outline four key challenges that, in my view, a theory of deep learning should address. I will then focus on one of these problems, namely the observation that deep neural networks seem to be immune to a certain kind of over-fitting, and show how models and tools from statistical physics can help analyse this problem.