Seminar (2019/7/31): Deep limits of residual neural networks

2019-07-31, 13:00-14:00
Faculty of Science Building #3, Room 307
Yves van Gennip (Technische Universiteit Delft)
Through recent work of Haber and Ruthotto and others, it has been recognised that certain neural network architectures can be interpreted as discretised ordinary differential equations (ODEs). In this talk we will see an application of a method developed by Slepcev and Garcia Trillos which allows us to make this interpretation rigorous in a variational framework: we will show that the training of a residual neural network can be formulated as a constrained discrete variational problem whose deep-layer limit (i.e. the limit as the number of layers tends to infinity) is given by a continuum variational problem constrained by an ODE.
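The ODE interpretation underlying the talk can be illustrated with a minimal sketch: a residual layer x ← x + h f(x) is one forward-Euler step of the ODE x'(t) = f(x(t)), so as the number of layers grows (with total "time" fixed) the network's output approaches the ODE solution. The scalar map f(x) = tanh(x) below is purely illustrative and not the architecture from the talk.

```python
import math

def resnet_forward(x0, n_layers, T=1.0):
    """Forward pass of a toy residual network whose every block is
    x <- x + h * tanh(x), i.e. a forward-Euler step of x' = tanh(x)
    on [0, T] with step size h = T / n_layers."""
    h = T / n_layers
    x = x0
    for _ in range(n_layers):
        x = x + h * math.tanh(x)  # one residual block = one Euler step
    return x

# The ODE x' = tanh(x), x(0) = x0 has the closed-form solution
# x(t) = asinh(sinh(x0) * exp(t)), so the deep-layer limit is computable:
exact = math.asinh(math.sinh(1.0) * math.e)

shallow = resnet_forward(1.0, 10)
deep = resnet_forward(1.0, 10000)
```

Increasing the depth shrinks the gap to the continuum solution, which is the discrete-to-continuum passage that the variational framework in the talk makes rigorous.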

This is joint work with Matthew Thorpe.