**Title:** Compositional Backpropagation

**Speaker:** Filip Smola

**Abstract:**

In this talk I will first give an overview of the paper “Backprop as Functor: A compositional perspective on supervised learning” by Fong, Spivak and Tuyéras. The paper takes a structural perspective on backpropagation, defining a functor that turns any differentiable parametrised function into a supervised learning algorithm in a compositional way. This lets us, for example, factor neural networks into subunits that are simpler to analyse and, more interestingly, do so in more complex ways than simply layer by layer. I will then give an overview of our progress in mechanising the contents of the paper in Isabelle and describe some of the interesting problems we have faced so far.
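To give a flavour of the compositional structure, here is a minimal sketch of the paper's notion of a learner: a parameter value together with an implementation `I(p, a)`, an update `U(p, a, b)`, and a request `r(p, a, b)` that passes a revised input back to an upstream learner. The `compose` function follows the paper's sequential composition rule; the concrete `linear_learner` (a scalar model `y = p * a` trained by gradient descent on squared error, with an assumed learning rate `lr`) is an illustrative instance, not taken from the paper.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Learner:
    p: Any                   # current parameter value
    implement: Callable      # I : P x A -> B
    update: Callable         # U : P x A x B -> P
    request: Callable        # r : P x A x B -> A

def compose(g: Learner, f: Learner) -> Learner:
    """Sequential composition g . f (apply f first, then g)."""
    def implement(pq, a):
        p, q = pq
        return g.implement(q, f.implement(p, a))
    def update(pq, a, b):
        p, q = pq
        mid = f.implement(p, a)
        # g updates on its own input/output pair; f updates against
        # the revised intermediate value requested by g.
        return (f.update(p, a, g.request(q, mid, b)),
                g.update(q, mid, b))
    def request(pq, a, b):
        p, q = pq
        mid = f.implement(p, a)
        return f.request(p, a, g.request(q, mid, b))
    return Learner((f.p, g.p), implement, update, request)

def linear_learner(p0: float, lr: float = 0.1) -> Learner:
    """Illustrative scalar learner y = p * a, squared-error gradient steps."""
    def implement(p, a): return p * a
    def update(p, a, b): return p - lr * 2 * (p * a - b) * a
    def request(p, a, b): return a - lr * 2 * (p * a - b) * p
    return Learner(p0, implement, update, request)

# Composing two linear learners gives a two-"layer" learner whose
# training step is assembled from the pieces, illustrating how a
# network factors into simpler subunits.
h = compose(linear_learner(3.0), linear_learner(2.0))
```

When the supplied target matches the composite's output, the composed update leaves both parameters fixed, as one would expect of a gradient step at zero error.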