Date: 24th March 2023
Time: 14:00 - 16:00
Location: IF 1.15
Talks

Title: Constrained Training of Neural Networks via Theorem Proving
Speakers: Mark Chevallier and Matt Whyte
Abstract:

Robots can be trained to move using neural networks, but the process can be lengthy and offers no guarantee that safety rules are learned. A neurosymbolic approach brings the benefits of formal logical constraints: safety rules that can be injected directly into the training process. We introduce our work on using logical constraints to assist the training of neural networks via theorem proving. In addition to learning via imitation, the neural network evaluates and learns from the error caused by breaching these constraints. Our process formally proves the soundness of the logical loss function and guarantees a correct implementation of that function through code generation. We discuss our existing work using linear temporal logic to train dynamic movement primitives, and go on to outline future extensions.
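
To illustrate the general idea of training against both an imitation signal and a logical safety constraint, here is a minimal sketch in PyTorch. It is not the speakers' implementation: the network, the "keep a safe distance from an obstacle" rule, and all names (SAFE_DISTANCE, constraint_loss, training_step) are illustrative assumptions, and the soft penalty stands in for the formally verified logical loss described in the talk.

```python
# Minimal sketch: imitation loss plus a soft penalty for violating a
# hypothetical safety rule ("stay at least SAFE_DISTANCE from the obstacle").
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 2))
optimiser = torch.optim.Adam(net.parameters(), lr=1e-3)

SAFE_DISTANCE = 0.5                      # hypothetical safety threshold
obstacle = torch.tensor([0.3, 0.3])      # hypothetical obstacle position

def constraint_loss(predicted_points):
    # Penalise any predicted point that comes closer to the obstacle
    # than SAFE_DISTANCE; zero when the constraint is satisfied.
    dist = torch.norm(predicted_points - obstacle, dim=-1)
    return torch.relu(SAFE_DISTANCE - dist).mean()

def training_step(states, expert_actions, weight=1.0):
    predicted = net(states)
    imitation = nn.functional.mse_loss(predicted, expert_actions)
    logical = constraint_loss(predicted)
    loss = imitation + weight * logical   # combined training objective
    optimiser.zero_grad()
    loss.backward()
    optimiser.step()
    return loss.item()
```

In the work presented, the constraint term is derived from linear temporal logic formulas over trajectories, its soundness is proved in a theorem prover, and the executable loss is obtained by code generation rather than written by hand as above.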