AIAI Seminar - 28 February 2022 - Talks by James Vaughan, Jonathan Feldstein and Andreas Bueff

Talk by James Vaughan

Title:

Mapping the Network of Formal Mathematics

Abstract:

Proof assistants, such as Isabelle, contain vast libraries of mathematics that a user can build on to formalise and prove their own theories. However, finding the particular facts that are relevant to the current problem is frequently a challenge for both humans and machines. In our work, we approach this challenge from the perspective of network science, viewing the library as a growing network of mathematical dependencies. I will discuss how we plan to improve our model of each fact's function by aligning facts with the mathematical concepts contained in collaborative knowledge graphs, such as DBpedia.
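As a rough illustration of this network-science framing (not the speaker's actual pipeline), the sketch below builds a small directed dependency graph over hypothetical fact names and ranks facts with PageRank as one possible centrality measure; the networkx usage and all fact names are illustrative assumptions.

```python
# Minimal sketch: model a proof library as a directed dependency network and
# rank facts by a simple centrality measure. The edge list is a hypothetical
# stand-in for dependencies mined from an Isabelle library.
import networkx as nx

# (theorem, fact it depends on) -- illustrative names only
dependencies = [
    ("fundamental_theorem_of_arithmetic", "prime_factorisation_exists"),
    ("fundamental_theorem_of_arithmetic", "prime_factorisation_unique"),
    ("prime_factorisation_exists", "strong_induction"),
    ("prime_factorisation_unique", "euclid_lemma"),
    ("euclid_lemma", "bezout_identity"),
]

graph = nx.DiGraph()
graph.add_edges_from(dependencies)

# PageRank as one possible proxy for how "foundational" a fact is in the network.
scores = nx.pagerank(graph)
for fact, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {fact}")
```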

 

Talk by Jonathan Feldstein  

Title:   

Neuro-Symbolic AI - a high-level overview

Abstract:

Deep neural networks (DNNs) have achieved outstanding levels of precision on many tasks over the past decade. However, DNNs still have several major limitations: they are data-hungry, they are black-box systems, which makes it difficult to understand how decisions have been reached, and they do not allow easy integration of prior domain-expert knowledge. Statistical relational learning (SRL), on the other hand, addresses these limitations, but SRL frameworks by themselves usually scale poorly and require expensive knowledge engineering. Neuro-symbolic AI tries to combine the two areas to take advantage of the benefits of both approaches. In this talk, I will give a high-level introduction to the different branches of neuro-symbolic AI, along with their benefits and limitations, to help machine learning engineers understand how to easily integrate logic into their neural networks.
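As one illustration of the general idea, and not of the specific methods covered in the talk, the sketch below shows a common neuro-symbolic pattern: a propositional rule is turned into a differentiable penalty on a network's predicted probabilities and added to the usual data loss. The network shape, rule, and weighting factor are all made-up assumptions.

```python
# Toy illustration of one way logic can be injected into a neural network:
# encode the rule "rainy -> wet_ground" as a differentiable penalty on the
# network's predicted probabilities (product fuzzy-logic semantics).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2), nn.Sigmoid())

def rule_penalty(probs):
    # probs[:, 0] = P(rainy), probs[:, 1] = P(wet_ground)
    # "rainy -> wet_ground" is violated to the degree P(rainy) * (1 - P(wet_ground)).
    rainy, wet = probs[:, 0], probs[:, 1]
    return (rainy * (1.0 - wet)).mean()

x = torch.randn(32, 8)                      # dummy features
y = torch.randint(0, 2, (32, 2)).float()    # dummy labels for the two atoms
probs = net(x)

bce = nn.functional.binary_cross_entropy(probs, y)
loss = bce + 0.5 * rule_penalty(probs)      # data term + weighted logic term
loss.backward()
```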

 

Talk by Andreas Bueff

Title:

Learning Explanatory Logical Rules in Mixed Discrete-Continuous Domains: A Neuro-Symbolic Approach

Abstract: 

One approach to explaining the hierarchical levels of understanding within a machine learning model is the symbolic method of inductive logic programming (ILP), which is data-efficient and capable of learning first-order logic rules that entail the observed data behaviours, typically in a static setting. Recent work has extended the ILP framework to allow end-to-end learning, and this has prompted research that combines deep learning architectures with the relational models derived from ILP. In particular, a recently proposed ILP extension, so-called differentiable neural logic (dNL) networks, uses differentiable neural logic layers to learn Boolean functions. The idea is to define Boolean functions that can be combined in a cascading architecture akin to a neural network. This gives deep learning an explicit symbolic representation that is interpretable, and it recasts ILP as an optimization problem. The dNL architecture uses membership weights in conjunctive and disjunctive layers with forward chaining, removing the need for rule templates when solving ILP problems.
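A simplified sketch of this idea is given below, assuming a reading of dNL in which each neuron holds trainable membership weights that softly select which input atoms take part in a fuzzy conjunction or disjunction; this is an illustrative approximation, not the exact implementation discussed in the talk.

```python
# Simplified sketch of dNL-style neurons: trainable membership weights decide
# which input atoms participate in a soft conjunction / disjunction.
import torch
import torch.nn as nn

class ConjunctionNeuron(nn.Module):
    def __init__(self, n_inputs):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_inputs))

    def forward(self, x):                 # x in [0, 1], shape (batch, n_inputs)
        m = torch.sigmoid(self.w)         # soft membership of each atom in the rule body
        # an excluded atom (m ~ 0) contributes 1 and drops out of the product
        return torch.prod(1.0 - m * (1.0 - x), dim=-1)

class DisjunctionNeuron(nn.Module):
    def __init__(self, n_inputs):
        super().__init__()
        self.w = nn.Parameter(torch.randn(n_inputs))

    def forward(self, x):
        m = torch.sigmoid(self.w)
        return 1.0 - torch.prod(1.0 - m * x, dim=-1)

# AND over selected atoms (one clause body); such neurons can be stacked with
# disjunction neurons to mirror a set of candidate rules.
x = torch.rand(4, 3)                      # 4 examples, 3 ground atoms
conj = ConjunctionNeuron(3)
print(conj(x))                            # one soft truth value per example
```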

The continuous dNL framework has seen little use in the literature, so in this work we combine continuous and non-linear predicate functions in an extension to the original dNL framework. This is accomplished by incorporating standard non-linear functions such as the power and sine functions, as well as operations such as multiplication and addition. Given a tabular dataset comprising purely non-linear mathematical relations, our non-linear dNL solver can extract the non-linear functions in a clearly interpretable manner. We parse the extracted non-linear functions and assess their performance on the training set by calculating the loss against the training outputs. With this regression loss, we can optimize the learning process further for the symbolic framework and provide a clear method for learning a concise non-linear function from mathematically constructed data.
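The evaluation step described here can be illustrated with a small, hypothetical sketch: a candidate expression (a made-up example, not a learned result) is evaluated on the training inputs and scored with a mean-squared-error regression loss against the training outputs.

```python
# Sketch of the described evaluation step: evaluate a candidate symbolic
# expression against the training targets and score it with a regression loss.
import numpy as np

def extracted_fn(x1, x2):
    # hypothetical extracted rule, e.g. "y = x1**2 + sin(x2)"
    return x1 ** 2 + np.sin(x2)

rng = np.random.default_rng(0)
x1, x2 = rng.uniform(-2, 2, 200), rng.uniform(-2, 2, 200)
y_true = x1 ** 2 + np.sin(x2)             # training outputs (generated here for the example)

y_pred = extracted_fn(x1, x2)
mse = np.mean((y_pred - y_true) ** 2)     # regression loss used to rank/refine candidates
print(f"training MSE of extracted function: {mse:.4f}")
```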

Venue: Online