ANC Workshop - 01/12/2020
Speaker: Angus Chadwick
Title: Cortical network dynamics, learning, and sensory coding: theory and experiment
Cortical circuits integrate input dynamically through recurrent interactions. How cortical dynamics shape integration and transmission of sensory information is not well understood. In this talk, I will present some theoretical results on neural coding in recurrently connected networks of neurons driven by noisy sensory input. Using Fisher Information to quantify the capacity of a downstream decoder to discriminate stimuli based on network output, I will show how general principles relating network dynamics to sensory coding emerge. To test predictions of the theory, I will then present an analysis of data imaged from populations of neurons in visual cortex of mice as they learned to discriminate visual features. By fitting a simple dynamical system model to the population activity, I will show how cortical dynamics reorganise over learning to selectively enhance integration of relevant sensory input. These findings suggest that temporal dynamics are tuned and continuously remodelled to optimise integration and transmission of relevant information.
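The abstract's use of Fisher Information to quantify discriminability can be illustrated with the standard linear Fisher information formula, J = f'(s)ᵀ Σ⁻¹ f'(s), where f'(s) is the vector of tuning-curve derivatives and Σ the noise covariance of the population response. The following is a minimal sketch with made-up numbers, not the speaker's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population of N neurons whose mean response varies with the
# stimulus s; `tuning` stands in for the derivative vector f'(s).
N = 20
tuning = rng.normal(size=N)

# Correlated Gaussian noise: independent variance 0.1 on the diagonal plus
# a uniform pairwise covariance of 0.02 (a valid positive-definite matrix).
noise_cov = 0.1 * np.eye(N) + 0.02

# Linear Fisher information: J = f'(s)^T Sigma^{-1} f'(s).
# Larger J means a downstream decoder can discriminate nearby stimuli better.
J = tuning @ np.linalg.solve(noise_cov, tuning)
print(f"Fisher information: {J:.2f}")
```

Correlated noise that overlaps with the tuning direction reduces J, which is one way network dynamics can shape the information available to a decoder.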
Speaker: Valentin Radu
Title: Performance Aware Convolutional Neural Network Channel Pruning
Abstract: Convolutional Neural Networks are increasingly being used for applications on smaller and smaller devices, often simply by porting large models designed for the server space to the edge with some compression. One such model compression technique is channel pruning.
We are seeing more devices developed with an embedded GPU, which is ideal for the parallel computations of neural networks at a lower energy cost per operation. These computations are performed by highly optimised, specialised libraries. We find that these libraries are optimised for the most common network shapes, which makes naive channel pruning inefficient. In this talk I will present an evaluation of three popular higher-level libraries that use the embedded GPU for neural computations. I will show that choices intended to optimise the neural computations can have a detrimental effect on overall system performance.
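A common form of channel pruning ranks a convolutional layer's output channels by the L1 norm of their filters and discards the weakest. The sketch below (with NumPy standing in for a real framework, and all shapes hypothetical) shows the basic idea:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical conv-layer weights: (out_channels, in_channels, kH, kW).
weights = rng.normal(size=(64, 32, 3, 3))

def prune_channels(w, keep):
    """Rank output channels by filter L1 norm and keep the `keep` largest."""
    scores = np.abs(w).sum(axis=(1, 2, 3))      # one importance score per channel
    kept = np.sort(np.argsort(scores)[-keep:])  # kept indices, in original order
    return w[kept], kept

pruned, kept = prune_channels(weights, keep=48)
print(pruned.shape)  # (48, 32, 3, 3)
```

The abstract's point is that the resulting channel count matters in practice: GPU libraries are often tuned for common shapes (e.g. powers of two), so pruning to an arbitrary count can leave the layer on a slower code path even though it performs fewer operations.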
Bio: Valentin Radu is a Lecturer in the Department of Computer Science at the University of Sheffield. His research aims to make intelligent systems ubiquitous by advancing distributed machine learning and efficient edge inference. Before joining the University of Sheffield, Valentin was a researcher at the University of Edinburgh, where he also received his PhD. He is the recipient of a Google IoT Research Award and several SICSA awards. His research has been influenced by his visits to the University of Cambridge (2016), Intel R&D Ireland (2017) and Samsung (2015).