IPAB Seminar-29/04/2021

TITLE: Flower: A Friendly Federated Learning Research Framework

ABSTRACT: Federated Learning (FL) has emerged as a promising technique for edge devices to collaboratively learn a shared prediction model while keeping their training data on the device, thereby decoupling the ability to do machine learning from the need to store potentially privacy-sensitive user data in the cloud. However, despite the rapid progress made in FL in recent years, it remains far too difficult to evaluate FL algorithms under a full range of realistic system constraints (viz. compute, memory, energy, wired/wireless networking) and scale (thousands of federated devices and larger). As a consequence, our understanding of how these factors influence FL performance, and how they should shape the future evolution of FL algorithms, remains very underdeveloped. In this talk, I will describe how we have begun to address this situation by developing Flower -- an open-source framework (http://flower.dev/) built to help bridge this gap in evaluation and design. Through Flower, it becomes relatively simple to measure the impact of common real-world FL situations, such as when devices have limited compute (e.g., an embedded device) or when network speeds are highly varied and unstable. I will highlight early empirical observations, made using Flower, about the implications for existing algorithms under the types of heterogeneous large-scale FL systems we anticipate will increasingly appear. I will also show the benefits of Flower for FL design by detailing our recent study of federated end-to-end speech recognition. Using Flower, we devise a solution able to train effective *federated* speech models even under extremely heterogeneous data: speech samples from 2,000 real users -- this represents the largest successful experiment of its type to date. Finally, to showcase the flexibility of Flower, I will show how it can even be used to assess the carbon footprint of FL in various settings -- to the best of our knowledge, this is the first time FL has been studied from the perspective of its environmental impact.

SPEAKER BIO: Nic Lane (http://niclane.org/) is a Senior Lecturer (Associate Professor) in the Department of Computer Science and Technology at the University of Cambridge, where he leads the Machine Learning Systems Lab (CaMLSys -- http://mlsys.cst.cam.ac.uk/). Alongside his academic role, he is also a Director (On-Device and Distributed Machine Learning) at the Samsung AI Center in Cambridge. Of late, Nic's research has specialized in the study of efficient machine learning, and over the last five years he has pioneered a range of embedded and mobile forms of deep learning. Nic has received multiple best paper awards, including ACM/IEEE IPSN 2017 and two from ACM UbiComp (2012 and 2015). In 2018 and 2019, he (and his co-authors) received the ACM SenSys Test-of-Time award and the ACM SIGMOBILE Test-of-Time award for pioneering research, performed during his PhD, that devised machine learning algorithms used today on devices like smartphones.
Most recently, Nic received the 2020 ACM SIGMOBILE Rockstar award for his contributions to “the understanding of how resource-constrained mobile devices can robustly understand, reason and react to complex user behaviors and environments through new paradigms in learning algorithms and system design.”
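
For readers unfamiliar with Flower, the sketch below shows the kind of federated client the abstract alludes to, written against the NumPyClient interface of the flwr Python package. It is a minimal, illustrative example only and not code from the talk: the toy model, example counts, and server address are placeholders, and exact method signatures vary somewhat across Flower releases.

    # Minimal, illustrative Flower client (toy example, not the speaker's code).
    # NOTE: method signatures differ slightly across flwr releases; this follows
    # the NumPyClient interface (get_parameters / fit / evaluate).
    import numpy as np
    import flwr as fl

    class ToyClient(fl.client.NumPyClient):
        """A toy client whose 'model' is a single NumPy weight vector."""

        def __init__(self):
            self.weights = np.zeros(10, dtype=np.float32)

        def get_parameters(self, config):
            # Return the current local parameters as a list of ndarrays.
            return [self.weights]

        def fit(self, parameters, config):
            # Receive the global parameters, run local "training" (a dummy
            # update here), and return updated parameters plus the number of
            # local examples used, which the server uses for weighting.
            self.weights = parameters[0] + 0.1
            return [self.weights], 100, {}

        def evaluate(self, parameters, config):
            # Evaluate the global parameters on local data; return loss, the
            # local example count, and any metrics.
            loss = float(np.linalg.norm(parameters[0]))
            return loss, 100, {"norm": loss}

    if __name__ == "__main__":
        # Connect this client to a running Flower server (address is a placeholder).
        fl.client.start_numpy_client(server_address="127.0.0.1:8080",
                                     client=ToyClient())

The server side is started separately with fl.server.start_server and an aggregation strategy such as federated averaging; the system-level studies described in the abstract build on this same client/server loop.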

Apr 29 2021

IPAB Seminar-29/04/2021

Nic Lane (University of Cambridge)

Blackboard Collaborate