Jakub Konečný; Federated Learning: Privacy-Preserving Collaborative Machine Learning without Centralized Training Data

Jan 30th, 2018, 12:00pm
CSE 403
Jakub Konečný, Google

Abstract: Federated Learning is a machine learning setting where the goal is to
train a high-quality centralized model while the training data remains
distributed over a large number of clients, each with an unreliable and
relatively slow network connection. We consider learning algorithms
for this setting in which, on each round, each client independently
computes an update to the current model based on its local data and
communicates this update to a central server, where the client-side
updates are aggregated to compute a new global model.
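The round structure described above can be sketched in a few lines of Python. This is only an illustrative toy (a linear model trained with plain SGD and weighted averaging of client updates), not Google's implementation; names such as local_update and federated_round are invented for the example.

    # Toy sketch of one federated learning round: each client computes a local
    # update from its own data, and the server averages the updates.
    import numpy as np

    def local_update(global_model, client_data, lr=0.1, epochs=1):
        """Run a few SGD steps on one client's data; return only the model delta."""
        model = global_model.copy()
        for _ in range(epochs):
            for x, y in client_data:
                grad = (model @ x - y) * x   # squared-error gradient for a linear model
                model -= lr * grad
        return model - global_model          # the update, not the data, is sent to the server

    def federated_round(global_model, clients):
        """Aggregate client updates by example-weighted averaging into a new global model."""
        updates, weights = [], []
        for data in clients:
            updates.append(local_update(global_model, data))
            weights.append(len(data))         # weight each client by its number of examples
        avg_update = np.average(updates, axis=0, weights=weights)
        return global_model + avg_update

    # Usage: three clients, each holding a handful of (x, y) pairs.
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    clients = [
        [(x, x @ true_w + 0.01 * rng.normal()) for x in rng.normal(size=(5, 2))]
        for _ in range(3)
    ]
    model = np.zeros(2)
    for _ in range(20):
        model = federated_round(model, clients)
    print(model)  # roughly recovers true_w

Note that in this sketch only model updates leave the clients; the raw training data never does, which is the privacy-motivating property of the setting.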

In this talk, I will introduce the underlying algorithms, and present
several ideas for improving the overall system in terms of
communication efficiency, security, and differential privacy.

Bio: Jakub Konečný is a research scientist at Google working on Federated
Learning, an effort to decentralize machine learning. Prior to joining
Google, Jakub completed his PhD at the University of Edinburgh, focusing on
optimization algorithms for machine learning.