
Machine Learning and Friends Lunch (Online)

Thursday, 04/29/2021 11:45am to 1:15pm
Virtual via Zoom
Speaker: Adam Dziedzic

Title: "CaPC Learning: Confidential and Private Collaborative Learning"

Abstract: Machine learning benefits from large training datasets, which a single entity may not always be able to collect on its own, especially when the data are privacy-sensitive. In many contexts, such as healthcare and finance, separate parties may wish to collaborate and learn from each other's data but are prevented from doing so by privacy regulations. Some regulations prevent explicit sharing of data between parties, e.g., by joining datasets in a central location (confidentiality). Others also limit implicit sharing of data, e.g., through model predictions (privacy). There is currently no method that enables machine learning in such a setting, where both confidentiality and privacy must be preserved to prevent both explicit and implicit sharing of data. Federated learning provides only confidentiality, not privacy, since the gradients that parties share still contain private information. Differentially private learning assumes unreasonably large datasets. Furthermore, both of these learning paradigms produce a central model whose architecture must be agreed upon by all parties in advance, rather than enabling collaborative learning in which each party learns and improves its own local model. We introduce Confidential and Private Collaborative (CaPC) learning, the first method that provably achieves both confidentiality and privacy in a collaborative setting. We leverage secure multi-party computation (MPC), homomorphic encryption (HE), and other techniques in combination with privately aggregated teacher models. We demonstrate how CaPC allows participants to collaborate without having to explicitly join their training sets or train a central model. Each party is able to improve the accuracy and fairness of its model, even when its model already performs well on its own dataset, or when datasets are not IID and model architectures are heterogeneous across parties.
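
For intuition, here is a minimal sketch of the PATE-style noisy vote aggregation that privately aggregated teacher models (mentioned in the abstract) are based on. This is an illustration under stated assumptions, not the CaPC protocol itself: CaPC additionally runs each answering party's inference under MPC/HE so that queries and votes stay confidential, which is omitted here, and the function and parameter names (noisy_teacher_aggregation, sigma) are hypothetical.

    import numpy as np

    # Illustrative sketch only: PATE-style noisy vote aggregation.
    # CaPC's confidential inference layer (MPC/HE) is omitted; here
    # the votes are visible in the clear for simplicity.

    def noisy_teacher_aggregation(teacher_votes, num_classes, sigma, rng=None):
        """Return a privately aggregated label for one unlabeled query.

        teacher_votes : array of predicted class indices, one per answering party
        num_classes   : size of the label space
        sigma         : Gaussian noise scale on the vote histogram
                        (larger sigma -> stronger privacy, noisier labels)
        """
        rng = rng or np.random.default_rng()
        histogram = np.bincount(teacher_votes, minlength=num_classes).astype(float)
        # Noise the per-class counts so no single party's vote is revealed
        # by the released label.
        histogram += rng.normal(0.0, sigma, size=num_classes)
        return int(np.argmax(histogram))

    # Example: 10 answering parties vote on one query from the querying party.
    votes = np.array([2, 2, 2, 1, 2, 0, 2, 2, 1, 2])
    label = noisy_teacher_aggregation(votes, num_classes=3, sigma=2.0)
    print(label)  # most likely 2; the querying party retrains its local model on it

The noise on the vote histogram is what yields the privacy guarantee for the released label: when enough teachers agree, the noise rarely flips the argmax, yet no individual party's vote can be inferred from the output.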

Bio: Adam is a postdoctoral researcher at the Vector Institute and the University of Toronto, advised by Prof. Nicolas Papernot. He earned his PhD at the University of Chicago, where he was advised by Prof. Sanjay Krishnan and carried out research on band-limited convolutional neural networks as well as the out-of-distribution robustness of pre-trained transformers. Adam obtained his Bachelor's and Master's degrees from Warsaw University of Technology in Poland. He also studied at DTU (Technical University of Denmark) and carried out research on databases in the DIAS group at EPFL, Switzerland. He was a PhD intern at Microsoft Research, where he worked on recommending hybrid physical designs (B+-trees and columnstores) for SQL Server. He has also held internships at CERN (Geneva, Switzerland), Barclays Investment Bank (London, UK), and Google (Madison, USA).

To obtain the Zoom link for this event, please see the event announcements from MLFL on the college email lists or contact Kalpesh Krishna.
