Federated learning (FL) enables training a machine learning model in a distributed manner: the training data are collected and stored locally by users, e.g., mobile devices or participating institutions. The training is coordinated by a central server and performed iteratively. In each iteration, the server sends the current global model to the users, who update it on their local data and send the local updates back to the server for aggregation.
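As background, the iterative procedure above can be sketched with a toy scalar model. The datasets, learning rate, and the choice of selecting two users per round are illustrative assumptions, not part of the seminar material:

```python
import random

random.seed(0)  # deterministic user selection for this illustration

def local_update(w, data, lr=0.1):
    """One gradient-descent step on a scalar least-squares model w * x ≈ y."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

# Toy local datasets for three users, all roughly consistent with w ≈ 2.
user_data = {
    "A": [(1.0, 2.0), (2.0, 4.1)],
    "B": [(1.5, 3.0), (3.0, 6.2)],
    "C": [(0.5, 0.9), (2.5, 5.0)],
}

w_global = 0.0
for _ in range(50):
    # The server selects a subset of users, sends the current global model,
    # and aggregates (here: averages) the returned local updates.
    selected = random.sample(sorted(user_data), 2)
    updates = [local_update(w_global, user_data[u]) for u in selected]
    w_global = sum(updates) / len(updates)

print(f"global model after 50 rounds: w = {w_global:.2f}")  # settles near 2
```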
FL is intended to protect users' sensitive data, since the training data never leave the user devices. However, it has been shown that the local updates still leak information about the local datasets. To mitigate this leakage, secure aggregation (SecAgg) [1] was proposed: it ensures that the server only obtains the aggregate of the local updates rather than any individual update.
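The core idea of SecAgg can be illustrated with a toy one-shot version based on pairwise additive masks. This is a simplified sketch of the protocol in [1]; the full protocol additionally derives the masks from key agreement and handles user dropouts via secret sharing:

```python
import random

# Each pair of users (u, v) agrees on a shared random mask; u adds it and
# v subtracts it, so all masks cancel in the sum while each individual
# masked update looks uniformly random to the server.
MOD = 2**32  # work in a finite group, as SecAgg does

users = ["A", "B", "C"]
updates = {"A": 5, "B": 11, "C": 7}  # toy scalar local updates

# Pairwise shared masks (in SecAgg, derived from Diffie-Hellman key agreement).
masks = {(u, v): random.randrange(MOD)
         for i, u in enumerate(users) for v in users[i + 1:]}

def masked_update(u):
    y = updates[u]
    for (a, b), m in masks.items():
        if u == a:
            y = (y + m) % MOD   # the lexicographically smaller user adds the mask
        elif u == b:
            y = (y - m) % MOD   # the other user subtracts it
    return y

# The server sums the masked updates; the pairwise masks cancel.
total = sum(masked_update(u) for u in users) % MOD
print(total)  # 23 == 5 + 11 + 7, with no individual update revealed
```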
However, recent work [2] has shown that SecAgg only preserves the users' privacy within a single training round. Due to user selection in federated learning, the server can recover individual local models of the users by observing the aggregated models over multiple training rounds.
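A deliberately simplified numeric example of this effect, assuming (unrealistically) that each user's update is identical across rounds:

```python
# SecAgg hides individual updates within a round, but when the selected user
# subsets vary across rounds, differences of the observed aggregates can
# isolate a single user's contribution.
updates = {"A": 5.0, "B": 11.0, "C": 7.0}  # toy per-user updates

# What the server legitimately observes under SecAgg:
round1 = updates["A"] + updates["B"] + updates["C"]  # users {A, B, C} selected
round2 = updates["A"] + updates["B"]                 # users {A, B} selected

recovered_C = round1 - round2
print(recovered_C)  # 7.0 — C's individual update, despite SecAgg
```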
The goal of this seminar is to study and understand SecAgg [1], the multi-round privacy leakage it suffers from, and how this problem is solved in [2].
[1] Bonawitz, Keith, et al. "Practical Secure Aggregation for Privacy-Preserving Machine Learning." Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security, 2017.
[2] So, Jinhyun, et al. "Securing Secure Aggregation: Mitigating Multi-Round Privacy Leakage in Federated Learning." Proceedings of the AAAI Conference on Artificial Intelligence, vol. 37, no. 8, 2023.
Privacy-Preserving Federated Learning Using Advanced Variational Autoencoders
Description
This thesis explores the use of advanced Variational Autoencoder (VAE) architectures to enhance privacy in Federated Learning. The goal is to design and evaluate methods that learn useful representations while minimizing the risk of sensitive data leakage. The approach will involve extending standard VAE models to better support privacy-preserving objectives in distributed environments. The student will investigate trade-offs between privacy and utility, and benchmark the approach against existing techniques.
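As background, a minimal sketch of the two terms a standard VAE optimizes (the negative ELBO: reconstruction loss plus a KL regularizer), which an advanced architecture in this thesis would extend. A diagonal-Gaussian encoder and a Bernoulli decoder are assumed, and the toy numbers are illustrative:

```python
import math

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims."""
    return sum(0.5 * (math.exp(lv) + m * m - 1.0 - lv)
               for m, lv in zip(mu, log_var))

def bernoulli_recon_loss(x, x_hat):
    """Negative log-likelihood of binary data under the decoder's output probs."""
    eps = 1e-7  # numerical guard against log(0)
    return -sum(xi * math.log(p + eps) + (1 - xi) * math.log(1 - p + eps)
                for xi, p in zip(x, x_hat))

# Toy encoder/decoder outputs for a single data point.
mu, log_var = [0.3, -0.1], [0.0, 0.2]
x, x_hat = [1.0, 0.0, 1.0], [0.9, 0.2, 0.8]

loss = bernoulli_recon_loss(x, x_hat) + kl_to_standard_normal(mu, log_var)
print(f"negative ELBO: {loss:.3f}")
```

Privacy-oriented variants typically modify exactly these terms, e.g., by reweighting or constraining the latent representation, which is where the privacy/utility trade-off arises.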
DNA as a Molecular Computer: Designing Logic, Arithmetic, and Control Circuits
Description
This thesis explores the concept of DNA as a medium for molecular computation, focusing on the design and implementation of fundamental computational circuits using DNA strands. By leveraging reactions of DNA molecules, such as strand displacement and transcription, the research demonstrates how logic gates, arithmetic operations, and basic control mechanisms can be encoded and executed at the molecular level. The work highlights the potential of DNA computing for genetic circuits, parallel processing and other applications.
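A toy, well-mixed abstraction of how strand displacement can implement an AND gate: an input strand displaces part of a gate complex, exposing a toehold that a second input needs to release the output strand. Species names are illustrative, not real DNA sequences:

```python
from collections import Counter

# Each reaction: (free strand, complex) -> (released strand, remaining complex).
# AND gate: In1 opens Gate_AB into intermediate Gate_B; In2 then releases Out.
reactions = [
    (("In1", "Gate_AB"), ("Waste1", "Gate_B")),
    (("In2", "Gate_B"), ("Out", "Waste2")),
]

def simulate(initial_species):
    """Apply displacement reactions until no reactant pair remains."""
    pool = Counter(initial_species)
    changed = True
    while changed:
        changed = False
        for (s, c), (p, q) in reactions:
            n = min(pool[s], pool[c])
            if n:
                pool[s] -= n; pool[c] -= n
                pool[p] += n; pool[q] += n
                changed = True
    return pool

both = simulate(["In1", "In2", "Gate_AB"])  # both inputs present
one = simulate(["In1", "Gate_AB"])          # only one input present
print(both["Out"], one["Out"])  # 1 0 — output released only for In1 AND In2
```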
Xia, Y.; Hofmeister, C.; Egger, M.; Bitar, R.: Byzantine-Resilient and Information-Theoretically Private Federated Learning. Munich Workshop on Coding and Cryptography (MWCC), 2024
Xia, Y.; Hofmeister, C.; Egger, M.; Bitar, R.: Byzantine-Resilient and Information-Theoretically Private Federated Learning. IEEE International Symposium on Information Theory (ISIT), 2024
Xia, Y.; Hofmeister, C.; Egger, M.; Bitar, R.: Byzantine-Resilient Secure Aggregation for Federated Learning Without Privacy Compromises. IEEE Information Theory Workshop (ITW), 2024