
M.Sc. Maximilian Egger

Technische Universität München

Professur für Codierung und Kryptographie (Prof. Wachter-Zeh)

Postal address

Theresienstr. 90
80333 München

Biography

Maximilian Egger received the B.Eng. in Electrical Engineering from the University of Applied Sciences Augsburg in 2020 and the M.Sc. in Electrical Engineering and Information Technology from the Technical University of Munich in 2022, both with high distinction (final grades: 1.0). He completed a dual bachelor's program accompanied by engineering positions in several hardware and software development departments at Hilti AG. Inspiring collaborations in academia, industry, and the German Academic Scholarship Foundation strengthened his motivation to drive scientific progress. As a doctoral researcher at the Institute for Communications Engineering under the supervision of Prof. Dr.-Ing. Antonia Wachter-Zeh, he conducts research in the rapidly growing field of large-scale decentralized computing and federated learning. Sensitive data, potentially corrupted computations, and stochastic environments naturally raise concerns about privacy, security, and efficiency. In his research, he investigates these problems from a coding- and information-theoretic perspective.

Teaching

Coding Theory for Storage and Networks [Summer Semester 2022]
Fast Secure and Reliable Coded Computing [Winter Semester 2022/23]
Nachrichtentechnik (Communications Engineering) [Summer Semester 2023]

Theses

Offered Theses

Private and Secure Federated Learning

Description

In federated learning, a machine learning model is trained on private user data with the help of a central server, the so-called federator. This setting differs from other machine learning settings in that the user data is not shared with the federator, for privacy reasons and/or to reduce the communication load of the system.

Even though only intermediate results are shared, extra care is necessary to guarantee data privacy. An additional challenge arises if the system includes malicious users that breach the protocol and send corrupted computation results.

The goal of this work is to design, implement and analyze coding- and information-theoretic solutions for privacy and security in federated learning.
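
To make the privacy aspect concrete, the following minimal Python sketch illustrates pairwise-mask secure aggregation with toy, hypothetical values: every client perturbs its update with random masks agreed with its peers, the masks cancel in the sum, and the federator only learns the aggregate. It is one illustrative instance of such ideas, not a prescribed solution for the thesis.

    import numpy as np

    # Toy illustration: clients mask their model updates with pairwise random
    # vectors that cancel out in the sum, so the federator only learns the
    # aggregate, never an individual update.
    rng = np.random.default_rng(0)
    dim, num_clients = 4, 3

    # Hypothetical local updates (in practice: gradients of a shared model).
    updates = [rng.normal(size=dim) for _ in range(num_clients)]

    # Pairwise masks: client i adds r_ij and client j subtracts it again.
    pairwise = {(i, j): rng.normal(size=dim)
                for i in range(num_clients) for j in range(num_clients) if i < j}

    def masked_update(i):
        masked = updates[i].copy()
        for (a, b), r in pairwise.items():
            if a == i:       # i generated the mask towards b
                masked += r
            elif b == i:     # i removes the mask it agreed on with a
                masked -= r
        return masked

    # The federator sums the masked updates; the masks cancel pairwise.
    aggregate = sum(masked_update(i) for i in range(num_clients))
    assert np.allclose(aggregate, sum(updates))
    print(aggregate / num_clients)   # federated-averaging step on the aggregate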

Prerequisites

  • Information Theory
  • Coding Theory (e.g., Channel Coding)
  • Machine Learning (Theory and Practice)

Supervisor:

Homomorphic Encryption for Machine Learning

Keywords:
Partial/Somewhat Homomorphic Encryption, Federated Learning

Description

Homomorphic encryption (HE) schemes are attracting increasing attention in the era of large-scale computing. While lattice-based approaches have been well studied, first progress has recently been made towards establishing code-based alternatives. Preliminary results show that such alternative approaches might enable functionalities not present in current lattice-based schemes. In this project, we particularly study novel code-based partial/somewhat HE schemes tailored to applications in artificial intelligence and federated learning.

After familiarizing themselves with state-of-the-art methods in the relevant fields (such as [1]), the student should analyze the requirements of the use cases at hand and explore suitable modifications to current schemes as well as novel approaches.

[1] C. Aguilar-Melchor, V. Dyseryn, and P. Gaborit, "Somewhat Homomorphic Encryption based on Random Codes," Cryptology ePrint Archive, 2023.
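
For intuition on what "partially/somewhat homomorphic" means in this context, the toy Python sketch below uses a Paillier-style additively homomorphic scheme with insecure, hard-coded parameters: an untrusted aggregator multiplies ciphertexts, which adds the underlying (quantized) model updates without ever decrypting them. The code-based schemes targeted in this project work differently; all parameters here are illustrative assumptions only.

    from math import gcd
    import random

    # Toy Paillier-style additively homomorphic encryption (insecure parameters,
    # purely illustrative): multiplying ciphertexts adds the plaintexts mod n.
    p, q = 293, 433                      # tiny primes; real schemes use ~1024-bit primes
    n, n2 = p * q, (p * q) ** 2
    g = n + 1
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

    def encrypt(m):
        r = random.randrange(1, n)
        while gcd(r, n) != 1:
            r = random.randrange(1, n)
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return ((pow(c, lam, n2) - 1) // n) * mu % n

    # Clients encrypt quantized updates; the aggregator multiplies ciphertexts,
    # which corresponds to adding the plaintexts modulo n.
    client_updates = [17, 42, 5]
    encrypted_sum = 1
    for m in client_updates:
        encrypted_sum = (encrypted_sum * encrypt(m)) % n2

    assert decrypt(encrypted_sum) == sum(client_updates) % n
    print(decrypt(encrypted_sum))        # 64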

Prerequisites

  • Strong foundation in linear algebra
  • Channel Coding
  • Security in Communications and Storage
  • Basic understanding of Machine Learning concepts

Supervisor:

Ongoing Theses

Random Walks for Decentralized Learning

Description

Fully decentralized schemes do not require a central entity and have been studied in [1, 2]. These works aim to reach consensus on a desirable machine learning model among all clients. We mainly distinguish between i) gossip algorithms [3], where clients share their result with all neighbors, naturally leading to a high communication complexity, and ii) random walk approaches such as [4, 5], where the model is communicated only to a single neighbor until certain convergence criteria are met. Such random walk approaches are used in federated learning to reduce the communication load in the network and at the clients' side.

The main task of the student is to study the work in [5], which additionally accounts for the heterogeneity of the clients' data. Furthermore, drawbacks and limitations of the proposed approach should be identified.

[1] J. B. Predd, S. B. Kulkarni, and H. V. Poor, “Distributed learning in wireless sensor networks,” IEEE Signal Process. Mag., vol. 23, no. 4, pp. 56–69, 2006.

[2] S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein et al., “Distributed optimization and statistical learning via the alternating direction method of multipliers,” Found. Trends Mach. Learn., vol. 3, no. 1, pp. 1–122, 2011.

[3] S. S. Ram, A. Nedić, and V. V. Veeravalli, "Asynchronous gossip algorithms for stochastic optimization," in IEEE Conf. Decis. Control, 2009, pp. 3581–3586.

[4] D. Needell, R. Ward, and N. Srebro, "Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm," Adv. Neural Inf. Process. Syst., vol. 27, 2014.

[5] G. Ayache, V. Dassari, and S. El Rouayheb, "Walk for learning: A random walk approach for federated learning from heterogeneous data," arXiv preprint arXiv:2206.00737, 2022.
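
For illustration only, the following Python sketch (with hypothetical linear-regression data and a ring topology) mimics the basic random-walk idea: a single model hops between neighboring clients and takes one local SGD step per visit, so only one link is active per iteration, in contrast to gossip. It is not an implementation of the algorithm in [5].

    import numpy as np

    # Toy random-walk learning sketch: one model walks over a client graph and
    # is updated with a single local SGD step at each visited client.
    rng = np.random.default_rng(1)
    num_clients, dim, steps, lr = 5, 3, 5000, 0.01

    # Hypothetical heterogeneous data: each client draws features with its own mean shift.
    true_w = rng.normal(size=dim)
    data = []
    for _ in range(num_clients):
        X = rng.normal(loc=rng.normal(scale=0.5), size=(20, dim))
        y = X @ true_w + 0.1 * rng.normal(size=20)
        data.append((X, y))

    # Ring topology: each client can forward the model to its two neighbors.
    neighbors = {i: [(i - 1) % num_clients, (i + 1) % num_clients]
                 for i in range(num_clients)}

    w, current = np.zeros(dim), 0
    for _ in range(steps):
        X, y = data[current]
        i = rng.integers(len(y))                  # one local SGD sample
        grad = (X[i] @ w - y[i]) * X[i]
        w -= lr * grad
        current = rng.choice(neighbors[current])  # model walks to a random neighbor

    print(np.linalg.norm(w - true_w))             # should be small after enough steps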

Prerequisites

  • Machine Learning and Statistics
  • Information Theory

Supervisor:

Publications

2023

  • Maximilian Egger, Christoph Hofmeister, Antonia Wachter-Zeh, Rawad Bitar: Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), 2023
  • Maximilian Egger, Marvin Xhemrishi, Antonia Wachter-Zeh, Rawad Bitar: Sparse and Private Distributed Matrix Multiplication with Straggler Tolerance. International Symposium on Information Theory, 2023
  • Maximilian Egger, Rawad Bitar, Antonia Wachter-Zeh, Deniz Gündüz, Nir Weinberger: Maximal-Capacity Discrete Memoryless Channel Identification. International Symposium on Information Theory, 2023
  • Maximilian Egger, Serge Kas Hanna, Rawad Bitar: Fast and Straggler-Tolerant Distributed SGD with Reduced Computation Load. International Symposium on Information Theory, 2023
  • Thomas Schamberger, Maximilian Egger, Lars Tebelmann: Hide and Seek: Using Occlusion Techniques for Side-Channel Leakage Attribution in CNNs. Artificial Intelligence in Hardware Security Workshop, 2023

2022

  • Marvin Xhemrishi, Maximilian Egger, Rawad Bitar: Efficient Private Storage of Sparse Machine Learning Data. 2022 IEEE Information Theory Workshop (ITW), 2022
  • Maximilian Egger: Challenges in Federated Learning - A Brief Overview. TUM ICE Workshop Raitenhaslach, 2022
  • Maximilian Egger, Rawad Bitar, Antonia Wachter-Zeh, Deniz Gündüz: Efficient Distributed Machine Learning via Combinatorial Multi-Armed Bandits. 2022 IEEE International Symposium on Information Theory (ISIT), 2022
  • Maximilian Egger, Rawad Bitar, Antonia Wachter-Zeh, Deniz Gündüz: Cost-Efficient Distributed Learning via Combinatorial Multi-Armed Bandits. 2022 Munich Workshop on Coding and Cryptography (MWCC), 2022
  • Maximilian Egger, Rawad Bitar, Antonia Wachter-Zeh, Deniz Gündüz: Cost-Efficient Distributed Learning via Combinatorial Multi-Armed Bandits. 2022 IEEE European School of Information Theory (ESIT), 2022
  • Maximilian Egger, Thomas Schamberger, Lars Tebelmann, Florian Lippert, Georg Sigl: A Second Look at the ASCAD Databases. 14th International Workshop on Constructive Side-Channel Analysis and Secure Design, 2022