M.Sc. Maximilian Egger
Technical University of Munich
Associate Professorship of Coding and Cryptography (Prof. Wachter-Zeh)
Postal address:
Theresienstr. 90
80333 München
- Phone: +49 (89) 289 - 29051
- Room: 0104.03.416
- Homepage: https://www.maximilian-egger.de
- Email: maximilian.egger@tum.de
Biography
Maximilian Egger received the B.Eng. in Electrical Engineering from the University of Applied Sciences Augsburg in 2020 and the M.Sc. in Electrical Engineering and Information Technology from the Technical University of Munich in 2022, both with high distinction (final grade: 1.0). He completed a dual bachelor's program accompanied by engineering positions in several hardware and software development departments at Hilti AG. Inspiring collaborations in academia, in industry, and within the German Academic Scholarship Foundation strengthened his motivation to drive scientific progress. As a doctoral researcher at the Institute for Communications Engineering under the supervision of Prof. Dr.-Ing. Antonia Wachter-Zeh, he conducts research in the rapidly growing field of large-scale decentralized computing and federated learning. Sensitive data, potentially corrupted computations, and stochastic environments naturally raise concerns about privacy, security, and efficiency. In his research, he investigates these problems from a coding- and information-theoretic perspective.
Teaching
Coding Theory for Storage and Networks [Summer Term 2022]
Fast, Secure and Reliable Coded Computing [Winter Term 2022/23]
Nachrichtentechnik (Communications Engineering) [Summer Term 2023]
Theses in Progress
Code-Based Homomorphic Encryption for Private Aggregation
Decodability of Non-Uniform Secure Aggregation
Coordinate and Projection-Based Compression in Federated Learning
Minimal Random Coding for Stochastic Federated Learning
Description
Communication efficiency is a major and well-studied concern in federated learning (FL). Various compression schemes have been proposed to alleviate prohibitively high communication costs. In stochastic formulations of FL, principled approaches to compression can substantially improve upon existing methods. Minimal random coding (MRC), introduced in [1], leverages side information, common randomness, and tools from importance sampling. The authors of [2] proposed to use MRC for stochastic FL and showed substantial improvements in various domains.
The task of the student is to understand the principles of the proposed compression schemes and their application to stochastic FL in different learning settings. The student should analyze where the substantial communication savings originate and which open problems remain. A toy sketch of the MRC mechanism is given after the references below.
[1] Marton Havasi, Robert Peharz, and José Miguel Hernández-Lobato. Minimal random code learning: Getting bits back from compressed model parameters. In International Conference on Learning Representations, 2019.
[2] Berivan Isik, Francesco Pase, Deniz Gündüz, Sanmi Koyejo, Tsachy Weissman, and Michele Zorzi. Adaptive compression in federated learning via side information. In International Conference on Artificial Intelligence and Statistics, pp. 487–495, 2024.
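To make the mechanism concrete, the following Python sketch (a toy illustration under a scalar Gaussian model, not the actual scheme of [1] or [2]; all function names are assumptions for this example) shows how a client can transmit an approximate sample from a local distribution q at the cost of a single candidate index, given a prior p shared with the server as side information:

```python
import numpy as np

def mrc_encode(q_logpdf, p_logpdf, p_sample, num_candidates, seed):
    # Common randomness: both sides derive the same candidate pool from the seed.
    rng = np.random.default_rng(seed)
    candidates = p_sample(rng, num_candidates)
    # Importance weights q(x)/p(x), shifted in log-space for numerical stability.
    log_w = q_logpdf(candidates) - p_logpdf(candidates)
    w = np.exp(log_w - log_w.max())
    # Transmit only the sampled index: log2(num_candidates) bits.
    return rng.choice(num_candidates, p=w / w.sum())

def mrc_decode(p_sample, num_candidates, seed, idx):
    # Regenerate the identical candidate pool and look up the index.
    rng = np.random.default_rng(seed)
    return p_sample(rng, num_candidates)[idx]

# Toy example: shared prior p = N(0, 1), client target q = N(1.5, 0.5^2).
mu, sigma = 1.5, 0.5
q_logpdf = lambda x: -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
p_logpdf = lambda x: -0.5 * x ** 2
p_sample = lambda rng, n: rng.standard_normal(n)

idx = mrc_encode(q_logpdf, p_logpdf, p_sample, num_candidates=1024, seed=42)
x_hat = mrc_decode(p_sample, num_candidates=1024, seed=42, idx=idx)
print(idx, x_hat)  # x_hat approximates a sample from q, sent with only 10 bits
```

The communication saving comes from the shared candidate pool: the closer the prior (side information) is to the client's target distribution, the fewer candidates, and hence bits, are needed for a faithful sample.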
Random Walks for Decentralized Learning
Description
Fully decentralized schemes do not require a central entity and have been studied in [1, 2]. These works aim to reach consensus on a desirable machine learning model among all clients. One can mainly distinguish between i) gossip algorithms [3], where clients share their results with all their neighbors, naturally leading to high communication complexity, and ii) random walk approaches such as [4, 5], where the model is communicated to only one neighbor at a time until certain convergence criteria are met. Such random walk approaches are used in federated learning to reduce the communication load in the network and on the clients' side.
The main task of the student is to study the work in [5], which additionally accounts for the heterogeneity of the clients' data. Furthermore, the drawbacks and limitations of the proposed approach should be identified. A simplified baseline sketch is given after the references below.
[1] J. B. Predd, S. R. Kulkarni, and H. V. Poor, “Distributed learning in wireless sensor networks,” IEEE Signal Process. Mag., vol. 23, no. 4, pp. 56–69, 2006.
[2] S. Boyd, N. Parikh, E. Chu, B. Peleato, J. Eckstein et al., “Distributed optimization and statistical learning via the alternating direction method of multipliers,” Found. Trends Mach. Learn., vol. 3, no. 1, pp. 1–122, 2011.
[3] S. S. Ram, A. Nedić, and V. V. Veeravalli, “Asynchronous gossip algorithms for stochastic optimization,” in IEEE Conf. Decis. Control. IEEE, 2009, pp. 3581–3586.
[4] D. Needell, R. Ward, and N. Srebro, “Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm,” Adv. Neural Inf. Process. Syst., vol. 27, 2014.
[5] G. Ayache, V. Dassari, and S. El Rouayheb, “Walk for learning: A random walk approach for federated learning from heterogeneous data,” arXiv preprint arXiv:2206.00737, 2022.
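For intuition, here is a minimal Python sketch of the baseline random walk idea (a simplified illustration, not the algorithm of [5]; the least-squares model, the uniform-hop rule, and all names are assumptions for this example): the model visits one client at a time, takes a local SGD step there, and hops to a uniformly chosen neighbor.

```python
import numpy as np

def random_walk_sgd(client_data, neighbors, steps, lr, seed=0):
    # The model performs a walk over the client graph: one local SGD step
    # per visit, then a hop to a uniformly chosen neighbor. Uniform hops are
    # the baseline; [5] instead adapts the walk to the clients' data.
    rng = np.random.default_rng(seed)
    dim = client_data[0][0].shape[1]
    w = np.zeros(dim)
    node = rng.integers(len(client_data))   # start at a random client
    for _ in range(steps):
        X, y = client_data[node]
        i = rng.integers(len(y))            # pick one local sample
        grad = (X[i] @ w - y[i]) * X[i]     # least-squares gradient
        w -= lr * grad
        node = rng.choice(neighbors[node])  # hop to a random neighbor
    return w

# Toy example: 4 clients on a ring graph, linear data y = X @ [1, -2].
rng = np.random.default_rng(1)
w_true = np.array([1.0, -2.0])
client_data = []
for _ in range(4):
    X = rng.standard_normal((50, 2))
    client_data.append((X, X @ w_true))
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(random_walk_sgd(client_data, ring, steps=2000, lr=0.05))  # ~ [1, -2]
```

Only one model copy traverses the network, so per-round communication is a single model transmission; the open question addressed in [5] is how to steer the walk when the clients' data distributions differ.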
Prerequisites
- Machine Learning and Statistics
- Information Theory
Publications
2023
- Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
- Sparse and Private Distributed Matrix Multiplication with Straggler Tolerance. International Symposium on Information Theory, 2023.
- Maximal-Capacity Discrete Memoryless Channel Identification. International Symposium on Information Theory, 2023.
- Fast and Straggler-Tolerant Distributed SGD with Reduced Computation Load. International Symposium on Information Theory, 2023.
- Hide and Seek: Using Occlusion Techniques for Side-Channel Leakage Attribution in CNNs. Artificial Intelligence in Hardware Security Workshop, 2023.
2022
- Efficient Private Storage of Sparse Machine Learning Data. 2022 IEEE Information Theory Workshop (ITW), 2022.
- Challenges in Federated Learning - A Brief Overview. TUM ICE Workshop Raitenhaslach, 2022.
- Efficient Distributed Machine Learning via Combinatorial Multi-Armed Bandits. 2022 IEEE International Symposium on Information Theory (ISIT), 2022.
- Cost-Efficient Distributed Learning via Combinatorial Multi-Armed Bandits. 2022 Munich Workshop on Coding and Cryptography (MWCC), 2022.
- Cost-Efficient Distributed Learning via Combinatorial Multi-Armed Bandits. 2022 IEEE European School of Information Theory (ESIT), 2022.
- A Second Look at the ASCAD Databases. 14th International Workshop on Constructive Side-Channel Analysis and Secure Design, 2022.