M.Sc. Christoph Hofmeister
Technische Universität München
Professur für Codierung und Kryptographie (Prof. Wachter-Zeh)
Postal address:
Theresienstr. 90
80333 München
Biography
- Dual study program at Infineon Technologies (2015-2019)
- B.Eng. in Electrical Engineering and Information Technology, Hochschule München (2019)
- M.Sc. in Electrical Engineering and Information Technology, Technische Universität München (2021)
- Since October 2021, doctoral candidate at the Institute for Communications Engineering, Professorship of Coding and Cryptography
Teaching
- Coding Theory for Storage and Networks [Summer 22]
- Fast, Secure, and Reliable Coded Computing [Winter 22/23]
- Channel Coding [Summer 23]
- Coding for Private Reliable and Efficient Distributed Learning [Winter 23/24]
Theses
Offered Theses
Ongoing Theses
Gradient Compression
Description
In many distributed and federated learning systems, clients iteratively compute so-called gradient vectors from their locally stored data and communicate them to a central entity. These gradient vectors are typically high-dimensional, so transmitting them directly causes an undesirable amount of communication, limiting the performance of the system.
To alleviate this issue, various gradient compression schemes have been proposed.
The student's task is to analyze and compare several proposed schemes with respect to their advantages and disadvantages. As a starting point, students can use [1, Section 3.2].
[1] S. Ouyang, D. Dong, Y. Xu, and L. Xiao, “Communication optimization strategies for distributed deep neural network training: A survey,” Journal of Parallel and Distributed Computing, vol. 149, pp. 52–65, Mar. 2021, doi: 10.1016/j.jpdc.2020.11.005.
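One widely used family of gradient compression schemes is sparsification. The following is a minimal illustrative sketch of top-k sparsification (an assumed baseline for intuition, not a scheme prescribed by this thesis topic): the client keeps only the k largest-magnitude gradient entries and transmits them as (index, value) pairs instead of the full vector.

```python
import numpy as np

def top_k_sparsify(gradient, k):
    """Keep only the k largest-magnitude entries of the gradient and
    zero out the rest. The client then transmits k (index, value)
    pairs instead of the full high-dimensional vector."""
    indices = np.argsort(np.abs(gradient))[-k:]  # positions of the k largest magnitudes
    compressed = np.zeros_like(gradient)
    compressed[indices] = gradient[indices]
    return compressed, indices

rng = np.random.default_rng(0)
g = rng.standard_normal(10)          # a toy 10-dimensional gradient
sparse_g, idx = top_k_sparsify(g, 3)
# Only k = 3 entries of sparse_g are nonzero.
assert np.count_nonzero(sparse_g) == 3
```

In practice such schemes are combined with error feedback (accumulating the discarded entries locally for later rounds); comparing such design choices is part of the task described above.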
Advisor:
The Interplay of Fairness and Privacy in Federated Learning
Description
We study the impact of different measures of fairness on the privacy guarantees for individual clients in federated learning.
Advisor:
Reliable Over-the-Air Computation for Federated Learning
Description
We study the use of channel codes in federated learning with over-the-air computation, so that the sum of codewords over the reals can be efficiently and reliably decoded at the federator to obtain the average of the partial model updates.
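The basic idea of over-the-air computation can be sketched as follows (a toy simulation under assumed parameters, not the coded scheme studied in this thesis): all clients transmit their real-valued updates simultaneously, the wireless channel physically superimposes the signals, and the federator receives the noisy sum, from which it estimates the average update.

```python
import numpy as np

rng = np.random.default_rng(1)
num_clients, dim = 5, 8  # assumed toy parameters
updates = rng.standard_normal((num_clients, dim))  # partial model updates

# The channel superimposes all simultaneously transmitted signals
# and adds receiver noise (noise level chosen arbitrarily here).
received = updates.sum(axis=0) + 0.01 * rng.standard_normal(dim)

# The federator only needs the average, so it scales the received sum.
avg_estimate = received / num_clients
true_avg = updates.mean(axis=0)
# The estimate matches the true average up to the channel noise.
assert np.allclose(avg_estimate, true_avg, atol=0.05)
```

Channel coding enters because real channels corrupt the superimposed signal; the research question above is how to code the updates so that the real-valued sum itself remains reliably decodable.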
Advisor:
Publications
2023
- Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023
- Trading Communication for Computation in Byzantine-Resilient Gradient Coding. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023
2022
- Trading Communication and Computation for Security in Gradient Coding. Munich Workshop on Coding and Cryptography 2022, 2022
- Trading Communication and Computation for Security in Gradient Coding. 2022 IEEE European School of Information Theory (ESIT), 2022
- Trading Communication and Computation for Security in Gradient Coding. TUM ICE Workshop Raitenhaslach, 2022
- Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. IEEE Journal on Selected Areas in Information Theory 3 (2), 2022, 275-285
- Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. WCC 2022: The Twelfth International Workshop on Coding and Cryptography, 2022