M.Sc. Christoph Hofmeister
Technical University of Munich
Associate Professorship of Coding and Cryptography (Prof. Wachter-Zeh)
Postal address:
Theresienstr. 90
80333 München
Biography
- Dual Study Program with Infineon Technologies (2015-2019)
- B.Eng. in Electrical and Computer Engineering, University of Applied Sciences Munich (2019)
- M.Sc. in Electrical and Computer Engineering, Technical University of Munich (2021)
- Since October 2021, doctoral researcher at the Institute of Communications Engineering, Coding and Cryptography group
Teaching
- Coding Theory for Storage and Networks [Summer 22]
- Fast, Secure, and Reliable Coded Computing [Winter 22/23]
- Channel Coding [Summer 23]
- Coding for Private Reliable and Efficient Distributed Learning [Winter 23/24]
Theses in Progress
Gradient Compression
Description
In many distributed and federated learning systems, clients iteratively compute so-called gradient vectors from their locally stored data and communicate them to a central entity. Since these gradient vectors are typically high-dimensional, transmitting them directly incurs a large communication overhead that limits the overall performance of the system.
To alleviate this issue, various gradient compression schemes have been proposed.
The student's task is to analyze and compare several of the proposed schemes in terms of their advantages and disadvantages (a minimal example of one such scheme is sketched after the reference below). As a starting point, students can use [1, Section 3.2].
[1] S. Ouyang, D. Dong, Y. Xu, and L. Xiao, “Communication optimization strategies for distributed deep neural network training: A survey,” Journal of Parallel and Distributed Computing, vol. 149, pp. 52–65, Mar. 2021, doi: 10.1016/j.jpdc.2020.11.005.
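To make the idea concrete, here is a minimal sketch of one widely used compression scheme, top-k sparsification, assuming NumPy and a plain gradient vector; the function names and the simple reconstruction are illustrative, not taken from any specific scheme in [1].

```python
import numpy as np

def top_k_sparsify(gradient: np.ndarray, k: int):
    """Keep only the k largest-magnitude entries of the gradient."""
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(gradient), -k)[-k:]
    # Transmit only (index, value) pairs; all other entries are
    # implicitly zero at the receiver.
    return idx, gradient[idx]

def desparsify(idx: np.ndarray, values: np.ndarray, dim: int) -> np.ndarray:
    """Reconstruct a (lossy) full-dimensional gradient at the central entity."""
    gradient = np.zeros(dim)
    gradient[idx] = values
    return gradient

# Example: compress a 10^6-dimensional gradient to 1% of its entries.
rng = np.random.default_rng(0)
g = rng.standard_normal(1_000_000)
idx, vals = top_k_sparsify(g, k=10_000)
g_hat = desparsify(idx, vals, g.size)
```

Instead of the full vector, each client sends only k indices and k values, reducing the communication cost by roughly a factor of dim/k (ignoring index overhead); the price is a lossy gradient, which schemes surveyed in [1, Section 3.2] mitigate, e.g., by accumulating the compression error locally.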
The Interplay of Fairness and Privacy in Federated Learning
Description
We study the impact of different measures of fairness on the privacy guarantees for individual clients in federated learning.
Reliable Over-the-Air Computation for Federated Learning
Description
We study the use of channel codes in federated learning with over-the-air computation, where the sum of codewords over the reals must be decoded efficiently and reliably at the federator to obtain the average of the clients' partial model updates.
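The property this setting relies on is linearity: when all clients encode with the same real-valued linear code, the superposition of their transmissions on the channel is itself a codeword of the sum of their messages, so the federator can decode the sum without recovering individual updates. The following toy sketch illustrates this with a random generator matrix and least-squares decoding; the matrix G and the noise level are illustrative assumptions, not the codes studied in this thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
num_clients, dim, n = 5, 8, 16  # model dimension 8, blocklength 16

# Shared real-valued linear code: any full-column-rank generator works,
# since linearity makes the sum of codewords a codeword of the sum.
G = rng.standard_normal((n, dim))

updates = rng.standard_normal((num_clients, dim))  # partial model updates

# Over-the-air channel: simultaneous transmission superimposes the
# clients' codewords over the reals and adds noise.
y = sum(G @ x for x in updates) + 0.01 * rng.standard_normal(n)

# The federator decodes the *sum* of the messages in one shot
# (here via least squares), then averages.
sum_hat, *_ = np.linalg.lstsq(G, y, rcond=None)
average_update = sum_hat / num_clients

# Small residual error stemming only from the channel noise (~1e-3).
print(np.max(np.abs(average_update - updates.mean(axis=0))))
```

Decoding the sum in one shot is what makes over-the-air computation attractive: the federator never needs, and never sees, the individual clients' updates.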
Publications
2023
- Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
- Trading Communication for Computation in Byzantine-Resilient Gradient Coding. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
2022
- Trading Communication and Computation for Security in Gradient Coding. Munich Workshop on Coding and Cryptography 2022, 2022.
- Trading Communication and Computation for Security in Gradient Coding. 2022 IEEE European School of Information Theory (ESIT), 2022.
- Trading Communication and Computation for Security in Gradient Coding. TUM ICE Workshop Raitenhaslach, 2022.
- Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. IEEE Journal on Selected Areas in Information Theory, vol. 3, no. 2, pp. 275–285, 2022.
- Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. WCC 2022: The Twelfth International Workshop on Coding and Cryptography, 2022.