
M.Sc. Christoph Hofmeister

Technical University of Munich

Associate Professorship of Coding and Cryptography (Prof. Wachter-Zeh)

Postal address

Theresienstr. 90
80333 München

Biography

  • Dual Study Program with Infineon Technologies (2015-2019)
  • B.Eng. in Electrical and Computer Engineering, University of Applied Sciences Munich (2019)
  • M.Sc. in Electrical and Computer Engineering, Technical University of Munich (2021)
  • Since October 2021, doctoral researcher at the Institute of Communications Engineering, Coding and Cryptography group

Theses

Available Theses

Theses in Progress

Fast Matrix Multiplication Algorithms

Description

The search for fast matrix multiplication algorithms began when Volker Strassen found a way to multiply 2×2 matrices using 7 (instead of 8) scalar multiplications [1]. Applied recursively, Strassen's algorithm multiplies n×n matrices in sub-cubic time, O(n^log2(7)) ≈ O(n^2.81).

Since then, multiple algorithms with successively lower complexity have been discovered.
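As an illustrative sketch (not part of the official topic description), Strassen's recursion can be written down directly for matrices whose dimension is a power of two; the seven products M1–M7 below are the standard formulation from [1], and the `leaf` cutoff is an assumed practical parameter, not part of the original algorithm.

```python
import numpy as np

def strassen(A, B, leaf=64):
    """Multiply square matrices A and B (dimension a power of two) using
    Strassen's seven-multiplication recursion; fall back to ordinary
    matrix multiplication below the leaf size."""
    n = A.shape[0]
    if n <= leaf:
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Seven recursive products instead of the naive eight
    M1 = strassen(A11 + A22, B11 + B22, leaf)
    M2 = strassen(A21 + A22, B11, leaf)
    M3 = strassen(A11, B12 - B22, leaf)
    M4 = strassen(A22, B21 - B11, leaf)
    M5 = strassen(A11 + A12, B22, leaf)
    M6 = strassen(A21 - A11, B11 + B12, leaf)
    M7 = strassen(A12 - A22, B21 + B22, leaf)
    # Recombine the products into the four blocks of C = A @ B
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```

Each recursion level replaces one multiplication of size n with seven of size n/2, which is the source of the O(n^log2(7)) complexity.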


The goal of this seminar topic is to give an overview of these fast matrix multiplication algorithms, focusing on the mathematical concepts behind their inner workings and discovery.


[1] Strassen, V. Gaussian elimination is not optimal. Numer. Math. 13, 354–356 (1969). https://doi.org/10.1007/BF02165411

Supervisor:

Gradient Compression

Description

In many distributed and federated learning systems, clients iteratively compute so-called gradient vectors from their locally stored data and communicate them to a central entity. These gradient vectors are typically high-dimensional, so transmitting them directly incurs an undesirable amount of data transmission and limits the performance of the system.

To alleviate this issue, various gradient compression schemes have been proposed.
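As one hedged illustration of such a scheme (the function names here are hypothetical, not from any particular library), top-k sparsification transmits only the k largest-magnitude entries of the gradient together with their indices:

```python
import numpy as np

def topk_compress(g, k):
    """Top-k sparsification: keep only the k largest-magnitude entries
    of gradient vector g. Returns (indices, values); all other entries
    are dropped before transmission."""
    idx = np.argpartition(np.abs(g), -k)[-k:]
    return idx, g[idx]

def topk_decompress(idx, vals, d):
    """Server side: reconstruct a dense d-dimensional gradient estimate
    from the sparse message, with zeros in the dropped positions."""
    g_hat = np.zeros(d)
    g_hat[idx] = vals
    return g_hat
```

With k much smaller than the dimension d, the communication cost drops roughly by a factor of d/k; in practice such schemes are often combined with error feedback, where each client accumulates the dropped entries locally and adds them back in later rounds (see [1, Section 3.2] for a survey of this and related schemes).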

The student's task is to analyze and compare multiple proposed schemes based on their advantages and disadvantages. As a starting point, students can use [1, Section 3.2].

[1] S. Ouyang, D. Dong, Y. Xu, and L. Xiao, “Communication optimization strategies for distributed deep neural network training: A survey,” Journal of Parallel and Distributed Computing, vol. 149, pp. 52–65, Mar. 2021, doi: 10.1016/j.jpdc.2020.11.005.

Supervisor:

Publications

2023

  • Egger, Maximilian; Hofmeister, Christoph; Wachter-Zeh, Antonia; Bitar, Rawad: Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
  • Hofmeister, Christoph; Maßny, Luis; Yaakobi, Eitan; Bitar, Rawad: Trading Communication for Computation in Byzantine-Resilient Gradient Coding. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.

2022

  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. Munich Workshop on Coding and Cryptography 2022, 2022.
  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. 2022 IEEE European School of Information Theory (ESIT), 2022.
  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. TUM ICE Workshop Raitenhaslach, 2022.
  • Hofmeister, Christoph; Bitar, Rawad; Xhemrishi, Marvin; Wachter-Zeh, Antonia: Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. IEEE Journal on Selected Areas in Information Theory 3 (2), 2022, 275–285.
  • Hofmeister, Christoph; Bitar, Rawad; Xhemrishi, Marvin; Wachter-Zeh, Antonia: Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. WCC 2022: The Twelfth International Workshop on Coding and Cryptography, 2022.