Photo of Christoph Hofmeister

M.Sc. Christoph Hofmeister

Technische Universität München

Professur für Codierung und Kryptographie (Prof. Wachter-Zeh)


Theresienstr. 90
80333 München


Offered Theses

Private and Secure Federated Learning


In federated learning, a machine learning model is trained on private user data with the help of a central server, the so-called federator. This setting differs from other machine learning settings in that the user data is not shared with the federator, for privacy reasons and/or to reduce the communication load of the system.

Even though only intermediate results are shared, extra care is necessary to guarantee data privacy. An additional challenge arises if the system includes malicious users who deviate from the protocol and send corrupt computation results.

The goal of this work is to design, implement and analyze coding- and information-theoretic solutions for privacy and security in federated learning.
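One standard coding-theoretic ingredient for such designs is secure aggregation via additive secret sharing: each user splits its (quantized) model update into random shares, so the federator only ever learns the sum of all updates, never an individual one. The following is a minimal toy sketch of this idea; the prime, the quantized updates, and the communication pattern are illustrative assumptions, not part of any specific protocol from the topic description.

```python
import random

P = 2**61 - 1  # illustrative large prime; all arithmetic is modulo P

def share(value, n):
    """Split an integer into n additive shares that sum to value mod P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three users, each holding a (quantized) model update.
updates = [5, 11, 7]
n = len(updates)

# Each user shares its update; share j of user i is sent to user j,
# who locally sums the shares it received and forwards only that sum.
all_shares = [share(u, n) for u in updates]
partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]

# The federator recovers only the aggregate, never an individual update.
aggregate = sum(partial_sums) % P
print(aggregate)  # 23
```

Any single partial sum is statistically independent of the individual updates, which is exactly the kind of information-theoretic privacy guarantee this thesis topic is concerned with; handling users that additionally send corrupt shares is where the coding-theoretic security aspect comes in.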


Ongoing Theses

Secure Record Linkage via Multi-Party Computation


Processing massive amounts of collected data by means of machine learning algorithms often becomes infeasible on a single machine. To cope with the computational requirements, distributed cloud computing was introduced: a large computational task is split into multiple parts and distributed among worker machines to parallelize the computations and thereby speed up the learning process. However, since confidential data must be shared with third parties and the outcome is threatened by potentially corrupt computations, privacy and security have to be ensured. This is particularly critical in medical environments, in which we deal with individual patients' information.

To motivate the study of these challenges, the iDASH privacy and security workshop hosts a competition every year [1]. This year, the task is to develop a framework that securely links similar patient-related entries stored in different datasets without compromising privacy, for example to avoid double counting in further processing steps. During this research internship, the student should use multi-party computation tools to develop a framework that complies with the aforementioned requirements.
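To illustrate the flavor of the linkage problem, the toy sketch below matches records across two hypothetical hospital datasets by masking record identifiers with a keyed hash under a jointly agreed secret, so that only masked values are compared. This is merely a baseline illustration under assumed identifiers: it only handles exact matches (the competition task targets *similar* entries) and a keyed hash is not a full multi-party computation protocol of the kind the internship should develop.

```python
import hashlib
import hmac

def mask(record_key, secret):
    """Deterministically mask a record identifier with a shared secret key,
    so raw identifiers never need to be exchanged."""
    return hmac.new(secret, record_key.encode(), hashlib.sha256).hexdigest()

secret = b"jointly-agreed-key"  # in practice derived via a key agreement

# Hypothetical record identifiers (name + date of birth) at two hospitals.
hospital_a = ["alice-1990-01-01", "bob-1985-06-15"]
hospital_b = ["bob-1985-06-15", "carol-1970-12-31"]

masked_a = {mask(r, secret) for r in hospital_a}
masked_b = {mask(r, secret) for r in hospital_b}

# Only the masked intersection is revealed, not the non-matching records.
matches = masked_a & masked_b
print(len(matches))  # 1
```

A real solution would replace the keyed hash with secret-shared similarity computations inside an MPC framework, so that even fuzzy matches can be found without either party learning the other's non-matching records.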



Prerequisites:

  • Coding Theory (e.g., Channel Coding)
  • Linear Algebra
  • Information Theory (optional)


Analog Privacy-Preserving Coded Computing


In privacy-preserving coded computing, a main node offloads computations to a set of worker nodes without leaking information about its private data (unless too many workers collude).

Many approaches in the literature consider computations over finite fields. In practice, however, many applications, e.g., machine learning, require computations over (approximations of) the real numbers.

The goal of this seminar work is to review the recent approaches from the literature on privacy-preserving coded computing over analog domains and highlight their benefits and drawbacks.


Relevant literature:

  • M. Soleymani, H. Mahdavifar and A. S. Avestimehr, "Analog Privacy-Preserving Coded Computing," 2021 IEEE International Symposium on Information Theory (ISIT), 2021, pp. 1865-1870, doi: 10.1109/ISIT45174.2021.9517715.
  • M. Soleymani, H. Mahdavifar and A. S. Avestimehr, "Analog Lagrange Coded Computing," in IEEE Journal on Selected Areas in Information Theory, vol. 2, no. 1, pp. 283-295, March 2021, doi: 10.1109/JSAIT.2021.3056377.


Prerequisites:

  • Coding theory
  • Information theory
  • Linear algebra
  • Having attended the course "Security in Communications and Storage" is helpful (but not required)