Seminars
Attribute-Based Encryption
Description
In a public-key cryptosystem, each user has an associated public key with which other users can send them encrypted messages. The correspondence between a user's identity and their public key is verified using certificates issued by a trusted authority.
Identity-based encryption (IBE) offers an alternative approach in which only the identity of the recipient is needed to send them an encrypted message. The recipient then uses a secret key associated with their identity to decrypt the message.
Attribute-based encryption (ABE) modifies the IBE setting to a system where users have attributes associated with them and a data owner wishes to send encrypted messages to subsets of users based on their attributes.
The goal of this seminar is to build up from the idea of IBE proposed by Shamir [1] to the first paper on ABE by Sahai and Waters [2]. From there, two classes of ABE are studied, namely ciphertext-policy ABE [3] and key-policy ABE [4].
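For orientation, the following is a minimal, non-cryptographic mock of the interfaces involved (a sketch in Python; all function names are our own, no scheme from the references is implemented, and the "ciphertexts" hide nothing):

    # Toy, NON-cryptographic mock that only illustrates the interfaces of IBE and
    # ciphertext-policy ABE: who holds which key and what each call takes.

    def ibe_setup():
        """Trusted authority creates public parameters and a master secret."""
        return {"params": "public-params"}, {"master_secret": "msk"}

    def ibe_extract(msk, identity: str):
        """Authority derives the secret key for a given identity string."""
        return {"identity": identity, "issued_under": msk["master_secret"]}

    def ibe_encrypt(params, identity: str, message: str):
        """Anyone can encrypt to an identity using only the public parameters."""
        return {"to": identity, "payload": message}   # placeholder "ciphertext"

    def ibe_decrypt(user_key, ciphertext):
        """Decryption succeeds only for the intended identity."""
        return ciphertext["payload"] if user_key["identity"] == ciphertext["to"] else None

    # Ciphertext-policy ABE: keys carry attributes, ciphertexts carry a policy.
    def cp_abe_keygen(msk, attributes):
        return {"attributes": set(attributes)}

    def cp_abe_encrypt(params, policy, message: str):
        """Toy policy: the decryptor must hold ALL attributes listed in `policy`."""
        return {"policy": set(policy), "payload": message}

    def cp_abe_decrypt(user_key, ciphertext):
        ok = ciphertext["policy"] <= user_key["attributes"]
        return ciphertext["payload"] if ok else None

    mpk, msk = ibe_setup()
    sk_bob = ibe_extract(msk, "bob@example.com")
    assert ibe_decrypt(sk_bob, ibe_encrypt(mpk, "bob@example.com", "hi")) == "hi"

In a key-policy ABE the roles are reversed: the access policy is embedded in the user's key and the ciphertext carries the attributes.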
[1] Shamir, A. (1985). Identity-Based Cryptosystems and Signature Schemes. In: Blakley, G.R., Chaum, D. (eds) Advances in Cryptology. CRYPTO 1984. Lecture Notes in Computer Science, vol 196. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-39568-7_5
[2] Sahai, A., Waters, B. (2005). Fuzzy Identity-Based Encryption. In: Cramer, R. (eds) Advances in Cryptology – EUROCRYPT 2005. EUROCRYPT 2005. Lecture Notes in Computer Science, vol 3494. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11426639_27
[3] J. Bethencourt, A. Sahai and B. Waters, "Ciphertext-Policy Attribute-Based Encryption," 2007 IEEE Symposium on Security and Privacy (SP '07), Berkeley, CA, USA, 2007, pp. 321-334, doi: 10.1109/SP.2007.11.
[4] Vipul Goyal, Omkant Pandey, Amit Sahai, and Brent Waters. 2006. Attribute-based encryption for fine-grained access control of encrypted data. In Proceedings of the 13th ACM conference on Computer and communications security (CCS '06). Association for Computing Machinery, New York, NY, USA, 89–98. https://doi.org/10.1145/1180405.1180418
Supervisor:
The GPT Cryptosystem and Its Cryptanalysis
Description
The GPT cryptosystem, introduced by Gabidulin, Paramonov, and Tretjakov in 1991 [1], is one of the earliest public-key encryption schemes based on rank-metric codes. It adapts the McEliece framework to use Gabidulin codes, whose rank-metric structure allows for much more compact public keys than traditional Hamming-metric constructions. The strong algebraic structure of these codes, however, also exposes weaknesses that led to several successful structural attacks, most notably those developed by Overbeck [2].
The focus is on understanding the design of the GPT scheme, its underlying rank-metric hardness assumptions, and the mechanisms by which its structure was exploited. Insights from these attacks are essential for appreciating later developments in rank-metric encryption, such as Loidreau’s 2017 construction [3].
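For orientation, a highly simplified sketch of the McEliece-type key structure that GPT adapts (our own toy in Python: random binary matrices as Hamming-metric stand-ins for the Gabidulin code, identity and permutation matrices in place of the secret scrambling and distortion; nothing here is secure):

    import numpy as np

    # Purely illustrative toy of the McEliece/GPT key structure. Real GPT uses a
    # Gabidulin code over an extension field and the rank metric; here random
    # binary matrices and a permutation are stand-ins.

    rng = np.random.default_rng(0)
    n, k, t = 16, 8, 2                       # toy length, dimension, error weight

    G_secret = rng.integers(0, 2, (k, n))    # stand-in for a decodable code's generator
    S = np.eye(k, dtype=int)                 # stand-in for a secret invertible scrambler
    P = np.eye(n, dtype=int)[rng.permutation(n)]   # stand-in for the secret column transform

    G_pub = (S @ G_secret @ P) % 2           # published key: the code structure is hidden

    def encrypt(message_bits, G_pub, t):
        """Encode with the public key and add t random errors."""
        e = np.zeros(G_pub.shape[1], dtype=int)
        e[rng.choice(G_pub.shape[1], size=t, replace=False)] = 1
        return (message_bits @ G_pub + e) % 2

    c = encrypt(rng.integers(0, 2, k), G_pub, t)
    # Decryption (not shown) undoes P, removes the errors with the secret code's
    # decoder, and undoes S; an attacker faces a seemingly random code.

In GPT the added errors have low rank weight rather than low Hamming weight, which is what allows the comparatively compact public keys mentioned above.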
[1] Gabidulin, E.M., Paramonov, A.V. and Tretjakov, O.V., 1991, April. Ideals over a non-commutative ring and their application in cryptology. In Workshop on the Theory and Application of Cryptographic Techniques (pp. 482-489). Berlin, Heidelberg: Springer Berlin Heidelberg.
[2] Overbeck, R., 2008. Structural attacks for public key cryptosystems based on Gabidulin codes. Journal of cryptology, 21(2), pp.280-301.
[3] Loidreau, P., 2017, June. A new rank metric codes based encryption scheme. In International Workshop on Post-Quantum Cryptography (pp. 3-17). Cham: Springer International Publishing.
Supervisor:
Smoothing Techniques for Code Problems
Description
Smoothing is a technique for studying computational problems in cryptography: by adding suitable noise, a structured distribution is made statistically close to uniform, which is a key ingredient of worst-case to average-case reductions. While it has been primarily applied to lattice-based problems, recent work has begun to explore its relevance in the context of coding problems.
In this seminar, you will learn about the different coding problems that appear in known reductions, as well as the techniques used to establish these reductions. Particular attention will be given to the challenges that arise when attempting to extend these techniques to more general classes of codes. These obstacles will be examined, discussed, and presented in detail.
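As a rough numerical illustration of what smoothing means in the coding setting (a toy Python experiment of our own, not taken from the papers below; H and all parameters are arbitrary), one can ask how close the syndrome of a random weight-w error is to uniform:

    import numpy as np

    # Toy experiment: statistical distance between the distribution of syndromes
    # H e^T, for e uniform of Hamming weight w, and the uniform distribution on
    # F_2^(n-k).  Smoothing-type bounds control exactly this kind of distance.

    rng = np.random.default_rng(1)
    n, k = 12, 6
    H = rng.integers(0, 2, (n - k, n))           # a random parity-check matrix

    def distance_to_uniform(w, samples=50_000):
        counts = np.zeros(2 ** (n - k))
        for _ in range(samples):
            e = np.zeros(n, dtype=int)
            e[rng.choice(n, size=w, replace=False)] = 1
            s = (H @ e) % 2
            counts[int("".join(map(str, s)), 2)] += 1
        p = counts / samples
        return 0.5 * np.abs(p - 1 / len(p)).sum()

    for w in (1, 2, 4, 6):
        print(w, distance_to_uniform(w))         # typically shrinks as w grows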
Reference papers:
- Brakerski, Zvika, et al. "Worst-case hardness for LPN and cryptographic hashing via code smoothing." (https://eprint.iacr.org/2018/279)
- Yu, Yu, and Jiang Zhang. "Smoothing out binary linear codes and worst-case sub-exponential hardness for LPN." (https://eprint.iacr.org/2020/870)
- Debris-Alazard, Thomas, and Nicolas Resch. "Worst and average case hardness of decoding via smoothing bounds." (https://eprint.iacr.org/2022/1744)
Supervisor:
Codes for Substring Edits
file synchronization, coding theory
Description
The document exchange problem considers the scenario where two parties have slightly different versions of the same file, and one of them wants to update their version to match the other. Instead of sending the whole file, the idea is to share only a small description of the changes. Substring edits, where changes happen locally within the file, are a natural way to model these differences. This seminar will examine how coding theory can be applied to enhance the efficiency of document exchange under substring edits.
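As a toy illustration of the setting (our own Python example; real constructions, unlike this sketch, must work without the sender seeing the receiver's file):

    # One substring edit replaces a single contiguous block, so a naive
    # "description of the changes" only needs the block's position, its old
    # length, and the replacement text -- far shorter than the file itself.

    alice = "the quick brown fox jumps over the lazy dog"
    bob   = "the quick red fox jumps over the lazy dog"     # "brown" -> "red"

    # locate the edited window by trimming the common prefix and suffix
    i = next(p for p in range(min(len(alice), len(bob))) if alice[p] != bob[p])
    j = next(s for s in range(min(len(alice), len(bob))) if alice[-1 - s] != bob[-1 - s])

    patch = (i, len(bob) - j - i, alice[i:len(alice) - j])  # position, old length, new text
    print(patch)

    # Bob applies the patch to recover Alice's version
    start, old_len, new_text = patch
    assert bob[:start] + new_text + bob[start + old_len:] == alice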
Supervisor:
Post-Quantum Key Exchange
Description
Post-quantum cryptography (PQC) has been an active area of research since the seminal work of Shor.
Indeed, most public-key cryptography currently deployed (such as RSA and elliptic-curve-based schemes) is vulnerable to quantum adversaries.
This applies in particular to public-key encryption (PKE) schemes.
An attacker could record encrypted traffic and later decrypt it once a sufficiently capable quantum computer becomes available, a strategy known as "harvest now, decrypt later."
Post-quantum secure alternatives can be constructed from hard problems on codes and lattices. Kyber (recently standardized as ML-KEM) and HQC (recently selected for standardization) follow an encryption-based approach, while, e.g., NewHope is based on a key reconciliation mechanism:
Alice and Bob obtain noisy variants of a common secret, and error correction removes this noise, allowing them to agree on the same shared key.
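The following toy sketch (our own simplification in Python, not NewHope's actual mechanism, and ignoring all the care real schemes take in choosing and protecting the hint) illustrates the reconciliation idea for a vector of coefficients modulo q:

    import numpy as np

    # Alice and Bob hold values a and b that differ only by a small error e.
    # Alice publishes a small correction ("hint"); both then round to the same
    # multiple of q/2 and extract one shared key bit per coefficient.

    rng = np.random.default_rng(2)
    q, n, max_err = 4096, 8, 100                  # modulus, #coefficients, |e| bound (< q/4)

    a = rng.integers(0, q, n)                                  # Alice's noisy shared value
    b = (a + rng.integers(-max_err, max_err + 1, n)) % q       # Bob's noisy shared value

    half = q // 2
    r = (np.round(a / half).astype(int) * half) % q            # nearest multiple of q/2
    key_alice = (r // half) % 2
    hint = (r - a) % q                                         # public correction

    r_bob = (np.round(((b + hint) % q) / half).astype(int) * half) % q
    key_bob = (r_bob // half) % 2

    assert np.array_equal(key_alice, key_bob)      # agreement whenever |e| < q/4
    print(key_alice)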
This project will survey constructions for exchanging a key in code-based and lattice-based cryptography. These constructions are to be categorized based on key properties, such as the underlying metric, bandwidth requirements, and hardness assumptions.
An overview of the key techniques is to be developed, with particular focus on the question of whether they can be transferred from codes to lattices and vice versa. The project requires reading and understanding several references; a good starting point can be the following works:
Aguilar-Melchor, Carlos, et al. "Efficient encryption from random quasi-cyclic codes." IEEE Transactions on Information Theory 64.5 (2018): 3927-3943.
Bos, Joppe, et al. "CRYSTALS-Kyber: a CCA-secure module-lattice-based KEM." 2018 IEEE European symposium on security and privacy (EuroS&P). IEEE, 2018.
Alkim, Erdem, et al. "Post-quantum Key Exchange — A new hope." 25th USENIX security symposium (USENIX Security 16). 2016.
Supervisor:
Understanding a Fully Homomorphic Encryption Scheme
fully homomorphic encryption
Description
Fully Homomorphic Encryption (FHE) enables computations on encrypted data without decryption, providing a powerful tool for privacy-preserving applications.
In this seminar, the student will study the BFV scheme, a widely used homomorphic encryption construction. The main goal is to understand the underlying mathematics, the design rationale of BFV, and how it fits into the broader landscape of lattice-based cryptography.
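The following toy sketch (our own illustration in Python, deliberately far simpler than BFV and in no way secure) conveys the flavour of such noise-based schemes: the message is scaled by Delta, masked by a secret inner product plus a small error, ciphertexts can be added, and decryption rounds the error away as long as it stays below Delta/2:

    import numpy as np

    # Toy LWE-flavoured, additively homomorphic symmetric scheme (NOT BFV).

    rng = np.random.default_rng(3)
    q, t, n = 2**15, 16, 32          # ciphertext modulus, plaintext modulus, dimension
    Delta = q // t

    s = rng.integers(0, 2, n)        # secret key

    def encrypt(m, noise_bound=4):
        a = rng.integers(0, q, n)
        e = int(rng.integers(-noise_bound, noise_bound + 1))
        return a, (int(a @ s) + Delta * m + e) % q

    def add(c1, c2):                 # homomorphic addition of ciphertexts
        return (c1[0] + c2[0]) % q, (c1[1] + c2[1]) % q

    def decrypt(c):
        a, b = c
        noisy = (b - int(a @ s)) % q             # = Delta*m + accumulated noise (mod q)
        return int(np.rint(noisy / Delta)) % t   # rounding removes the noise

    c5, c9 = encrypt(5), encrypt(9)
    assert decrypt(add(c5, c9)) == (5 + 9) % t

BFV itself works with polynomial rings rather than integer vectors and additionally supports homomorphic multiplication, which is where careful noise management becomes essential.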
The student is expected to:
- Review the foundational works by Brakerski and by Fan–Vercauteren.
- Summarize the scheme’s construction, security assumptions, and noise management.
- Present connections to applications and implementations.
References:
[1] Z. Brakerski, "Fully Homomorphic Encryption without Modulus Switching from Classical GapSVP," Advances in Cryptology – CRYPTO 2012, Lecture Notes in Computer Science, vol. 7417, Springer, 2012, pp. 868–886. doi: 10.1007/978-3-642-32009-5_50
[2] J. Fan and F. Vercauteren, "Somewhat Practical Fully Homomorphic Encryption," IACR Cryptology ePrint Archive, vol. 2012/144, 2012. Available: https://eprint.iacr.org/2012/144
Supervisor:
Membership Inference Attacks Against Machine Learning Models
Description
It has been widely acknowledged that machine learning models can leak a significant amount of (potentially private) information about their training data. Analyzing the amount of information leaked about the training data is important to judge the model's privacy. In practice, so-called membership inference attacks [1,2] are employed for such a privacy audit. A membership inference attack tries to predict whether a particular data sample was used in the training of a machine learning model. Besides empirical research, membership inference attacks have been put on a theoretical foundation through a Bayesian decision framework [3].
The goal of this seminar topic is to understand state-of-the-art membership inference attacks [1,2] and the Bayesian decision framework [3]. Students are encouraged to produce their own results using openly available implementations.
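As a rough illustration of the basic principle (a toy Python sketch of our own with simulated losses, far simpler than the attacks in [1,2]): members tend to have lower loss under the target model than non-members, so even a simple loss threshold already gives a membership signal.

    import numpy as np

    # Toy loss-threshold membership inference: predict "member" whenever a
    # sample's loss falls below a threshold.  The losses are simulated here;
    # in practice they come from querying the target model.

    rng = np.random.default_rng(4)
    member_loss     = rng.exponential(scale=0.3, size=5000)   # members fit better
    non_member_loss = rng.exponential(scale=1.0, size=5000)

    def attack_rates(threshold):
        tpr = float(np.mean(member_loss < threshold))       # members flagged correctly
        fpr = float(np.mean(non_member_loss < threshold))   # non-members flagged wrongly
        return tpr, fpr

    # report the true-positive rate at a low false-positive rate, the regime
    # emphasised in [1]
    rates = [attack_rates(th) for th in np.linspace(0, 3, 300)]
    print(max(tpr for tpr, fpr in rates if fpr <= 0.01))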
[1] N. Carlini, S. Chien, M. Nasr, S. Song, A. Terzis and F. Tramèr, "Membership Inference Attacks From First Principles," 2022 IEEE Symposium on Security and Privacy (SP), 2022.
[2] S. Zarifzadeh, P. Liu, R. Shokri, "Low-Cost High-Power Membership Inference Attacks," ICML 2024.
[3] A. Sablayrolles, M. Douze, C. Schmid, Y. Ollivier, H. Jegou, "White-box vs Black-box: Bayes Optimal Strategies for Membership Inference," Proceedings of the 36th International Conference on Machine Learning, 2019.
Prerequisites
Compulsory:
- Solid background in probability theory and hypothesis testing
- Basic knowledge about machine learning methods and neural networks
Optional:
- Implementations of machine learning methods in Python
Contact
E-mail: luis.massny@tum.de
Supervisor:
Key Attacks on the GPT Cryptosystem
rank-metric finite-field coding-theory cryptography
Description
The McEliece cryptosystem is a well-known public-key encryption scheme using Goppa codes in the Hamming metric. The GPT cryptosystem is the rank-metric counterpart of this scheme using Gabidulin codes. While this promises much better message security than McEliece, it has been subject to several successful key attacks, most prominent of which is known as Overbeck's attack [1, 2].
The task of the student is to understand the different proposed versions of the GPT cryptosystem and how they were attacked. Further, the student will look at a more recent version of this scheme [3] which has so far resisted Overbeck's attack.
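To convey, in simplified notation of our own (ignoring the scrambling and distortion of the actual GPT public key), the structure that these attacks exploit: a Gabidulin code Gab_k(g) over F_{q^m} is generated by the Moore matrix

    G = ( g_j^[i] ),  0 <= i <= k-1,  1 <= j <= n,  where x^[i] denotes x^(q^i).

Raising every entry of G to the q-th power shifts each row up by one Frobenius power, so stacking G, G^[1], ..., G^[f] spans only Gab_{k+f}(g), of dimension k+f, whereas for a random code one would expect dimension min(n, (f+1)k). This unexpectedly small dimension gives a distinguisher, and Overbeck's key attack builds on it.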
[1] R. Overbeck, “Public Key Cryptography based on Coding Theory,” PhD dissertation, Technische Universität Darmstadt, Darmstadt, 2007. Accessed: Oct. 01, 2024. [Online]. Available: https://tuprints.ulb.tu-darmstadt.de/823/
[2] R. Overbeck, “Structural Attacks for Public Key Cryptosystems based on Gabidulin Codes,” J Cryptol, vol. 21, no. 2, pp. 280–301, Apr. 2008, doi: 10.1007/s00145-007-9003-9.
[3] P. Loidreau, “A New Rank Metric Codes Based Encryption Scheme,” in Post-Quantum Cryptography, T. Lange and T. Takagi, Eds., Cham: Springer International Publishing, 2017, pp. 3–17. doi: 10.1007/978-3-319-59879-6_1.
Prerequisites
- Channel Coding
- Familiarity with the rank metric is beneficial (either from the course "Security in Communications and Storage" or "Coding Theory for Storage and Networks")
Supervisor:
Codes for Absorption Channels
Description
The absorption channel is a novel communication model inspired by neuronal information transmission. This model is particularly relevant to in-vivo nano-machines, emerging medical applications, and brain-machine interfaces that communicate via the nervous system.
Specifically, an absorption error occurs when a transmitted symbol is completely lost or "absorbed" during communication, rather than being substituted or distorted. This means the receiver gets fewer symbols than were originally sent, creating a form of synchronization error.
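A small toy simulation of this informal description (our own Python sketch; the precise channel model in [1] has more structure than shown here):

    import numpy as np

    # Each transmitted symbol is independently lost ("absorbed") with
    # probability p, so the receiver sees a shorter, unsynchronised sequence.

    rng = np.random.default_rng(5)

    def absorb(word, p=0.1):
        """Return the received sequence after independent absorptions."""
        return [s for s in word if rng.random() > p]

    codeword = list(rng.integers(0, 3, 20))   # a ternary word, as in the q >= 3 setting of [1]
    received = absorb(codeword)
    print(len(codeword), len(received))       # the receiver only sees the shorter word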
[1] focuses on developing error-correcting codes tailored to address absorption errors within this channel. It also proposes specific constructions to correct a single absorption error for alphabets of size three or more. Additionally, the paper applies syndrome compression techniques with pre-coding to develop subcodes capable of correcting multiple absorption errors while maintaining low redundancy. Efficient encoding and decoding algorithms are provided for each proposed code, enhancing their practical applicability in communication systems affected by absorption errors.
[2] extends this work by introducing novel methods for constructing non-binary error-correcting codes tailored to address absorption errors. The authors propose novel code constructions that effectively correct single and multiple absorption errors while minimizing redundancy. Their approach involves partitioning codewords into segments and applying specialized coding techniques to each segment, enabling efficient detection and correction of absorption errors.
[1] Z. Ye, W. Yu, and O. Elishco, “Codes Over Absorption Channels,” IEEE Trans. Inform. Theory, pp. 1–1, 2024, doi: 10.1109/TIT.2023.3346882.
[2] T. T. Nguyen, K. Cai, T. Q. S. Quek, and K. A. S. Immink, “Efficient Constructions of Non-Binary Codes Over Absorption Channels,” in 2024 IEEE International Symposium on Information Theory (ISIT), Athens, Greece: IEEE, Jul. 2024, pp. 1718–1723. doi: 10.1109/ISIT57864.2024.10619179.
Supervisor:
Graph Entropy in Combinatorics
Description
Information theory and combinatorics are deeply intertwined. Beyond the use of combinatorics in coding theory and compression, there are many, sometimes surprising, connections.
One such connection is the use of graph entropy in combinatorial existence proofs.
This seminar topic is about explaining the proof technique introduced in [1] and [2] and applied in [3]. The goal is a tutorial-style paper with a focus on clear exposition through well-chosen worked examples and visualizations.
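For orientation, one convenient form of Körner's graph entropy (stated informally, in our own notation) is

    H(G, P) = min I(X ; Y),

where the vertex X is distributed according to P, Y ranges over the independent sets of G containing X, and the minimum is over all admissible joint distributions of (X, Y). The property driving the combinatorial applications in [1]-[3] is subadditivity: if the edge set of G is covered by graphs G_1, ..., G_t, then H(G, P) <= H(G_1, P) + ... + H(G_t, P), so a lower bound on H(G, P) together with upper bounds on each H(G_i, P) forces t to be large.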
[1] M. Fredman, and J. Komlós, On the Size of Separating Systems and Perfect Hash Functions, SIAM J. Alg. Disc. Meth., 5 (1984), pp. 61-68.
[2] J. Körner, Fredman-Komlós bounds and information theory, SIAM J. on Algebraic and Discrete Meth., 7(4), (1986), pp. 560–570.
[3] N. Alon, E. Fachini, and J. Körner, Locally Thin Set Families, Combinatorics, Probability and Computing, vol. 9 (Nov. 2000), pp. 481–488.
Supervisor:
Ordered Statistics Decoder: A Review
Ordered Statistics Decoder (OSD), universal decoders, short blocklength coding
The student is expected to provide an overview of how the decoder works, its strengths, and its limitations.
Description
5G and beyond wireless communication systems have given, and are expected to give, rise to many new applications with stringent requirements that were not achievable with 4G or earlier systems. These requirements include, among others, low latency, high reliability, and maximized battery life.
To this end, short-blocklength coding has gained particular interest, and new research on efficient coding schemes is emerging. An idea previously discussed in the literature is the Ordered Statistics Decoder (OSD) [1]. Recent works address its strengths and shortcomings and suggest ways to make this decoder more practical and efficient in the short-blocklength regime.
The objective for the student is to study the seminal paper [1], which introduces the idea of ordered statistics decoding, and to describe how the decoder works, its strengths, and its limitations. The student is then expected to understand and explain how the newer algorithms [2], [3] work and to discuss whether and how they improve performance.
[3] "Box and match techniques applied to soft-decision decoding" (IEEE Xplore).
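For orientation, a compact illustrative implementation of the basic order-w OSD idea on a toy [7,4] Hamming code (our own Python sketch of the procedure described in [1]: sort positions by reliability, bring the generator matrix into systematic form on the most reliable independent positions, hard-decide those, and re-encode all test error patterns of weight at most w):

    import itertools
    import numpy as np

    rng = np.random.default_rng(6)

    # [7,4] Hamming code, systematic generator matrix
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    k, n = G.shape

    def systematize(G):
        """Gaussian elimination over GF(2); returns a systematic generator and
        the column permutation applied when a pivot column is dependent."""
        G = G.copy() % 2
        k, n = G.shape
        perm = np.arange(n)
        for r in range(k):
            c = next(j for j in range(r, n) if G[r:, j].any())   # usable pivot column
            G[:, [r, c]] = G[:, [c, r]]
            perm[[r, c]] = perm[[c, r]]
            p = r + int(np.nonzero(G[r:, r])[0][0])              # row holding the pivot
            G[[r, p]] = G[[p, r]]
            for i in range(k):
                if i != r and G[i, r]:
                    G[i] ^= G[r]                                 # clear the pivot column
        return G, perm

    def osd_decode(y, G, order=1):
        """Order-`order` OSD for BPSK (bit 0 -> +1, bit 1 -> -1) over AWGN."""
        k, n = G.shape
        reliability = np.argsort(-np.abs(y))          # most reliable positions first
        G1, perm = systematize(G[:, reliability])
        pos = reliability[perm]                       # original index of each column of G1
        y1 = y[pos]
        hard = (y1 < 0).astype(int)                   # hard decisions, most reliable first
        best, best_metric = None, -np.inf
        for w in range(order + 1):
            for flips in itertools.combinations(range(k), w):
                u = hard[:k].copy()
                u[list(flips)] ^= 1                   # test error pattern of weight w
                cand = (u @ G1) % 2                   # re-encoded candidate codeword
                metric = np.sum((1 - 2 * cand) * y1)  # correlation with the channel output
                if metric > best_metric:
                    best, best_metric = cand.copy(), metric
        codeword = np.empty(n, dtype=int)
        codeword[pos] = best                          # undo the reliability permutation
        return codeword

    u = rng.integers(0, 2, k)
    x = 1 - 2 * ((u @ G) % 2)                         # BPSK-modulated codeword
    y = x + rng.normal(0, 0.6, n)                     # noisy channel output
    print("sent:   ", (u @ G) % 2)
    print("decoded:", osd_decode(y, G, order=1))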
Prerequisites
It is helpful for the student to have some background in linear algebra and channel coding, e.g., what a channel code is, how it is represented, and how it is used. However, introductory material can be provided if a student is eager to learn, and questions on topics that are unclear to the student are always welcome.
In general, this is a student-driven task; it is therefore the student's job to plan and execute the review of the given paper. Support and guidance will be gladly provided if requested. Directly after the topic is assigned, there will also be a clear discussion of what is required in the final presentation, as well as the evaluation criteria.