Dr. Rawad Bitar

Technische Universität München

Professorship of Coding and Cryptography (Prof. Wachter-Zeh)

Postal address:
Theresienstr. 90
80333 München

Biography

I am a senior researcher and lecturer pursuing a habilitation with Prof. Dr.-Ing. Antonia Wachter-Zeh, Prof. Deniz Gündüz, and Prof. Sidharth Jaggi as mentors. I obtained my PhD from the ECE department of Rutgers University in January 2020. During my PhD, I held short-term visiting positions at Aalto University, the Technical University of Berlin, and the Chinese University of Hong Kong. In addition, I spent three years as a PhD candidate at the ECE department of the Illinois Institute of Technology (IIT). From August 2014 until January 2020, I was a member of the CSI lab supervised by Prof. Salim El Rouayheb.


As for my studies, I received a master's degree in Information and Communication from the Lebanese University in 2014, after writing my thesis at IIT. I graduated as a Computer and Communication Engineer from the Lebanese University in 2013, after completing my engineering senior project at the Center of Nuclear Science and Science of Matter (CSNSM) in Paris, France, and an internship at the procsim-consulting company at EPFL, Lausanne, Switzerland, in 2012.

 

Research

My research interests center on the privacy, scalability, security, and reliability of distributed systems. The applications may differ, but the goal remains the same: to study the fundamental theoretical limits of innovative storage and computing systems and to design codes achieving those limits. My current work is motivated by the following research directions.

  • Federated learning
  • Private and secure distributed computing
  • Distributed storage
  • DNA-based storage systems
  • Private and secure network coding

Awards

  • DFG (German Research Foundation) grant for a temporary principal investigator position, "Private Secure and Efficient Codes for Distributed Machine Learning" (2023–2026)
  • EuroTech Visiting Research Programme grant (March 2023)
  • Best Poster Award on Optimization and Machine Learning at the Princeton Day of Optimization, Princeton, New Jersey (September 2018)

Service

Workshop Co-Organizer

  • Munich Workshop on Coding and Cryptography 2022

Guest Editor

  • Frontiers in Communications and Networks

Technical Program Committee (TPC) member

  • Conference on Information-Theoretic Cryptography (ITC) 2023
  • IEEE Information Theory Workshop (ITW) 2023
  • 13th Annual Non-Volatile Memories Workshop (NVMW) 2022

Session Chair

  • IEEE Information Theory Workshop (ITW) 2022
  • IEEE International Symposium on Information Theory (ISIT) 2022
  • IEEE International Symposium on Information Theory (ISIT) 2021

Reviewer

  • IEEE Transactions on Information Theory
  • IEEE Journal on Selected Areas in Information Theory (JSAIT)
  • IEEE Transactions on Information Forensics and Security
  • IEEE/ACM Transactions on Networking
  • IEEE Journal on Selected Areas in Communications (JSAC)
  • IEEE Transactions on Communications
  • IEEE Transactions on Parallel and Distributed Systems
  • IEEE Transactions on Emerging Topics in Computing
  • IEEE Transactions on Vehicular Technology
  • Journal of Parallel and Distributed Computing (Elsevier)
  • IET Information Security
  • IEEE International Symposium on Information Theory (ISIT)
  • IEEE Information Theory Workshop (ITW)
  • IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • IEEE Global Communications Conference (GLOBECOM): Communication Theory
  • IEEE Global Communications Conference (GLOBECOM) Workshops: Network Coding and Applications (NetCod)
  • IEEE Wireless Communications and Networking Conference (WCNC)

Teaching

  • Winter semester 2021/2022: Security in Communications and Networks, jointly with Prof. Dr.-Ing. Antonia Wachter-Zeh
  • Summer semester 2021: Coding Theory for Storage and Networks, jointly with Dr.-Ing. Sven Puchinger
  • Winter semester 2020/2021: Security in Communications and Networks, jointly with Prof. Dr.-Ing. Antonia Wachter-Zeh
  • Summer semester 2020: Coding Theory for Storage and Networks, jointly with Dr. Alessandro Neri

Theses

Available Theses

Adapting Redundancy in Distributed Learning

Keywords:
Distributed Learning, Gradient Coding, Straggler Tolerance

Description

In this project, we investigate the interplay between redundancy and straggler tolerance in distributed learning.

The setting is that of a main node distributing computational tasks to available workers as part of a machine learning algorithm, e.g., training a neural network. Waiting for all workers to return their computations is slowed down by stragglers, i.e., slow or unresponsive nodes. The effect of stragglers can be mitigated by introducing redundancy or by leveraging the convergence properties of the machine learning algorithm.

The goal of this work is to determine when redundancy is needed. To this end, we first aim to analyze the convergence speed with and without redundancy. We then aim to design schemes that adaptively increase the redundancy to speed up convergence.
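To make the trade-off concrete, here is a minimal Python simulation of replication-based straggler mitigation. All sizes and the cyclic placement are illustrative assumptions, and the sketch is a generic replication baseline rather than the schemes of the papers listed below: each data partition is replicated on two workers, and the main node stops waiting as soon as the responses it has received cover every partition.

    import numpy as np

    rng = np.random.default_rng(seed=1)

    # Toy sizes (illustrative assumptions, not taken from the papers below).
    n_workers = 8     # workers available to the main node
    n_parts = 8       # data partitions, one gradient per partition
    redundancy = 2    # each partition is held by `redundancy` workers
    dim = 4           # gradient dimension

    grads = rng.normal(size=(n_parts, dim))  # per-partition gradients

    # Cyclic replication: worker i holds partitions i, ..., i+redundancy-1 (mod n_parts).
    holds = [[(i + r) % n_parts for r in range(redundancy)] for i in range(n_workers)]

    # Simulate stragglers: workers respond in a random order; the main node
    # stops as soon as the received partitions cover the whole dataset.
    responders, covered = [], set()
    for i in rng.permutation(n_workers):
        responders.append(i)
        covered.update(holds[i])
        if len(covered) == n_parts:
            break

    # Take each partition's gradient from the first responder holding it.
    recovered = np.zeros((n_parts, dim))
    seen = set()
    for i in responders:
        for p in holds[i]:
            if p not in seen:
                recovered[p] = grads[p]
                seen.add(p)

    full_grad = recovered.mean(axis=0)
    print(f"waited for {len(responders)} of {n_workers} workers;",
          "exact recovery:", np.allclose(full_grad, grads.mean(axis=0)))

Increasing the redundancy lets the main node stop earlier at the cost of more computation per worker; quantifying how this trade-off affects convergence speed is exactly the question above.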

Further reading: 

R. Bitar, M. Wootters and S. El Rouayheb, Stochastic Gradient Coding for Straggler Mitigation in Distributed Learning, IEEE Journal on Selected Areas in Information Theory (JSAIT), Vol. 1, No. 1, May 2020. arXiv:1905.05383

S. Kas Hanna, R. Bitar, P. Parag, V. Dasari and S. El Rouayheb, Adaptive Stochastic Gradient Descent for Fast and Communication-Efficient Distributed Learning, preprint, arXiv:2208.03134.

Prerequisites

Knowledge of the following topics:

  • Probability Theory
  • Gradient descent and stochastic gradient descent
  • Coding theory

Independence and motivation to work on a research topic.

Experience implementing neural networks is a plus.

 

Contact

Dr. Rawad Bitar: rawad.bitar@tum.de


MAB-Based Efficient Distributed ML on the Cloud

Keywords:
Distributed Machine Learning (ML), Multi-Armed Bandits (MABs), Cloud Simulations (AWS, GCP, ...)

Description

We consider the problem of running a distributed machine learning algorithm on the cloud. This poses several challenges. In particular, cloud instances may differ in performance/speed. To fully leverage the performance of the instances, we want to characterize their speeds and preferentially use the fastest ones. To explore the speeds of the instances while exploiting them (i.e., while assigning computational tasks), we use the theory of multi-armed bandits (MABs).

The goal of this research internship is to start by implementing existing theoretical algorithms [1] and possibly adapting them based on experimental observations; a minimal bandit sketch follows the reference below.

[1] M. Egger, R. Bitar, A. Wachter-Zeh and D. Gündüz, Efficient Distributed Machine Learning via Combinatorial Multi-Armed Bandits, submitted to IEEE Journal on Selected Areas in Communications (JSAC), 2022.
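As a flavor of the bandit component, here is a minimal single-play UCB1 sketch in Python; the instance speeds and the noise model are hypothetical, and this is a textbook baseline rather than the combinatorial algorithm of [1].

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Hypothetical mean speeds of four cloud instances (unknown to the learner).
    true_speeds = np.array([1.0, 1.8, 0.6, 1.4])
    n_arms, horizon = len(true_speeds), 2000

    counts = np.zeros(n_arms)  # how often each instance was used
    means = np.zeros(n_arms)   # empirical average speed per instance

    for t in range(1, horizon + 1):
        if t <= n_arms:
            arm = t - 1                                      # try every instance once
        else:
            ucb = means + np.sqrt(2.0 * np.log(t) / counts)  # UCB1 index
            arm = int(np.argmax(ucb))
        reward = true_speeds[arm] + rng.normal(scale=0.3)    # noisy observed speed
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]    # running-mean update

    print("pulls per instance:", counts.astype(int))  # the fastest instance dominates

The internship would move from this single-instance toy to the combinatorial setting of [1], where subsets of instances are selected per round, and validate the behavior on real cloud measurements.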

Prerequisites

  • Information Theory
  • Machine Learning Basics
  • Python (Intermediate Level)


Ongoing Theses

Coding for Privacy and Security in Federated Learning

Description

In this internship, the student will read and summarize recent progress on codes for privacy and security in federated learning.

Depending on the student's progress, we can investigate new ideas for security in private federated learning that improve upon the state-of-the-art.
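For orientation, a core primitive in this area is secure aggregation: the server should learn only the sum of the clients' model updates, never an individual update. The following Python toy illustrates the idea with additive secret sharing; the Gaussian masks are a simplification (practical schemes use uniform masks over a finite field), and it does not correspond to any specific scheme from the literature to be surveyed.

    import numpy as np

    rng = np.random.default_rng(seed=2)

    n_clients, dim = 3, 5
    updates = rng.normal(size=(n_clients, dim))  # local model updates (private)

    # Client i splits its update into n_clients additive shares, one per peer;
    # the last share is chosen so that the shares sum to the true update.
    shares = rng.normal(size=(n_clients, n_clients, dim))
    shares[:, -1, :] = updates - shares[:, :-1, :].sum(axis=1)

    # Client j forwards the sum of the shares it received; the server adds
    # these partial sums and obtains the aggregate without seeing any update.
    partial_sums = shares.sum(axis=0)
    aggregate = partial_sums.sum(axis=0)

    print("exact aggregation:", np.allclose(aggregate, updates.sum(axis=0)))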


The Interplay of Fairness and Privacy in Federated Learning

Description

We study the impact of different measures of fairness on the privacy guarantees for individual clients in federated learning.
