Jonas Kantic, M.Sc.

Research Associate 

Technische Universität München
Department of Electrical and Computer Engineering
Chair of Integrated Systems
Arcisstr. 21
80333 München
Germany

Tel.: +49.89.289.22962
Fax: +49.89.289.28323
Building: N1 (Theresienstr. 90)
Room: N2118
Email: jonas.kantic@tum.de

Curriculum Vitae

Education

  • 2017 - 2020 Master's Studies in Technical Informatics, Leibniz University Hannover
  • 2018 - 2019 Chinese Language, Beijing Foreign Studies University, Beijing
  • 2013 - 2017 Bachelor's Studies in Technical Informatics, Leibniz University Hannover

Work Experience

  • Since 2022 PhD student at the Chair of Integrated Systems, Technical University of Munich
  • 2021 Regular Research Assistant, Institute of Microelectronic Systems, Leibniz University Hannover
  • 2019 - 2020 Internship: BMW China Services, Beijing
  • 2014 - 2020 Student Research Assistant, Institute of Microelectronic Systems, Leibniz University Hannover

Open Student Work

Current Student Work

Leveraging Sparsity in CNN Accelerators for Efficient Edge AI Inference

Keywords:
CNN, Accelerator, Sparsity, Systolic Array

Description

Convolutional Neural Networks (CNNs) are spreading into all sectors, and their use is no longer limited to powerful computers: they are increasingly deployed on embedded devices with limited capabilities. Running such complex architectures on resource-constrained devices is challenging, however, since they are expensive in terms of memory usage, energy consumption, and execution time. Ongoing research therefore aims at optimization techniques in CNN accelerators for efficient inference.

One property that can be exploited in CNNs is sparsity, i.e., the fraction of zeros among the values involved in a computation. In theory, zeros need not be processed at all, since they do not contribute to the final result. This opens an opportunity to optimize the computations performed in a CNN accelerator and thereby improve its efficiency.
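As a hedged illustration of the zero-skipping idea (a plain NumPy sketch of the arithmetic, not a model of any particular accelerator datapath), compare a dense multiply-accumulate with one that skips zero operands:

```python
import numpy as np

def dense_mac(weights, activations):
    """Baseline multiply-accumulate: every product is computed."""
    acc = 0.0
    for w, a in zip(weights, activations):
        acc += w * a
    return acc

def sparse_mac(weights, activations):
    """Zero-skipping multiply-accumulate: products with a zero operand
    are skipped, since they contribute nothing to the result."""
    acc = 0.0
    ops = 0  # multiplications actually performed
    for w, a in zip(weights, activations):
        if w != 0 and a != 0:
            acc += w * a
            ops += 1
    return acc, ops

# A 50% sparse weight vector: half of the multiplications can be skipped.
w = np.array([0.5, 0.0, -1.0, 0.0])
a = np.array([1.0, 2.0, 3.0, 4.0])
result, ops = sparse_mac(w, a)
```

In hardware, the same idea translates to gating or skipping multiplier activity when an operand is zero, trading extra control logic for fewer operations and reduced switching activity.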

The motivation behind this Master’s thesis is to investigate the architectures used
inside modern CNN hardware accelerators and to define architecture changes to
leverage sparse computations for improved efficiency.
The work packages of the Master's thesis include:

  1. Research state-of-the-art architectures used in CNN hardware accelerators.
  2. Research techniques to improve hardware efficiency by leveraging sparsity.
  3. Design architecture changes for an existing CNN accelerator to optimize sparse calculations.
  4. Implement the architecture changes in RTL.
  5. Evaluate the new accelerator architecture against the original one.

 

Contact

frieder.jespers@nxp.com

Supervisor:

Jonas Kantic, Frieder Jespers (NXP Semiconductors)

Memory Capacity in Echo State Networks

Keywords:
Reservoir Computing, ESN, Memory Capacity
Short Description:
In this seminar work, the memory capacity of echo state networks shall be analyzed by reviewing relevant literature.

Description

In this seminar work, the memory capacity of echo state networks shall be analyzed by reviewing relevant literature. The following questions indicate possible topics for discussion and analysis:

  • How can memory in echo state networks be characterized and differentiated?
  • How can the memory capacity be measured?
  • Which parameters and / or architectural design decisions have an effect on the memory capacity?
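To make the question of measuring memory capacity concrete, the sketch below estimates short-term memory capacity in the spirit of Jaeger's definition: the sum over delays of the squared correlation between the delayed input and its best linear reconstruction from the reservoir state. Reservoir size, spectral radius, and all other hyperparameters are arbitrary illustrative choices, not values from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Small random reservoir (illustrative sizes, not from a specific paper).
N, T, washout, max_delay = 50, 2000, 200, 30
W = rng.normal(size=(N, N))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # spectral radius below 1
w_in = rng.uniform(-0.5, 0.5, size=N)

u = rng.uniform(-1, 1, size=T)  # i.i.d. input signal
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    states[t] = x

# Memory capacity: sum over delays k of the squared correlation between
# the delayed input u(t-k) and its best linear reconstruction from x(t).
X = states[washout:]
mc = 0.0
for k in range(1, max_delay + 1):
    target = u[washout - k:T - k]
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
    y = X @ w_out
    mc += np.corrcoef(y, target)[0, 1] ** 2
```

Each delay contributes at most 1 to the sum, so the estimate is bounded by the number of delays considered; the theoretical bound discussed in the literature is the reservoir size N.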

The following papers may be used as a starting point:

  1. Jaeger: "Short term memory in echo state networks"
  2. Verstraeten et al.: "Memory versus Non-Linearity in Reservoirs"
  3. Rodan and Tino: "Minimum complexity echo state network"

 

Prerequisites

To successfully carry out this seminar work, you should:

  • work independently and in a self-organized manner
  • have strong reading and writing comprehension of scientific papers
  • perform structured literature research

 

Contact

Jonas Kantic

Room: N2118
Tel.: +49.89.289.22962
E-Mail: jonas.kantic@tum.de

 

Supervisor:

Hyperdimensional Computing and Integer Echo State Networks

Keywords:
Hyperdimensional Computing, Reservoir Computing, Echo State Networks
Short Description:
In this seminar work, the student is asked to discuss Hyperdimensional Computing and its application in Reservoir Computing.

Description

Echo State Networks (ESNs) are a recent approach to adaptive, AI-based time series processing and analysis. In contrast to classical deep neural networks, they consist of only three layers: an input layer, the reservoir, and an output layer. This makes ESNs a promising architecture for efficient hardware implementations.

Hyperdimensional Computing is a mathematical framework for computing in distributed representations with high-dimensional random vectors and neural symbolic representation.

Recently, the reservoir of an ESN has been realized as a hypervector of n-bit integers operated on with hyperdimensional computing, resulting in an approximation of ESNs called the Integer Echo State Network (intESN) [1].

In this seminar work, the student shall summarize the hyperdimensional computing framework and discuss its application to ESNs in the form of the intESN.
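As a hedged sketch of the intESN reservoir update described in [1] (dimensionality, clipping limit, and number of quantization levels are illustrative choices): the state is an integer hypervector that is cyclically shifted, added to a bipolar hypervector encoding the quantized input, and clipped componentwise.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative choices: dimensionality, clipping limit, quantization levels.
N, kappa, levels = 1000, 7, 11

# Item memory: one random bipolar hypervector per quantization level.
item_memory = rng.choice([-1, 1], size=(levels, N))

def encode(u):
    """Map a scalar in [-1, 1] to the hypervector of its nearest level."""
    idx = int(round((u + 1) / 2 * (levels - 1)))
    return item_memory[idx]

def step(x, u):
    """One intESN update: cyclic shift (the permutation), add the encoded
    input, and clip each integer component to [-kappa, kappa]."""
    x = np.roll(x, 1)
    x = x + encode(u)
    return np.clip(x, -kappa, kappa)

x = np.zeros(N, dtype=int)
for u in rng.uniform(-1, 1, size=100):
    x = step(x, u)
```

The clipping plays the role of the nonlinearity, and the bounded integer state is what makes the model attractive for digital hardware.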

Relevant Paper:

[1] D. Kleyko, E. P. Frady, M. Kheffache, and E. Osipov. "Integer Echo State Networks: Efficient Reservoir Computing for Digital Hardware." arXiv, 2017. DOI: https://doi.org/10.1109/TNNLS.2020.3043309

 

Prerequisites

To successfully carry out this seminar work, you should:

  • work independently and in a self-organized manner
  • have strong reading and writing comprehension of scientific papers
  • perform structured literature research

 

Contact

Jonas Kantic

Room: N2118
Tel.: +49.89.289.22962
E-Mail: jonas.kantic@tum.de

 

Supervisor:

Hybrid Cellular Automata in Reservoir Computing

Keywords:
Cellular Automata, Reservoir Computing, ReCA
Short Description:
In this student work, hybrid CA-based reservoirs shall be analyzed, implemented, and evaluated.

Description

Introduction

Reservoir Computing (RC) is a promising and efficient computing framework that has been derived from neural networks and is especially suitable for time series data. In contrast to deep neural networks, which stack several layers one after the other, RC models only have three layers: an input layer, a reservoir, and an output layer.
In the original formulation, the reservoir consists of several recurrently connected neurons and the model is called an Echo State Network (ESN). However, other reservoir implementations have been developed and employed since, because any dynamic system can serve as a reservoir within the RC framework.

Among the simplest types of dynamic systems are elementary Cellular Automata (CA). Acting on a regular grid of cells, each cell of a CA changes its state over time according to a simple predefined local rule. Despite their simplicity, CA can exhibit rich and complex behavior, and simple elementary CA have been shown to serve effectively as the reservoir in RC models. A modification of this approach is to use hybrid CA, in which not every cell adheres to the same rule.
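The hybrid-CA idea can be sketched in a few lines (rule numbers, grid size, and the uniform/hybrid split below are arbitrary illustrative choices): each cell carries its own Wolfram rule number, and a uniform rule array recovers the regular CA as a special case.

```python
import numpy as np

def ca_step(state, rules):
    """One synchronous update of an elementary CA with periodic boundaries.
    `rules` holds one Wolfram rule number per cell, so a uniform array
    gives a regular CA and a mixed array gives a hybrid CA."""
    left = np.roll(state, 1)
    right = np.roll(state, -1)
    neighborhood = 4 * left + 2 * state + right  # index 0..7 into the rule table
    # Bit `neighborhood` of each cell's own rule number gives its next state.
    return (rules >> neighborhood) & 1

rng = np.random.default_rng(2)
n = 32
state = rng.integers(0, 2, size=n)

uniform_rules = np.full(n, 90)               # regular rule-90 CA
hybrid_rules = rng.choice([90, 150], size=n) # hybrid CA mixing rules 90 and 150

for _ in range(10):
    state = ca_step(state, hybrid_rules)
```

Feeding the evolving state vectors into a trained linear readout then yields a ReCA-style model.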

Tasks

In this work, the tasks may include:

  • To implement a hybrid CA-based reservoir for RC models in Python using the TensorFlow framework
  • To evaluate the hybrid CA-based RC model and compare it with regular CA reservoirs as well as ESNs
  • Optional: To analyze the dynamic behavior of hybrid CA in terms of cycles and transients, and compare it with homogeneous (regular) CA

Prerequisites

In order to successfully carry out this work, you should:

  • be able to work independently and in a self-organized manner
  • have strong mathematical skills; preferably have knowledge of finite fields / Galois fields
  • have good practice in programming with Python and the TensorFlow framework
  • have sound knowledge of machine learning principles

Contact

Jonas Kantic | Room: N2118 | Tel: +49.89.289.22962 | E-Mail: jonas.kantic@tum.de

Supervisor:

Completed Student Work

Contact

Jonas Kantic

Chair of Integrated Systems

Office N2118, Building N1

Supervisor:

Jonas Kantic

Student

Yizhe Zhang

Contact

Email: fabian.legl@ifta.com

Supervisor:

Jonas Kantic, Fabian Legl (IfTA)

Supervisor:

Jonas Kantic

Publications


Preprints

J. Kantic, F. C. Legl, W. Stechele, and J. Hermann. "ReLiCADA - Reservoir Computing using Linear Cellular Automata Design Algorithm." arXiv, 2023, eprint: arXiv:2308.11522, DOI: https://doi.org/10.48550/arXiv.2308.11522.

Publications (Pre-TUM)

Journals

S. C. Klein, J. Kantic, and H. Blume. "Fixed Point Analysis Workflow for Efficient Design of Convolutional Neural Networks in Hearing Aids." Current Directions in Biomedical Engineering, vol. 7, no. 2, 2021, pp. 787-790, DOI: https://doi.org/10.1515/cdbme-2021-2201.