Neural Network Quantization
Neural Networks, Compression, Accuracy
Neural networks achieve state-of-the-art performance in many complex machine learning tasks (e.g., object detection, image classification, audio recognition). In the process, the size of the corresponding models (weights and biases) has exploded, resulting in high power consumption, high inference latency, and increased memory requirements. It is therefore of interest to find ways to compress these models in order to save energy, speed up inference, and reduce storage requirements. One very popular method for doing so is quantization. The task of the student is to explain how Post Training Quantization (PTQ) is applied, given that fixed-point representation is assumed.
Reference: A White Paper on Neural Network Quantization, arXiv:2106.08295
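As a first taste of the kind of scheme the paper covers, the following is a minimal sketch of uniform affine (asymmetric) quantization applied post-training to a weight tensor. The function names and the 8-bit setting are illustrative choices, not taken from the paper:

```python
import numpy as np

def quantize_affine(w, num_bits=8):
    """Uniform affine PTQ sketch: w ~ scale * (q - zero_point)."""
    qmin, qmax = 0, 2 ** num_bits - 1
    # The quantization range must include 0 so that zero is exactly representable.
    w_min, w_max = min(w.min(), 0.0), max(w.max(), 0.0)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = int(round(qmin - w_min / scale))
    q = np.clip(np.round(w / scale) + zero_point, qmin, qmax).astype(np.int32)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map integers back to (approximate) real weights."""
    return scale * (q.astype(np.float32) - zero_point)
```

The round-trip error per weight is bounded by roughly one quantization step `scale`, which is the basic trade-off PTQ makes between bit width and accuracy.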
It is nice for the student to have some background knowledge on deep learning, e.g., what a neural network is, how it is represented, how it is trained, etc. However, introductory material can be provided if a student is eager to learn, and questions on topics that are unclear to the student are always welcome.
In general, this is a student-driven task, therefore it is the student's job to plan and execute the review of the given paper. Support and guidance will be gladly provided if requested. There will also be a clear discussion of what is required in the final presentation as well as evaluation points, directly after the topic assignment.
Ordered Statistics Decoder: A Review
Ordered Statistics Decoder (OSD), universal decoders, short blocklength coding
The student is expected to provide an overview of how the decoder works, its strengths, and its limitations.
5G and beyond wireless communication systems have given, and are expected to give, rise to many new applications with stringent requirements that were not achievable with 4G or earlier systems. These requirements include low latency, high reliability, and maximized battery life, among others.
To this end, short blocklength coding has gained particular interest, and new research on efficient coding schemes is emerging. One idea previously discussed in the literature is the Ordered Statistics Decoder (OSD). Recent works in the literature address its strengths and shortcomings, and suggest ways to make this decoder more practical and efficient in the short blocklength regime.
The objective of the student is to study the seminal paper that introduced ordered statistics decoding, and to be able to describe how the decoder works, what its strengths are, and what its limitations are.
Reference: Soft-decision decoding of linear block codes based on ordered statistics
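To give an idea of what the student will encounter, the core OSD loop (sort by reliability, find the most reliable independent positions via Gaussian elimination, then re-encode candidates with up to `order` bit flips) can be sketched on a toy (7,4) Hamming code. This is a brute-force illustration under standard BPSK assumptions, not the optimized decoder from the paper; all names are illustrative:

```python
import numpy as np
from itertools import combinations

def osd_decode(y, G, order=1):
    """Order-`order` OSD sketch. y: real BPSK channel output (bit b -> 1 - 2b
    plus noise); G: k x n binary generator matrix. Returns a codeword."""
    k, n = G.shape
    # Step 1: sort positions by decreasing reliability |y_i|.
    perm = np.argsort(-np.abs(y))
    Gp = G[:, perm] % 2
    # Step 2: Gaussian elimination to locate the k most reliable
    # independent positions (the pivot columns).
    pivots, row = [], 0
    for col in range(n):
        if row == k:
            break
        pr = next((r for r in range(row, k) if Gp[r, col]), None)
        if pr is None:
            continue  # column dependent on earlier pivots; skip it
        Gp[[row, pr]] = Gp[[pr, row]]
        for r in range(k):
            if r != row and Gp[r, col]:
                Gp[r] = (Gp[r] + Gp[row]) % 2
        pivots.append(col)
        row += 1
    # Step 3: hard-decide the most reliable basis (negative sample <-> bit 1).
    mrb = (y[perm][pivots] < 0).astype(int)
    # Step 4: reprocessing -- flip up to `order` basis bits, re-encode each
    # candidate, and keep the codeword closest to y in Euclidean distance.
    best_cw, best_metric = None, np.inf
    for l in range(order + 1):
        for flips in combinations(range(k), l):
            info = mrb.copy()
            info[list(flips)] ^= 1
            cw_p = info @ Gp % 2  # candidate codeword, permuted order
            metric = np.sum((y[perm] - (1 - 2 * cw_p)) ** 2)
            if metric < best_metric:
                best_metric, best_cw = metric, cw_p
    cw = np.empty(n, dtype=int)
    cw[perm] = best_cw
    return cw
```

Even order 0 corrects errors that land on unreliable positions, since those positions are excluded from the re-encoding basis; raising the order trades exponentially more candidates for better performance, which is exactly the limitation the student should discuss.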