## Neural Networks (NNs) for Direct Detection

#### Description

In [1] we consider a short-reach fiber-optic link with a single photodiode at the receiver, a so-called direct detector (DD). The DD outputs a signal proportional to the squared **magnitude** of its input. At first glance, this makes phase modulation challenging. In [1] we showed that inter-symbol interference (ISI) can be used to retrieve the phase, and a suboptimal symbol-wise MAP detector was proposed for phase retrieval. However, the detector's complexity grows exponentially with the channel memory, i.e., with the amount of ISI.
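The role of ISI in phase retrieval can be illustrated with a toy square-law model. The 2-tap response below is purely illustrative and not the channel model of [1]: without memory the DD output is phase-blind, while with memory the cross-terms between neighboring symbols depend on their phase differences.

```python
import numpy as np

def direct_detect(x, h):
    """Noise-free square-law (photodiode) output of an ISI channel."""
    return np.abs(np.convolve(h, x)) ** 2

# Two QPSK sequences that differ only in the phase of the middle symbol
x1 = np.array([1, 1j, -1])
x2 = np.array([1, -1, -1])

# Memoryless channel (no ISI): the DD outputs are identical,
# so the phase of the middle symbol cannot be recovered.
print(direct_detect(x1, np.array([1.0])))
print(direct_detect(x2, np.array([1.0])))

# Hypothetical 2-tap response: the cross-term h0*h1*Re(x_k conj(x_{k-1}))
# depends on the phase difference, so the outputs now differ.
h = np.array([1.0, 0.6])
print(direct_detect(x1, h))
print(direct_detect(x2, h))
```

The NN detector would have to learn exactly this dependence of the samples on the phase differences of neighboring symbols.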

The task of the student is to efficiently approximate the MAP detector with an NN. An appropriate NN type and structure must be selected. Finally, lower bounds on the achievable rates are computed to evaluate the performance of the NN and to compare it to the MAP detector [1].

[1] D. Plabst *et al.*, "Achievable Rates for Short-Reach Fiber-Optic Channels With Direct Detection," in *Journal of Lightwave Technology*, vol. 40, no. 12, pp. 3602-3613, June 2022, doi: 10.1109/JLT.2022.3149574.

#### Prerequisites

Machine Learning

Statistical Signal Processing

#### Supervisor:

## Capacity Upper Bounds for ISI Channels with Direct Detection

#### Description

We are interested in computing upper bounds on the capacity of frequency-selective channels with a memoryless nonlinearity at the transmitter or receiver.

One application for these bounds is short-reach fiber-optic communication systems with a single photodiode at the receiver. The photodiode is a memoryless nonlinearity, as it produces an output that is proportional to the squared magnitude of the input signal.

A simple upper bound for the above model is given in [2, Sec. III-D].

[2] D. Plabst *et al.*, "Achievable Rates for Short-Reach Fiber-Optic Channels With Direct Detection," in *Journal of Lightwave Technology*, vol. 40, no. 12, pp. 3602-3613, June 2022, doi: 10.1109/JLT.2022.3149574.

#### Prerequisites

Information Theory

Linear System Theory

#### Supervisor:

## Capacity Lower Bounds for ISI Channels

#### Description

Capacity and capacity-achieving distributions are known for some memoryless channels. For the average-power-constrained memoryless channel with additive white Gaussian noise, Shannon provided a closed-form expression for the capacity. Closed-form solutions also exist for some simple discrete memoryless channels (DMCs). In addition, numerical algorithms for computing the capacity (and capacity-achieving discrete distributions) of general DMCs are available through the works of Arimoto and Blahut.
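For the memoryless case, the Blahut–Arimoto iteration is short enough to sketch. This is a minimal implementation, checked against the known BSC capacity C = 1 − H2(ε):

```python
import numpy as np

def blahut_arimoto(P, tol=1e-9, max_iter=10_000):
    """Capacity (bits/channel use) of a DMC with transitions P[x, y] = P(y|x)."""
    p = np.full(P.shape[0], 1.0 / P.shape[0])   # start from the uniform input law

    def kl_rows(q):
        # D( P(.|x) || q ) for every input x, with the convention 0*log 0 = 0
        with np.errstate(divide="ignore", invalid="ignore"):
            t = P * np.log2(P / q)
        return np.where(P > 0, t, 0.0).sum(axis=1)

    for _ in range(max_iter):
        D = kl_rows(p @ P)                      # p @ P is the output distribution
        p_new = p * 2.0 ** D                    # multiplicative Blahut-Arimoto update
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new
    return p @ kl_rows(p @ P), p                # mutual information at the fixed point

# Binary symmetric channel with crossover probability eps: C = 1 - H2(eps)
eps = 0.1
C, p_opt = blahut_arimoto(np.array([[1 - eps, eps], [eps, 1 - eps]]))
H2 = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
print(C, 1 - H2)                                # both values agree (about 0.531)
```

For channels with memory, this simple iteration no longer applies directly; its generalization to finite-state channels is the subject of [2] below.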

However, wireless and fiber-optic channels of practical interest often contain intersymbol interference (ISI), and the memory due to ISI can no longer be neglected. When considering discrete channel inputs, one may be interested in determining the capacity and capacity-achieving distribution for these ISI channels. In fact, discrete stationary Markov processes asymptotically achieve the capacity of finite-state ISI channels as the order of the Markov process goes to infinity.

In [1], an algorithm is presented that optimizes discrete Markov processes of a given order M to maximize the information rate between channel input and output. As the order of the Markov process increases, tighter lower bounds on capacity are obtained.
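The objective of such an optimization, the information rate of a finite-state ISI channel, can itself be estimated by a simulation-based forward (sum-product) recursion over the channel trellis, as used in [1] and [2]. Below is a minimal sketch for an illustrative dicode channel y_k = x_k − x_{k−1} + n_k with i.i.d. BPSK input (i.e., Markov order M = 0); the channel taps, noise variance, and block length are assumptions for illustration, not parameters from [1].

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative setup: dicode channel y_k = x_k - x_{k-1} + n_k,
# i.i.d. BPSK input, AWGN with variance sigma2, known initial state +1.
h0, h1, sigma2, n = 1.0, -1.0, 0.5, 50_000
x = np.concatenate(([1.0], rng.choice([-1.0, 1.0], size=n)))
y = h0 * x[1:] + h1 * x[:-1] + rng.normal(0.0, np.sqrt(sigma2), n)

# Forward sum-product recursion over the 2-state trellis (state = previous
# symbol) computes log2 p(y^n); per-step normalization avoids underflow.
states = (-1.0, 1.0)
alpha = np.array([0.0, 1.0])                     # start in state +1
log2p = 0.0
norm = 1.0 / np.sqrt(2.0 * np.pi * sigma2)
for k in range(n):
    new = np.zeros(2)
    for si, s in enumerate(states):
        for ui, u in enumerate(states):
            lik = norm * np.exp(-(y[k] - (h0 * u + h1 * s)) ** 2 / (2.0 * sigma2))
            new[ui] += alpha[si] * 0.5 * lik     # 0.5 = P(u) for i.i.d. input
    z = new.sum()
    log2p += np.log2(z)
    alpha = new / z

# I(X;Y) = h(Y) - h(Y|X); for AWGN, h(Y|X) is the noise entropy.
h_Y = -log2p / n                                 # estimate of h(Y) per symbol
h_noise = 0.5 * np.log2(2.0 * np.pi * np.e * sigma2)
rate = h_Y - h_noise
print(f"estimated information rate: {rate:.3f} bits/symbol")
```

The algorithm in [1] then adjusts the branch probabilities of a Markov input (here fixed at 0.5) to maximize this estimated rate.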

[1] A. Kavcic, "On the capacity of Markov sources over noisy channels," GLOBECOM'01. IEEE Global Telecommunications Conference (Cat. No.01CH37270), 2001, pp. 2997-3001 vol.5, doi: 10.1109/GLOCOM.2001.965977.

Further reading (if required):

[2] P. O. Vontobel, A. Kavcic, D. M. Arnold and H. -A. Loeliger, "A Generalization of the Blahut–Arimoto Algorithm to Finite-State Channels," in IEEE Transactions on Information Theory, vol. 54, no. 5, pp. 1887-1918, May 2008, doi: 10.1109/TIT.2008.920243.

[3] T. Lang, A. Mezghani and J. A. Nossek, "Channel-matched trellis codes for finite-state intersymbol-interference channels," 2010 IEEE 11th International Workshop on Signal Processing Advances in Wireless Communications (SPAWC), 2010, pp. 1-5, doi: 10.1109/SPAWC.2010.5670890.

#### Prerequisites

Information Theory