Matthias Kissel, M.Sc.
Technische Universität München
Lehrstuhl für Datenverarbeitung
Tel.: +49 (0)89 289 23623
Fax: +49 (0)89 289 23600
Research Interests
- Utilization of structures in (deep) neural networks
- Neural network architectures and their influence on learning performance
- Use of expert knowledge for the training of neural networks
- Design of new training algorithms for neural networks
Publications
- Kissel, Matthias, and Klaus Diepold. "Sobolev Training with Approximated Derivatives for Black-Box Function Regression with Neural Networks." ECML/PKDD, 2019.
- Kissel, Matthias, Martin Gottwald, and Klaus Diepold. "Neural Network Training with Safe Regularization in the Null Space of Batch Activations." International Conference on Artificial Neural Networks (ICANN), Springer, Cham, 2020.
- Corletto, Mirko, Matthias Kissel, and Klaus Diepold. "Impact of Real-World Market Conditions on Returns of Deep Learning based Trading Strategies." International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME), 2021, pp. 1-6.
Selected Supervised Theses
- GANs Learning to Draw Oil Paintings (Beyrem Kaddech)
- Optimization of Neural Network Architectures for Modular Problems (Anna Schua)
- Sobolev Training with Higher-Order Derivatives (Thomas Hartmann)
- Machine Learning for Time Series Prediction in the Financial Industry (Arturo Buitrago Mendez)
- On Generating Mathematical Formulae (Luca Sacchetto)
- Extreme Learning Machines with Structured Matrices (Ruslan Mammadov)
- Automation of a 3D Printing Laboratory (Matthias Emde)
- Detection of Teeth Grinding and Clenching using Surface Electromyography (Hella Toto-Kiesa)
- Latency Prediction for Wireless System Synchronisation (Julian Eder)
- Deep Learning for Financial Time Series Prediction (Mirko Corletto)
- Ridge Regression for Big Data Applications: Trade-Off between Memory Consumption and Learning Speed (Alexander Schindler)
- Federated Learning in Pedestrian Trajectory Prediction Tasks (Claas Brüß)
- Distilling Neural Networks for Real-Time Drone Control (Anna Schua)
- Approximative Sparse Factorization of Neural Network Weight Matrices (Michael Brandner)
- Causal Regularization in Deep Learning Using the Average Causal Effect (Kathrin Khadra)
- Mahalanobis Distance-Based Time Series Classification for Resource-Constrained Systems (Daniel Stümke)
- Linear and Logarithmic Quantization Approaches for Efficient Inference with Deep Neural Networks (Constantin Berger)
- Neural Network Online-Pruning: Accelerating Weighted Sum Calculation by Early Stopping (Julian Lorenz)
- Approximation of Weight Matrices using Hierarchical Matrices (Till Hülder)
- Algorithms for Matrix Approximations with Time Varying Systems (Stephan Nüßlein)