Project Lab Human Activity Understanding

Lecturer (assistant)
Type: Practical course
Duration: 5 SWS
Term: Winter semester 2023/24
Language of instruction: English
Position within curricula: See TUMonline
Dates: See TUMonline

Admission information


Upon successful completion of this module, students understand the challenges of Human Activity Understanding and can design processes for automatic, sensor-based recognition of ongoing human activity. They are able to collect and use synthetic data as well as multi-camera sequential data in egocentric and stationary setups, to annotate and extract relevant semantic information, and to work with representations of spatial and temporal data. Students can apply AI models and algorithms to extract the information available in a scene and to recognize and predict human activity based on the extracted information. Finally, they are able to analyze and evaluate the results of the various algorithms involved as well as the solutions they have designed.


Sensor data collection and annotation
- Multi-sensor and multi-view data collection and processing, including color/depth/IMU
- Synthetic data generation for human actions
- Accelerated ground-truth annotation using interactive instance segmentation and tracking

Semantic inference building blocks
- Object detection
- Human and object pose estimation/tracking

Graph representation of spatial and temporal data
- 3D scene graphs
- Spatio-temporal graphs
- Knowledge bases (ontologies)

Sequential deep learning models for Human Activity Recognition and Anticipation
- Recurrent neural networks
- Graph networks
- Transformers
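To give a feel for the spatio-temporal graph topic above, the following is a minimal, purely illustrative sketch in Python. The class, entity, and relation names are hypothetical and not taken from the course materials; they only show the idea of spatial edges within a frame and temporal edges between consecutive frames.

```python
# Illustrative sketch: a minimal spatio-temporal graph for activity data.
# All names (STGraph, "hand", "holding", ...) are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class STGraph:
    """Nodes are (frame, entity) pairs; edges are either spatial
    (relating two entities in the same frame) or temporal
    (linking the same entity across consecutive frames)."""
    nodes: set = field(default_factory=set)
    edges: set = field(default_factory=set)

    def add_observation(self, frame: int, entity: str) -> None:
        self.nodes.add((frame, entity))
        # Temporal edge to the same entity in the previous frame, if seen.
        if (frame - 1, entity) in self.nodes:
            self.edges.add(((frame - 1, entity), (frame, entity), "temporal"))

    def add_spatial_relation(self, frame: int, a: str, b: str,
                             relation: str) -> None:
        # Spatial edge between two entities observed in the same frame.
        self.edges.add(((frame, a), (frame, b), relation))


g = STGraph()
g.add_observation(0, "hand")
g.add_observation(0, "cup")
g.add_spatial_relation(0, "hand", "cup", "holding")
g.add_observation(1, "hand")
print(len(g.nodes), len(g.edges))  # 3 nodes, 2 edges
```

In practice, such graphs would be built from detector and pose-tracker outputs and then fed to a graph network or transformer for recognition and anticipation.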

Teaching and learning methods

- Supervised weekly lab sessions, with several introductory lectures by research assistants at the beginning of the course and supervised practical implementation based on the provided skeleton code
- Individual methods and solutions introduced by the students
- Lectures on the theoretical basics of project planning and technical management, and on tools for collaboration (SCRUM, GitLab, Wiki, etc.)
- Final project: individual and group work with independent planning, execution, and documentation
- Seminar: presentation of final results and discussion (reflection, feedback)

Media formats: The following media forms will be used:
- Presentations
- Scripts and review articles from the technical literature
- Tutorials and software documentation
- Development environment (virtual machines on a GPU server)
- Simulation environment
- Data collection setup


- [20%] Implementation of introductory practical tasks in the field of Human Activity Understanding in Python: data acquisition and processing, recognition of people and objects in the scene, and obtaining a semantic understanding of the ongoing activity (4 programming tasks).
- [60%] Hands-on project work: creating an initial project plan and presenting it (8-10 min presentation), regularly discussing work progress and next steps with the supervisor, technical problem solving, and using appropriate tools for efficient teamwork (4 project sessions).
- [20%] Approx. 20-minute presentation of results, including a demo, followed by an approx. 10-minute discussion.


Previous Lab Project Demos

Kick-off meeting announcement

The lab kick-off meeting will take place in person on 19.10.2023 in Seminar Room 0406 (13:15 to 14:45).