Human-Robot Interaction
Project description:
As robotics continues to evolve, there is a critical need for robust vision-guided safety approaches that effectively integrate human visual tracking methods to ensure compliance with current safety standards and operational guidelines in collaborative and shared environments. Achieving this requires precise human pose tracking, which forms the basis for safety algorithms that enable robots to dynamically adapt to human movements while maintaining task performance. However, high tracking accuracy often comes at high costs, including complex sensor fusion setups that must integrate multiple camera inputs with human motion models. The latter can be further guided by task-specific knowledge and requirements. To address this challenge, this project focuses on developing robust, scalable human tracking approaches that balance the accuracy, cost, and complexity of human motion tracking setups.
Within the context of our ongoing research, you will have the opportunity to collaborate with experienced researchers at our lab, gaining hands-on experience with different camera systems, pose tracking algorithms and real robots. This opportunity will allow you to expand your robotics development and programming skills while contributing to cutting-edge research in safe human-robot collaboration.
Possible topics to be addressed within the scope of the project:
- Benchmarking of markerless human pose tracking and reconstruction frameworks against a marker-based MoCap system (using a Vicon MoCap system as ground-truth reference)
- Performance evaluation of multiple commercial RGB-D cameras and open-source tracking solutions for the motion tracking of human body landmark points
- Proposal of suitable visual sensor fusion approaches for merging the pose tracking outputs and improving their performance
- Development of human motion model(s) and their synchronization with the sensor fusion estimates
- Quantification of safe performance trade-offs vs. visual tracking errors
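One of the listed directions, visual sensor fusion, can be illustrated with the textbook building block named in the prerequisites, the Kalman filter. The sketch below is purely illustrative (frame rate and all noise values are assumed, not project parameters): a 1-D constant-velocity filter fusing position estimates of a single body landmark from two cameras with different noise levels.

```python
import numpy as np

# Hypothetical illustration (assumed rates/noise values, not project code):
# a 1-D constant-velocity Kalman filter that fuses position estimates of a
# single human body landmark from two cameras with different noise levels.

dt = 1.0 / 30.0                           # assumed 30 Hz camera frame rate
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity motion model
Q = 1e-3 * np.eye(2)                      # process noise (tuning parameter)
H = np.array([[1.0, 0.0]])                # cameras measure position only

def kf_step(x, P, measurements, R_list):
    """One predict/update cycle, fusing all available camera measurements."""
    x = F @ x                              # predict with the motion model
    P = F @ P @ F.T + Q
    for z, R in zip(measurements, R_list): # sequential measurement updates
        y = z - H @ x                      # innovation
        S = H @ P @ H.T + R                # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Camera A (noisier) reads 0.52 m, camera B (more accurate) reads 0.49 m
x, P = kf_step(np.zeros(2), np.eye(2),
               measurements=[np.array([0.52]), np.array([0.49])],
               R_list=[np.array([[0.04]]), np.array([[0.01]])])
```

Sequential updates like this extend naturally to more cameras; the fused estimate lands between the two readings, pulled toward the more accurate camera, and the fused position variance drops below either single-camera variance.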
Prerequisites:
- Master-level studies in Electrical Engineering, Informatics, Computer Science or any relevant programme
- Working knowledge of computer vision and camera-based systems
- Practical experience with ROS, with good C++/Python programming skills
- Good knowledge of state estimation and sensor fusion algorithms (e.g., Kalman Filter)
- Proficiency with MATLAB/Simulink
- Ability to work in a well-structured and organized manner
Work places:
- Georg-Brauchle-Ring 60-62, 80992 München
- Carl-Zeiss-Straße 8, 85748 Garching bei München
Contact info:
- Mazin Hamad, M.Sc. (mazin.hamad@tum.de)
- Dr. Samuel Kangwagye (s.kangwagye@tum.de)
Robot Control
Project description:
A critical factor in agile production is the efficient flow of material, also known as intralogistics. Although very promising as a flexible component in intralogistics chains, robotics has not yet found its way into agile production. Especially for operation alongside humans, current robots lack the required high degree of flexibility, capability, cost-effectiveness and safety. We are currently developing cutting-edge agile production robotic systems to execute highly dynamic yet efficient motions and manipulation tasks. These systems will have predictive planning capabilities, which allow safe and efficient operation within unknown, changing environments shared with humans. They must meet the following objectives in terms of efficiency in manipulation and human-robot co-production:
- Dynamic whole-body motion/manipulation capabilities
- Risk-aware motion planning and safety
- Energy efficiency
- Human-like performance
Within the context of this project, you will have the opportunity to collaborate with experienced researchers at our lab. Furthermore, this position also allows you to improve your robotics development and programming skills by working on a mobile robotic manipulator system.
Possible topics to be addressed within the scope of the project:
- Modular whole-body dynamic modeling and identification
- Whole-body motion control of wheeled mobile manipulators
- Safety issues emerging from mobile robots navigating in industrial environments
- Human safety in collaboration with mobile manipulation systems in industrial use-cases
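For orientation only: the whole-body modeling and motion control topics above build on the rigid-body dynamics M(q)q̈ + C(q, q̇)q̇ + g(q) = τ. The following is a minimal computed-torque sketch over a toy 2-DoF model (all values are assumed for illustration, not a model of the actual mobile manipulator).

```python
import numpy as np

# Toy 2-DoF stand-in for the rigid-body dynamics
#   M(q) qdd + C(q, qd) qd + g(q) = tau
# (all values assumed for illustration; not a model of the real platform).

def M(q):                 # inertia matrix (toy: constant, diagonal)
    return np.diag([2.0, 1.5])

def C(q, qd):             # Coriolis/centrifugal matrix (toy: zero)
    return np.zeros((2, 2))

def g(q):                 # gravity torque (toy: gravity acts on joint 2)
    return np.array([0.0, 9.81 * 0.5 * np.cos(q[1])])

def computed_torque(q, qd, q_des, qd_des, qdd_des, Kp, Kd):
    """tau = M(q)(qdd_des + Kd*ed + Kp*e) + C(q, qd)*qd + g(q)."""
    e, ed = q_des - q, qd_des - qd
    v = qdd_des + Kd @ ed + Kp @ e        # outer PD loop on tracking error
    return M(q) @ v + C(q, qd) @ qd + g(q)

# Holding a setpoint at zero velocity: the command reduces to g(q),
# i.e. pure gravity compensation.
q = np.array([0.3, 0.5])
tau = computed_torque(q, np.zeros(2), q, np.zeros(2), np.zeros(2),
                      Kp=100.0 * np.eye(2), Kd=20.0 * np.eye(2))
```

The design choice worth noticing: because the inertia matrix multiplies the PD loop, the closed-loop error dynamics become linear and decoupled, which is what makes accurate model identification (the first topic above) so valuable.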
Prerequisites:
- Master-level studies in Electrical Engineering, Informatics, Computer Science or any relevant programme
- Good knowledge of robotics software development, especially dynamics, motion planning and control
- Practical experience with ROS (hands-on experience or previous projects with the Franka Emika robot arm are a plus)
- Excellent C++ programming skills
- Proficiency with MATLAB/Simulink
- Ability to work in a well-structured and organized manner
Work places:
- Georg-Brauchle-Ring 60-62, 80992 München
- Carl-Zeiss-Straße 8, 85748 Garching bei München
Contact info:
- Mazin Hamad, M.Sc. (mazin.hamad@tum.de)
- Dr. Samuel Kangwagye (s.kangwagye@tum.de)
Robot Learning
Apply before April 30, 2025
Background
Uncertainty poses a significant challenge in modern control systems, particularly for real-time networked applications. This research project explores cutting-edge online learning techniques using distributed Gaussian Processes (GPs) to enhance the performance of networked control systems under uncertainty [1,2]. You will design a computationally efficient, real-time learning framework to enable adaptive and robust control, focusing on Euler-Lagrange systems with unknown dynamics. The approach integrates a data-driven model, accounts for communication delays, and adapts to dynamic network conditions [3]. Stability will be rigorously analyzed using Lyapunov theory, and the proposed controller will be validated through simulations and experiments on Franka robots.
Your Tasks
- Develop an online GP-based learning framework for networked control systems
- Investigate real-time inference methods for GPs to ensure computational efficiency
- Tackle challenges such as communication delays and distributed learning
- Implement and evaluate the framework in simulation and on real robotic hardware
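As background for the tasks above, the core computation behind the distributed and online GP variants in [1,2] is standard GP regression. The sketch below is a generic textbook implementation (our own minimal code, not the project's); the O(n³) Cholesky factorization it performs on every fit is exactly the cost that real-time online variants must avoid.

```python
import numpy as np

# Generic textbook GP regression (minimal sketch, not project code).
# Posterior mean: mu(x*) = k*^T (K + sigma^2 I)^{-1} y.

def rbf(A, B, length=1.0, var=1.0):
    """Squared-exponential kernel between row-stacked inputs."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return var * np.exp(-0.5 * d2 / length**2)

def gp_posterior(X, y, Xs, noise=1e-2):
    K = rbf(X, X) + noise * np.eye(len(X))   # noisy training covariance
    Ks, Kss = rbf(X, Xs), rbf(Xs, Xs)
    L = np.linalg.cholesky(K)                # O(n^3): the cost online variants avoid
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha                        # posterior mean
    v = np.linalg.solve(L, Ks)
    return mu, Kss - v.T @ v                 # mean, posterior covariance

# Learn sin(x) from 8 samples and predict back at the training inputs
X = np.linspace(0.0, 2.0 * np.pi, 8)[:, None]
y = np.sin(X[:, 0])
mu, cov = gp_posterior(X, y, X)
```

At the training points the posterior mean recovers the data to within the assumed noise level, and the posterior variance shrinks below it, which is the uncertainty quantification that the Lyapunov-based analysis exploits.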
Requirements
- Highly self-motivated and able to work independently
- Solid knowledge of machine learning, control theory, and robotics
- Programming experience in Python, C++, and/or MATLAB
Application Process
To apply, send your CV, transcript, and supporting documents to Dr. Zewen Yang (zewen.yang(at)tum.de) and Dr. Hamid Sadeghian (hamid.sadeghian(at)tum.de), before 30 April 2025.
Chair of Robotics and Systems Intelligence (RSI)
Munich Institute of Robotics and Machine Intelligence (MIRMI)
Technical University of Munich
References:
[1] Zewen Yang, Xiaobing Dai, and Sandra Hirche. "Asynchronous Distributed Gaussian Process Regression." In Proceedings of the Thirty-Ninth AAAI Conference on Artificial Intelligence (AAAI), Philadelphia, Pennsylvania, USA, 2025.
[2] Zewen Yang, Songbo Dong, Armin Lederer, Xiaobing Dai, Siyu Chen, Stefan Sosnowski, Georges Hattab, and Sandra Hirche. "Cooperative Learning with Gaussian Processes for Euler-Lagrange Systems Tracking Control under Switching Topologies." In 2024 American Control Conference (ACC), pp. 560-567. IEEE, 2024.
[3] Xiao Chen, Youssef Michel, Hamid Sadeghian, and Sami Haddadin. "Network-aware Shared Autonomy in Bilateral Teleoperation." In 2024 IEEE-RAS 23rd International Conference on Humanoid Robots (Humanoids), pp. 888-894. IEEE, 2024.
Mechatronics System Development
Internship/Master thesis | Application deadline: May 20, 2025

Recent advancements in robotics, especially concerning humanoids and quadrupeds, are largely due to the adoption of novel actuator technologies. These involve high-power-density BLDC motors with low gear ratios, with the aim of maintaining good proprioceptive feedback for control [1]. However, active development of new actuation concepts is ongoing. One important direction is the augmentation of such actuators with mechanical springs that store and release energy at dynamic peaks.
Our work focuses on the development and testing of one such actuator, a version of the Parallel Elastic Actuator (PEA). This particular task includes the manufacturing of a simple test actuator that integrates a backdrivable motor with a spring and a torque sensor connected in parallel. Further, a performance characterization of the motor will be conducted (e.g., Bode plot analysis). Additionally, an impedance controller will be evaluated on this setup.
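To make the parallel torque balance concrete: the spring and motor torques add at the joint, so an impedance controller on the motor side must account for the spring's contribution. A hypothetical minimal sketch (stiffness and gain values are assumed for illustration, not measured parameters):

```python
# Hypothetical PEA torque balance (stiffness/gain values assumed):
# the spring acts in parallel with the motor, so the joint torque is
#   tau_joint = tau_motor + tau_spring,  tau_spring = -k_s * (theta - theta_0)

k_s = 20.0        # assumed spring stiffness [Nm/rad]
theta_0 = 0.0     # spring rest position [rad]

def spring_torque(theta):
    return -k_s * (theta - theta_0)

def impedance_motor_torque(theta, theta_dot, theta_des, K=50.0, D=5.0):
    """Motor command so the joint renders a stiffness-K, damping-D impedance:
    the motor supplies the desired joint torque minus the spring's share."""
    tau_des = K * (theta_des - theta) - D * theta_dot   # desired joint torque
    return tau_des - spring_torque(theta)

# Holding theta_des = theta = 0.1 rad at rest: the desired joint torque is
# zero, so the motor only cancels the spring preload.
tau_m = impedance_motor_torque(0.1, 0.0, 0.1)
```

The energy-storage benefit appears whenever the spring torque has the same sign as the desired joint torque: the motor then supplies only the difference, reducing its peak load.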
What you will gain:
- Experience in modeling and control of robotic systems
- Experience in building and prototyping
- Best design practices for torque sensor integration in actuators
- Insights into our System Development and access to our community
- BLDC motor control
Requirements from candidates:
- Mechanical Engineering background
- Completed classical control courses (or some project experience in control)
- Proficiency with any CAD software for part design (e.g., SolidWorks, Fusion 360)
- MATLAB skills
- Basic skills in Electronics
- A plus:
  - Understanding of how motors work
  - Familiarity with Git
  - Working knowledge of the Ubuntu operating system
To apply, send your CV and a short letter of motivation to:
Supervisors
[1] P. M. Wensing, A. Wang, S. Seok, D. Otten, J. Lang and S. Kim, "Proprioceptive Actuator Design in the MIT Cheetah: Impact Mitigation and High-Bandwidth Physical Interaction for Dynamic Legged Robots," in IEEE Transactions on Robotics, vol. 33, no. 3, pp. 509-522, June 2017, doi: 10.1109/TRO.2016.2640183.
Data Protection Information
When you apply for a position with the Technical University of Munich (TUM), you are submitting personal information. With regard to personal information, please take note of the Datenschutzhinweise gemäß Art. 13 Datenschutz-Grundverordnung (DSGVO) zur Erhebung und Verarbeitung von personenbezogenen Daten im Rahmen Ihrer Bewerbung. (data protection information on collecting and processing personal data contained in your application in accordance with Art. 13 of the General Data Protection Regulation (GDPR)). By submitting your application, you confirm that you have acknowledged the above data protection information of TUM.