High-Fidelity Teleoperation and Data Collection Pipeline for Humanoid and Quadruped Robots
- Institute
- Professorship of Autonomous Vehicle Systems (TUM-ED)
- Type
- Master's thesis
- Content
- experimental, theoretical, constructive
- Beschreibung
Background
Join our team to build the foundational data engine for next-generation Embodied AI, a crucial step towards training robust models for advanced robotic control! Are you eager to work hands-on with state-of-the-art robotics hardware and human-computer interaction technologies? Do you want to bridge the gap between human intent and robotic execution?

This project offers a unique opportunity to directly address the data bottleneck in modern robotics by building a seamless teleoperation and data-logging pipeline for our newly acquired Unitree robots (G1 humanoid, B2W quadruped, Z1 arm). Foundation models in robotics are only as capable as the data they are trained on. To replicate and push beyond state-of-the-art approaches like Physical Intelligence’s $\pi_0$, we need massive amounts of high-fidelity human demonstration data. This requires intuitive teleoperation interfaces that map human kinematics to complex robot embodiments in real-time, while simultaneously logging perfectly synchronized multimodal data (RGB, depth, proprioception, and actions).

We will design and implement the teleoperation stack using VR headsets and Motion Capture systems. We will solve complex kinematic retargeting problems, ensure low-latency communication, and build a standardized, open-source-compatible data logging framework (e.g., RLDS, LeRobot formats). Your work will be the direct foundation upon which our VLA models are trained, making you a critical part of our end-to-end robotics pipeline.

Example Thesis Topics (subject to availability):
- VR-Based Teleoperation for Dexterous Humanoid Manipulation: Develop a low-latency VR interface (e.g., Meta Quest) to teleoperate the Unitree G1 and Z1 arm, focusing on intuitive mapping of human hand/arm movements to the robot's end-effectors.
- Real-Time Kinematic Retargeting from MoCap to Robot Embodiments: Investigate and implement optimization algorithms to map human joint trajectories captured via a Motion Capture system to the distinct kinematic structures of the G1 humanoid or B2W quadruped.
- Standardized Multimodal Data Pipeline for Embodied AI: Design a highly efficient recording framework that synchronizes high-frequency proprioceptive data, multi-camera video streams, and command actions, outputting directly into formats ready for foundation model training.
- Haptic Feedback Integration for Improved Teleoperation Yield: Explore the integration of force-torque sensors and haptic feedback into the teleoperation loop to improve the quality and success rate of human demonstrations for delicate manipulation tasks.
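To give a flavor of the retargeting topic above, here is a minimal sketch of kinematic retargeting as a least-squares problem: a captured human wrist position is tracked by a robot end-effector while a smoothness term keeps the solution close to the previous frame. The planar 2-link arm, link lengths, and weights are illustrative assumptions, not the G1/Z1 kinematics.

```python
# Minimal kinematic retargeting sketch (illustrative only; the real
# G1/B2W kinematics and MoCap interface are project-specific).
import numpy as np
from scipy.optimize import least_squares

L1, L2 = 0.30, 0.25  # hypothetical link lengths of a planar 2-DoF arm [m]

def forward_kinematics(q):
    """End-effector position of a planar 2-link arm for joint angles q."""
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def retarget(target_xy, q_prev, w_smooth=0.05):
    """Joint angles that track a MoCap target while staying close to the
    previous frame's solution (temporal smoothness regularization)."""
    def residuals(q):
        task = forward_kinematics(q) - target_xy   # tracking error
        reg = w_smooth * (q - q_prev)              # smoothness penalty
        return np.concatenate([task, reg])
    return least_squares(residuals, q_prev).x

# One control step: previous solution (hypothetical) -> new joint angles.
q = retarget(np.array([0.35, 0.20]), q_prev=np.array([0.1, 0.5]))
print(forward_kinematics(q))  # close to the target, up to the smoothness trade-off
```

In a real pipeline this solve runs per MoCap frame at the teleoperation rate, with joint limits and collision constraints added to the objective.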
Technologies Used
Python, C++, ROS 2, VR/AR SDKs, Motion Capture Software, Kinematics/Dynamics, Unitree SDK, Hugging Face LeRobot, 3D Geometry, WebSockets/Zenoh.
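The core problem of the data-pipeline topic above is aligning sensor streams that arrive at different rates. A minimal sketch of nearest-timestamp alignment (the rates, field names, and record layout are assumptions, not the actual LeRobot/RLDS schema):

```python
# Illustrative timestamp-based alignment for multimodal logging:
# attach the closest proprioceptive sample to each camera frame.
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Frame:
    t: float          # camera timestamp [s]
    image_id: int     # placeholder for the RGB frame payload

def nearest(ts, t):
    """Index of the timestamp in sorted list `ts` closest to `t`."""
    i = bisect_left(ts, t)
    if i == 0:
        return 0
    if i == len(ts):
        return len(ts) - 1
    return i if ts[i] - t < t - ts[i - 1] else i - 1

# 30 Hz camera frames vs. 500 Hz proprioception samples (simulated clocks).
frames = [Frame(t=k / 30.0, image_id=k) for k in range(5)]
proprio_ts = [k / 500.0 for k in range(100)]

# One synchronized training record per camera frame.
records = [(f.image_id, nearest(proprio_ts, f.t)) for f in frames]
print(records)  # → [(0, 0), (1, 17), (2, 33), (3, 50), (4, 67)]
```

A production logger would additionally record all raw timestamps so the alignment policy (nearest, interpolated, or zero-order hold) can be changed after the fact.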
Your Benefits: Join a High-Performance Robotics Team
- Impactful Research: Work on a project where your code doesn't live in a silo; it is a critical gear in an end-to-end pipeline. Your results will directly enable robots to perform complex tasks.
- Top-Tier Hardware Stack: Gain exclusive hands-on experience with NVIDIA DGX (training), Jetson Thor (inference), and Unitree Humanoids/Quadrupeds, a stack very similar to those used by industry leaders like Tesla, Figure AI, and Physical Intelligence.
- Scientific Publication: We aim for high-impact results. If your work meets the quality standards, we will co-author and submit a paper to top-tier robotics/AI conferences (e.g., ICRA, IROS, CoRL, or CVPR).
- Professional Career Launchpad: This thesis is designed to mirror the workflow of elite AI labs. We provide dedicated mentorship and professional support to help you land roles at top-tier robotics startups or Big Tech AI labs.
- Dynamic Lab Culture: You will be part of a "squad" of motivated Master’s students working in parallel, fostering a collaborative, fast-paced, and supportive environment.
Requirements
We are looking for students who see their thesis not just as a degree requirement, but as a career-defining project.
Must-Have:
- English Proficiency: High level of written and spoken English (the language of our research and documentation).
- Proactive Mindset: You are comfortable with a "fail fast, learn fast" approach and enjoy solving hands-on hardware/software integration challenges.
- Independence: Ability to own a technical module and drive it forward while communicating effectively with the rest of the team.
- Growth Path: A passion for Robotics/AI and an eagerness to learn new technologies.
Nice-to-Have (The "Plus"):
- Technical Foundation: Proficiency in Python and/or C++.
- Domain Experience: Prior exposure to PyTorch, ROS 2, or physics simulators (Isaac Sim/MuJoCo).
- Hardware Skills: Experience working with robotic hardware, sensors, or VR systems.
Ready to build the future of Embodied AI? Send your CV, a recent transcript, and a brief email explaining why you are the right fit for this specific "squad" and outlining your career goals.
- Possible start
- immediately
- Contact
- Roberto Brusnicki
roberto.brusnicki@tum.de