18 March 2025

Intuitive collaboration between humans and robots

match | Making production more flexible: through improved human-robot collaboration, intuitive programming, and the use of augmented reality, the Institute of Assembly Technology and Robotics (match) gives small and medium-sized enterprises easy access to automation.

Increasing automation and autonomy in production are key to addressing global economic and social challenges. While hardware such as industrial robots is already well developed, there is still potential to improve the flexibility and user-friendliness of such systems. At the Institute of Assembly Technology and Robotics (match) at Leibniz University Hannover, scientists are working to close this gap through innovative approaches such as human-robot collaboration and intuitive programming.

Human-Robot Collaboration: Flexibility for SMEs

Industrial robots are well established in many areas, but their rigid programming and high safety requirements often make them unattractive for small and medium-sized enterprises (SMEs). The match therefore focuses on collaborative robots (cobots), which are attractive to SMEs thanks to lower acquisition costs and simpler safety precautions.

A central aspect is user-friendly programming that does not require SMEs to employ specialized programmers. In programming by demonstration, process experts demonstrate the desired task, which is then converted into a robot program. This "human-in-the-loop" approach uses human cognitive abilities to monitor the correctness of the process and make adjustments where necessary.
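Programming by demonstration can be pictured as recording a sequence of poses and gripper states, then converting them into a reviewable program. The following is a minimal, illustrative sketch (the class and command names are our own, not the match's actual software); the expert checks and edits the generated command list, which is where the human stays in the loop:

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    position: tuple        # (x, y, z) in metres, robot base frame
    gripper_closed: bool   # gripper state at this waypoint

@dataclass
class Demonstration:
    """Trajectory recorded while a process expert guides the robot."""
    waypoints: list = field(default_factory=list)

    def record(self, position, gripper_closed):
        self.waypoints.append(Waypoint(position, gripper_closed))

    def to_program(self):
        """Convert the demonstration into a simple command list that a
        human can review and adjust before it runs on the robot."""
        program = []
        last_grip = None
        for wp in self.waypoints:
            program.append(("move_to", wp.position))
            if wp.gripper_closed != last_grip:
                program.append(("grip",) if wp.gripper_closed else ("release",))
                last_grip = wp.gripper_closed
        return program

# The expert demonstrates a pick-and-place task; each pose is recorded:
demo = Demonstration()
demo.record((0.4, 0.0, 0.3), gripper_closed=False)  # approach
demo.record((0.4, 0.0, 0.1), gripper_closed=True)   # grasp
demo.record((0.2, 0.3, 0.3), gripper_closed=True)   # carry
demo.record((0.2, 0.3, 0.1), gripper_closed=False)  # place
```

In a real system the waypoints would come from hand-guiding the cobot or from tracked gestures rather than hard-coded tuples.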

AR + 6D Pose Estimation = Intuitive and Flexible Automation

To make programming even more intuitive, the match is researching the use of Augmented Reality (AR). AR systems overlay the real field of vision with virtual content, enabling particularly simple and efficient interaction with the robot. However, for these technologies to reach their full potential, precise 6D pose estimation of objects is required.

6D pose estimation describes the determination of an object’s position and orientation in space. This information is crucial for robots to precisely grasp and place objects. Traditional methods were often limited to specific objects, but modern approaches based on machine learning can also recognize novel objects, provided a 3D model of the object is available.
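The six degrees of freedom are three for translation and three for rotation. As a small illustrative example (values invented), a point defined on the object's 3D model, such as a grasp point, can be mapped into camera coordinates once the pose is known:

```python
import math

def rot_z(theta):
    """3x3 rotation matrix about the z-axis (yaw)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def apply_pose(rotation, translation, point):
    """Transform a point from object coordinates into camera
    coordinates: p' = R * p + t. Rotation (3 DoF) and translation
    (3 DoF) together make up the six degrees of freedom of a pose."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Suppose pose estimation reports the object rotated 90 degrees about
# z and lying 0.5 m in front of the camera (illustrative values):
R = rot_z(math.pi / 2)
t = (0.0, 0.0, 0.5)

# A grasp point at (0.1, 0, 0) on the object's 3D model maps to
# approximately (0, 0.1, 0.5) in camera coordinates:
grasp_in_camera = apply_pose(R, t, (0.1, 0.0, 0.0))
```

This is exactly the information a robot needs to move its gripper to the right place in its own workspace.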

Foundation models represent a breakthrough in this area: trained on a wide variety of objects, they can generalize to previously unseen ones. To generate the necessary training data, they rely on generative artificial intelligence (AI), which creates and adapts synthetic objects to improve generalizability.

Challenges and Solutions for Industrial Applications

Despite progress, challenges remain, particularly in evaluating methods for industrial use. Current benchmarks for 6D pose estimation are often based on objects that do not meet industry requirements. Metallic, nearly symmetrical, or scale-variant components are particularly underrepresented in test datasets.

To close this gap, scientists at the match have developed a robot-supported setup for the automated recording and annotation of test datasets. It uses printed templates that define the positions at which objects are placed. Because the transformations between template, robot, and camera are known, the data can be precisely annotated with the ground truth. This approach makes it possible to efficiently create domain-specific test datasets for various industrial applications and to quantitatively evaluate the suitability of 6D pose estimation methods.
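The core idea is a chain of known rigid transforms: if the robot's pose in the camera frame, the template's pose in the robot frame, and the object's slot on the template are all known, the object's ground-truth pose in the camera frame follows by matrix composition. A minimal sketch with invented, translation-only transforms:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Homogeneous transform with identity rotation (translation only)."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Illustrative values; in practice these come from robot kinematics,
# camera calibration, and the printed template:
T_cam_robot      = translation(0.0, 0.0, 1.0)    # robot base in camera frame
T_robot_template = translation(0.5, 0.2, 0.0)    # template in robot frame
T_template_obj   = translation(0.05, 0.05, 0.0)  # object slot on template

# Chaining the known transforms yields the object's ground-truth pose
# in the camera frame -- no manual annotation needed:
T_cam_obj = matmul4(matmul4(T_cam_robot, T_robot_template), T_template_obj)
```

Real transforms also carry rotations, but the composition works identically, which is what makes the annotation fully automatic.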

Extrinsic Calibration: Precision for AR Systems

Another research focus is the extrinsic calibration of AR headsets. When AR systems are used to program robots, the positions determined in the AR system must be precisely transferred to the robot’s coordinate system. Here, the match investigates the accuracy of such methods, especially for modern hardware like the Apple Vision Pro, and develops solutions for complete application cases.
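One common way to bridge the two coordinate systems is a shared reference, such as a marker pattern that both the headset and the robot can locate. The sketch below (our own illustration with invented, translation-only transforms, not the match's calibration procedure) maps an object pose observed in the AR frame into the robot frame via that shared marker:

```python
def matmul4(a, b):
    """Multiply two 4x4 homogeneous transformation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert(T):
    """Invert a rigid homogeneous transform: R -> R^T, t -> -R^T t."""
    Rt = [[T[j][i] for j in range(3)] for i in range(3)]  # transpose of R
    t = [T[i][3] for i in range(3)]
    ti = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[i] + [ti[i]] for i in range(3)] + [[0, 0, 0, 1]]

# Both the AR headset and the robot observe the same marker pattern
# on the table (illustrative, translation-only transforms):
T_ar_marker    = [[1, 0, 0, 0.3], [0, 1, 0, 0.0], [0, 0, 1, 1.2], [0, 0, 0, 1]]
T_robot_marker = [[1, 0, 0, 0.6], [0, 1, 0, 0.1], [0, 0, 1, 0.0], [0, 0, 0, 1]]
T_ar_obj       = [[1, 0, 0, 0.4], [0, 1, 0, 0.2], [0, 0, 1, 1.2], [0, 0, 0, 1]]

# An object pose seen by the headset is transferred into robot
# coordinates via the shared marker:
# T_robot_obj = T_robot_marker * inv(T_ar_marker) * T_ar_obj
T_robot_obj = matmul4(matmul4(T_robot_marker, invert(T_ar_marker)), T_ar_obj)
```

The accuracy of the whole chain hinges on how precisely the headset localizes the marker, which is exactly what the match evaluates for hardware such as the Apple Vision Pro.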

Future Vision: Automation for All

The long-term goal of the match is to make automation accessible to companies of all sizes. Through intuitive programming, precise 6D pose estimation, and the use of AR systems, the scientists aim to increase the flexibility and efficiency of production processes. In ten years, robots should be commonplace not only in large industrial companies but also in SMEs – without the need for specialized personnel.

The match is working to make this vision a reality. The combination of human-robot collaboration, machine learning, and augmented reality creates the foundation for more flexible, efficient, and accessible automation.

by David Wendorff

Similar posts

  • Collaboration between humans and robots
  • How human skills are transferred to automated industrial trucks
  • Intuitive programming for precision assembly

Significance for Production

  • Increasing flexibility through human-robot collaboration
  • Quick and intuitive adaptation to new tasks through augmented reality
  • Full integration of human process knowledge
  • Increased flexibility through generalized pose estimation without training
  • Automated generation of real test data for industrial applications
Intuitive robot programming with the help of augmented reality. (Photo: Sonja Bald, match)
6D pose estimation as the key to flexible automation, and the sub-steps of pose estimation. (Photo: match)
Exemplary compilation of various test objects with different properties for 6D pose estimation. (Photo: match)
Test setup for the automated acquisition and annotation of real test datasets. (Photo: match)
Extrinsic calibration to transfer the 6D pose of an object into the robot's coordinate system. (Photo: match)

Contacts

David Wendorff

+49 (0)511 762-18250
wendorff@match.uni-hannover.de
https://www.match.uni-hannover.de/en/

Sebastian Blankemeyer

+49 (0)511 762-18249
blankemeyer@match.uni-hannover.de
https://www.match.uni-hannover.de/en/


ISSN 2198-1922