PolyU builds advanced human-robot collaboration system, empowering high-end manufacturing tasks
18 Dec 2025
With human-robot collaboration at the core of Industry 5.0, a research team at The Hong Kong Polytechnic University (PolyU) has made significant progress in this field, developing a new generation of “human-machine symbiotic” collaborative manufacturing systems. In addition to perceiving complex environments in real time and accurately interpreting operators’ intentions, the system can achieve skill transfer and self-learning via simple demonstration, while carrying out autonomous process code generation and automatic adjustment for highly accurate task execution. It has been successfully applied to high-end manufacturing tasks such as autonomous drilling on large aircraft and the disassembly of electric vehicle batteries, laying a solid foundation for a new model of human-centric smart manufacturing.
The goal of human-robot synergy is to combine the adaptability and responsiveness of humans with the precision and stability of machines. Led by Prof. ZHENG Pai, Endowed Young Scholar in Smart Robotics and Associate Professor of the PolyU Department of Industrial and Systems Engineering, the research team has developed the “Mutual Cognitive Human-Robot Collaboration Manufacturing System” (MC-HRCMS). Moving away from pre-programmed operations, the system is centred on holistic scene perception: by collecting and analysing multimodal sensing data, including vision, haptics, language and physiological signals, it builds a highly accurate and comprehensive picture of the environment, which it uses for autonomous decision-making and flexible task execution.
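The release does not describe how MC-HRCMS combines its sensing channels, but the idea of pooling evidence from several modalities to infer an operator's intent can be illustrated with a minimal late-fusion sketch. Everything here, the function name, the intent labels, and the equal-weight averaging, is an illustrative assumption, not the team's actual method:

```python
def fuse_intent_scores(modal_scores, weights=None):
    """Toy late fusion of per-modality intent distributions.

    modal_scores: dict mapping a modality name (e.g. "vision",
    "language") to a dict of intent -> probability. Returns a single
    fused distribution as a weighted average. A common baseline; the
    real MC-HRCMS fusion strategy is not specified in the release.
    """
    if weights is None:
        # Assumption: weight all modalities equally.
        weights = {m: 1.0 for m in modal_scores}
    total = sum(weights[m] for m in modal_scores)
    # Union of all intents mentioned by any modality.
    intents = set()
    for scores in modal_scores.values():
        intents.update(scores)
    return {
        intent: sum(
            weights[m] * modal_scores[m].get(intent, 0.0)
            for m in modal_scores
        ) / total
        for intent in intents
    }

# Example: vision is unsure, the spoken command is confident.
fused = fuse_intent_scores({
    "vision":   {"pick_bolt": 0.7, "idle": 0.3},
    "language": {"pick_bolt": 0.9, "idle": 0.1},
})
```

With equal weights the fused belief in `pick_bolt` is (0.7 + 0.9) / 2 = 0.8, so a confident language channel can disambiguate an uncertain visual one.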
The system features advanced machine learning and 3D scene perception capabilities that deliver efficiency and safety, greatly enhancing fluid human-robot interaction in complex manufacturing scenarios. Through industry collaboration projects, the team has tailored human-robot collaboration systems for multiple leading enterprises and successfully deployed them across various scenarios that involve precision and/or complex work procedures.
Prof. Zheng said, “The global manufacturing industry is shifting towards a human-machine symbiotic paradigm that emphasises more flexible automation. Our research aims to develop a paradigm that offers multimodal natural perception, cross-scenario skill transfer and domain foundation-model autonomous execution, so that robots are no longer just tools, but intelligent agents that can evolve with human operators. This provides smart factories with a new path beyond pre-programmed automation.”
Semi-structured and unstructured production scenarios, such as personalised manufacturing, often involve large-scale, complex assembly, disassembly and inspection processes that demand high cognition and rapid adaptation. In this regard, the team introduced a novel “vision-language-guided” planning framework that combines Large Language Models (LLMs), Deep Reinforcement Learning (DRL) and Mixed-Reality Head-Mounted Displays (MR-HMD), enhancing the ability to execute personalised and other unpredictable production tasks.
A key innovation of the framework is the combination of a vision-language-guided target object segmentation model with language-command-driven task planning, allowing the system to integrate visual information with language-based instructions. This enables robots to comprehend complex task semantics, interpret dynamic scenes and collaborate efficiently with human operators. In particular, the head-mounted device enables real-time data acquisition and provides immediate, intuitive guidance to operators, redefining the human-machine interaction interface.
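The interface shape of language-command-driven task planning, grounding the object references in a spoken or typed command against the objects the vision model has segmented, then emitting an ordered action plan, can be sketched as follows. The real framework uses an LLM and a DRL policy for this; the rule-based stand-in below (including the `plan_from_command` name and the step format) is purely a hypothetical illustration:

```python
import re

def plan_from_command(command, detected_objects):
    """Toy language-command-driven planner.

    Splits a command into steps, grounds each step's noun against the
    labels produced by a (hypothetical) vision-language segmentation
    model, and returns an ordered list of {action, target} steps.
    """
    labels = {obj["label"] for obj in detected_objects}
    steps = [s for s in re.split(r"\s*(?:,|then)\s*",
                                 command.lower().strip("."))
             if s]
    plan = []
    for step in steps:
        words = step.split()
        verb = words[0]
        # Ground the step against a segmented object, if any matches.
        target = next((w for w in words if w in labels), None)
        if target is None:
            raise ValueError(f"cannot ground step: {step!r}")
        plan.append({"action": verb, "target": target})
    return plan

# Example: two segmented objects and a two-step command.
objects = [{"label": "bolt"}, {"label": "panel"}]
plan = plan_from_command("Unscrew the bolt, then lift the panel.", objects)
```

The example yields `[{"action": "unscrew", "target": "bolt"}, {"action": "lift", "target": "panel"}]`; the grounding failure path is where a deployed system would instead query the operator, e.g. through the MR headset described above.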
Prof. Zheng said, “The future of smart manufacturing is not about machines getting smarter to replace humans, but about creating systems where humans and robots learn, adapt and succeed together to achieve higher productivity and flexibility. To meet this need, the next-generation robot manipulators should be capable of continual learning and optimisation under human guidance, enabling efficient and natural human-robot interaction.”
To further advance human-robot collaboration, Prof. Zheng will also lead his team in exploring self-configurable human-robot networks, skill transfer mechanisms and autonomous multi-agent task execution. By building a deeply human-centric intelligent manufacturing system and expanding it into more key domains, the team strives to guide society towards a technology-empowered, empathetic and human-oriented smart era.
With his dedication to researching “human-machine symbiotic” collaborative manufacturing systems, Prof. Zheng was awarded funding from the Excellent Young Scientists Fund by the National Natural Science Foundation of China in 2024. He now leads the RAIDS research team on the above projects. For more details, please visit: https://www.raids.group/
***END***
Press Contacts
Communications and Public Affairs Office