
AIoT Empowered All-Terrain Personal Vehicle for People with Disabilities

Principal Investigator:

Mr Huafeng XU, Co-founder and CEO, Libpet Tech Limited (a PolyU startup)

The AIoT-empowered all-terrain personal vehicle for people with disabilities incorporates AI technology and cutting-edge features to overcome the challenges posed by gaps, stairs, sand, and grass. With its anti-collision system and precise remote control, it is convenient and safe for users to operate.

Video: https://www.youtube.com/watch?v=clVC4BkDdRs

AI Ambassador (Digital Human Version)

Principal Investigator:

Mr Hoi Ho LAM, Co-founder & CTO, Asiabots Limited (a PolyU startup)

The AI Ambassador, a product of Asiabots, is an innovative approach to AI-driven customer service. Asiabots offers a diverse selection of avatars designed to suit different scenarios. These avatars, which range from a captivating 2.5D Japanese animation style to a highly lifelike 3D polygon style, were developed in collaboration with the renowned international film production company Digital Domain. The AI Ambassador also features super-realistic digital human avatars.

The applications of AI Ambassador are extensive. It is currently deployed in numerous industries – including the real estate, government, banking, insurance, hospitality, education, and commercial sectors – for a wide range of purposes. The AI Ambassador is a valuable tool for automating tasks such as handling inquiries, navigating users to their destinations, promoting sales, and offering guidance.

What makes the AI Ambassador truly unique is that it was originally invented in Hong Kong by Asiabots.

Contrast-free Virtually-enhanced MRI for Precise Tumour Treatment in Carcinoma

Principal Investigator:

Prof. Jing CAI, Associate Dean of Faculty of Health and Social Sciences, Professor of Department of Health Technology and Informatics, Advisor of MedVision Limited (a PolyU startup)

This innovative system enables precise tumour delineation using contrast-free, virtually enhanced MRI data. It can:

Avoid harmful contrast agents that cause discomfort and side effects for patients
Obtain clear images of tumours comparable to conventional contrast-enhanced MRI
Delineate tumours precisely for accurate diagnosis and personalised treatment
Optimise treatment planning for precise targeting and reduced damage to healthy tissue
Save time and improve efficiency with a reliable system

This innovation has profound implications for cancer diagnosis and treatment. It enhances patient safety, provides a non-invasive option, and improves the accuracy of tumour delineation. This translates into precise treatment planning, personalised care, and reduced patient risk. The streamlined workflow also improves resource utilisation and can reduce healthcare costs. Ultimately, it contributes to better patient outcomes, higher treatment success rates, and improved overall quality of cancer care.

Light Field Microscope Imaging and 3D Visualizing System

Principal Investigator:

Ir Prof. Wing-bun LEE, Emeritus Professor of Department of Industrial and Systems Engineering, Founder and Director of Watts Optical Instruments Limited (a PolyU academic-led startup)

Optical microscopes have traditionally used prisms to divide the optical path into stereo viewing angles, thereby enabling closer examination of 3D specimens. However, conventional optical microscopes can only capture 2D images. To overcome this limitation, we have developed a novel light field imaging system that can capture and visualise parallax 3D images. Our innovation lies in the imaging software and in the design and precision manufacturing of biomimetic compound-eye microlens arrays for use with standard commercial microscopes and camera sensors. Compared with existing microscopes, our light field system offers high-resolution 3D information, an expanded depth of field, complete parallax multi-views, post-capture image refocusing, depth analysis capabilities, and compatibility with a wide range of VR and holographic display technologies. Our innovation is the first of its kind on the market and can find applications in 3D biomedical microscopy and glasses-free 3D microsurgery systems.
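
In general light field imaging, the post-capture refocusing mentioned above is obtained by shifting each sub-aperture view recorded behind the microlens array in proportion to its offset from the array centre and then averaging the shifted views, which brings a chosen depth plane into focus after the image has been taken. The following is a minimal, hypothetical Python sketch of that shift-and-sum idea; the function name, array shapes, and synthetic data are assumptions for illustration only, not the project's imaging software.

```python
# Minimal shift-and-sum refocusing sketch (illustrative only, not the
# Watts Optical Instruments software).
import numpy as np

def refocus(sub_views, offsets, alpha):
    """Refocus a stack of sub-aperture views by shift-and-sum.

    sub_views : (N, H, W) grayscale views extracted from behind the microlens array.
    offsets   : (N, 2) pixel offsets (du, dv) of each view from the central view.
    alpha     : refocusing parameter; different values bring different
                depth planes into focus.
    """
    n = sub_views.shape[0]
    out = np.zeros(sub_views.shape[1:], dtype=np.float64)
    for view, (du, dv) in zip(sub_views, offsets):
        # Shift each view in proportion to its offset, then accumulate.
        shifted = np.roll(np.roll(view, int(round(alpha * du)), axis=0),
                          int(round(alpha * dv)), axis=1)
        out += shifted
    return out / n

# Synthetic example: a 3x3 grid of 64x64 views.
views = np.random.rand(9, 64, 64)
grid = np.array([(du, dv) for du in (-1, 0, 1) for dv in (-1, 0, 1)], dtype=float)
refocused = refocus(views, grid, alpha=2.0)
print(refocused.shape)  # (64, 64)
```

Sweeping `alpha` moves the synthetic focal plane through the specimen, which illustrates how depth information can be recovered after capture.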

Mixed Reality-based Radiotherapy and Imaging Simulation for Clinical Education and Services

Principal Investigator:

Dr Shara LEE Wee-yee, Associate Professor, Department of Health Technology and Informatics

The Hybrid Interactive Virtual Environment (HiVE) for Clinical Education and Training represents a revolutionary leap in healthcare training by seamlessly blending the tactile experience of hands-on practice with the boundless possibilities of virtual immersion. This innovative platform not only enhances clinical skill proficiency but also promotes adaptability in the ever-evolving healthcare landscape.

HiVE’s educational value extends to clinical service and patient care, providing groundbreaking radiotherapy rehearsals tailored specifically for children with cancer. By equipping our students with the most advanced training tools, it ensures that young patients are practically and emotionally prepared for their treatment, thereby minimising anxiety and maximising therapeutic success. An impressive 85.7% of children, especially those aged three to eight, avoided any form of anaesthesia throughout their entire treatment course after attending our workshop. All 35 caregivers found the workshop exceptionally helpful in preparing their child for radiotherapy in terms of knowledge, patient care, and emotional support.

HiVE combines innovative technology with healthcare solutions, delivering unparalleled clinical education and compassionate, patient-centred service for the next generation.

Videos:
(i) HERO-CARE Programme Introduction: https://youtu.be/S1UglMJlzrY 
(ii) HERO-CARE RT Education Workshop Walkthrough: https://youtu.be/_nw_XNpmYWI 

ZC-01™: Fully Automatic Commercial Washroom Cleaning Robot

Principal Investigator:

Mr Curry LEE Tsz-chung, CEO, ZeeqClean Technology Limited (a PolyU startup)

ZC-01™ is a commercial assistive toilet cleaning robot that can clean toilets and urinals in a contactless way. It uses non-visual LiDAR (Light Detection and Ranging) and infrared sensors for adaptive cruise, enabling it to detect and open the toilet lid before cleaning. It also has drying and UV sterilisation functions to ensure a high level of hygiene. The ZC-01™ is designed for Hong Kong's commercial buildings, government buildings, large public toilets, the airport, and the AsiaWorld-Expo, as well as large highway rest areas in the Mainland – all of which require extensive cleaning.
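
As a rough illustration of the sensor-driven adaptive cruise described above, the sketch below chooses a forward speed from the nearest LiDAR return and an infrared proximity flag. The thresholds, function name, and sensor interface are hypothetical assumptions for illustration, not ZeeqClean's implementation.

```python
# Illustrative adaptive-cruise speed selection (hypothetical, not ZeeqClean code).
def cruise_speed(lidar_ranges_m, ir_near_fixture, max_speed=0.5):
    """Return a commanded forward speed in m/s."""
    nearest = min(lidar_ranges_m) if lidar_ranges_m else 0.0
    if ir_near_fixture or nearest < 0.3:
        # At the fixture or blocked: stop so the cleaning cycle can begin.
        return 0.0
    if nearest < 1.0:
        # Obstacle within a metre: creep forward slowly.
        return 0.1
    return max_speed  # Clear path: cruise at normal speed.

print(cruise_speed([2.4, 1.8, 3.1], ir_near_fixture=False))  # 0.5
print(cruise_speed([0.6, 0.9, 1.2], ir_near_fixture=True))   # 0.0
```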

ZC-01™ can help reduce commercial cleaning costs and save energy and chemicals. It also supports eco-friendly operation by recording energy and chemical consumption. Most importantly, ZC-01™ can reduce workers' aversion to commercial washroom cleaning.

Video: https://youtu.be/4S7EmnLfyw4?si=ncdcjD2EUiSUbwD4

Autonomous-legged Robotic Dog Providing Navigation Support for the Blind

Principal Investigators:

Prof. Kenneth FONG Nai-kuen, Professor of Department of Rehabilitation Sciences, Director of Research Centre for Assistive Technology
Dr Jacky CHUNG Kin-hung, Senior Engineering Manager of Industrial Centre, Co-Principal Investigator of Research Centre for Assistive Technology

This proof-of-concept study explores the use of an autonomous-legged robotic dog as a guide dog for the blind. Its main objectives are to evaluate the practicability and usability of the robotic guide dog in terms of the following:
(1) User-machine interface – How comfortable and easy is it for blind users to handle the robotic guide dog via its controller? How much trust do they build when using the robotic guide dog to lead them?
(2) Processor – How well can the robotic guide dog find routes and identify obstacles using its LiDAR (Light Detection and Ranging) camera and programmed algorithm? (A minimal illustrative sketch follows this list.)
(3) Activity output – How safely can the robotic guide dog navigate different indoor and outdoor routes with various obstacles and reach its destination?
(4) Environmental interface – How well can the robotic guide dog handle different social contexts and landscapes? Is it smart enough to avoid obstacles?
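
As a minimal illustration of the obstacle identification referred to in objective (2), the sketch below checks whether any return in a planar LiDAR scan falls inside the forward "corridor" the robotic guide dog would need to pass through. The scan format, corridor dimensions, and function name are assumptions for illustration, not the study's programmed algorithm.

```python
# Illustrative corridor check on a planar LiDAR scan (hypothetical, not the
# study's actual navigation algorithm).
import math

def obstacle_ahead(ranges_m, angles_rad, corridor_half_width=0.4, look_ahead=1.5):
    """Return True if any LiDAR return falls inside the forward corridor."""
    for r, a in zip(ranges_m, angles_rad):
        x = r * math.cos(a)  # distance straight ahead (m)
        y = r * math.sin(a)  # lateral offset (m)
        if 0.0 < x < look_ahead and abs(y) < corridor_half_width:
            return True
    return False

# Example: a return 1.0 m ahead and ~0.1 m to the side blocks the corridor.
print(obstacle_ahead([1.0, 2.5], [0.1, -0.8]))  # True
```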

The robotic guide dog will provide an option for blind people to navigate in the community, thereby improving their mobility and safety when real guide dogs are unavailable.
