We are proud to announce that Prof. WU Yujie, our Assistant Professor and Presidential Young Scholar, has been honoured with the prestigious Brain-inspired Intelligence Award, a category of the inaugural “China Brain-Machine Intelligence Rising Star Awards.” The award ceremony was held during the 2025 World Young Scientist Summit on 24 October 2025 in Wenzhou, Zhejiang Province.

Six outstanding young scientists under the age of 40 were recognised at the first “China Brain-Machine Intelligence Rising Star Awards,” jointly organised by the World Association of Young Scientists (WAYS) and the State Key Laboratory of Brain-Machine Intelligence at Zhejiang University. The awards celebrate significant contributions to the fields of brain-inspired intelligence and brain-computer interfaces.

Prof. Wu’s research focuses on brain-inspired artificial intelligence, computational neuroscience, and machine learning, with the primary aim of bridging neuroscience and AI to advance general intelligence. At the summit, Prof. Wu presented his groundbreaking paper, “Brain-inspired multimodal hybrid neural network for robot place recognition,” which showcases innovative approaches to spatial intelligence in robotics.

Paper Abstract:

Place recognition is an essential spatial intelligence capability for robots to understand and navigate the world. However, recognizing places in natural environments remains a challenging task for robots because of resource limitations and changing environments. In contrast, humans and animals can robustly and efficiently recognize hundreds of thousands of places in different conditions. Here, we report a brain-inspired general place recognition system, dubbed NeuroGPR, that enables robots to recognize places by mimicking the neural mechanism of multimodal sensing, encoding, and computing through a continuum of space and time.
Our system consists of a multimodal hybrid neural network (MHNN) that encodes and integrates multimodal cues from both conventional and neuromorphic sensors. Specifically, to encode different sensory cues, we built neural networks of spatial view cells, place cells, head direction cells, and time cells. To integrate these cues, we designed a multiscale liquid state machine that can process and fuse multimodal information effectively and asynchronously using diverse neuronal dynamics and bioinspired inhibitory circuits. We deployed the MHNN on Tianjic, a hybrid neuromorphic chip, and integrated it into a quadruped robot. Our results show that NeuroGPR achieves better performance than conventional and existing biologically inspired approaches, exhibiting robustness to diverse environmental uncertainty, including perceptual aliasing, motion blur, and changes in light or weather. Running NeuroGPR as an overall multi–neural network workload on Tianjic showcases its advantages, with 10.5 times lower latency and 43.6% lower power consumption than the commonly used mobile robot processor Jetson Xavier NX.

Click here to learn more about the award. Click here to read the paper.
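For readers unfamiliar with the liquid state machine (LSM) idea mentioned in the abstract, the sketch below illustrates the general reservoir-computing principle: a fixed, sparsely connected recurrent network of spiking neurons (with a fraction of inhibitory units) projects input streams into a high-dimensional "liquid state" that a simple readout can then classify. This is only a minimal illustration of the concept; it is not the paper's multiscale MHNN, and all sizes, constants, and variable names here are assumptions chosen for clarity.

```python
import numpy as np

# Minimal liquid state machine sketch (illustrative only, not the
# authors' implementation). Sizes and constants are assumptions.
rng = np.random.default_rng(0)

N_IN, N_RES = 8, 100        # input channels, reservoir neurons (assumed)
TAU = 20.0                  # membrane time constant, in timesteps
V_THRESH = 1.0              # spike threshold
DT = 1.0                    # integration timestep

# Fixed random input and recurrent weights; the reservoir is not trained.
W_in = rng.normal(0.0, 0.5, (N_RES, N_IN))
W_res = rng.normal(0.0, 0.1, (N_RES, N_RES))
W_res *= rng.random((N_RES, N_RES)) < 0.1      # ~10% sparse connectivity
n_inh = N_RES // 5
W_res[:, :n_inh] = -np.abs(W_res[:, :n_inh])   # first 20% of neurons inhibitory

def run_reservoir(inputs):
    """Drive the spiking reservoir with an input sequence of shape
    (T, N_IN) and return each neuron's mean firing rate (the liquid
    state) for a downstream linear readout."""
    v = np.zeros(N_RES)          # membrane potentials
    spikes = np.zeros(N_RES)     # spikes from the previous step
    rates = np.zeros(N_RES)      # accumulated spike counts
    for x in inputs:
        # Leaky integrate-and-fire dynamics with recurrent feedback.
        v += DT / TAU * (-v + W_in @ x + W_res @ spikes)
        spikes = (v >= V_THRESH).astype(float)
        v[spikes > 0] = 0.0      # reset neurons that fired
        rates += spikes
    return rates / len(inputs)

# Toy usage: two hypothetical sensory streams with different statistics
# (standing in for two "places") yield distinguishable liquid states.
place_a = rng.random((50, N_IN))
place_b = rng.random((50, N_IN)) + 0.5
state_a = run_reservoir(place_a)
state_b = run_reservoir(place_b)
```

In a full system, the liquid states would feed a trained readout (e.g. logistic regression) that maps each state vector to a place label; the paper's design additionally fuses multiple sensory modalities at several timescales before this stage.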